Copilot settings, IDE support, and compatibility reference

tech
prompt-engineering
github-copilot
reference
Quick-reference tables for VS Code Copilot settings, deprecated settings migration, BYOK provider details, legacy chatmode migration, and JetBrains IDE compatibility.
Author

Dario Airoldi

Published

March 1, 2026

This article is a lookup reference for GitHub Copilot’s configuration surface. It consolidates settings tables, deprecated settings with migration paths, BYOK provider details, legacy file migration steps, and JetBrains IDE compatibility into one place. For conceptual explanations of why these settings exist and how they fit the customization stack, see the concept articles in this series, in particular Understanding prompt files, instructions, and context layers and Understanding LLM models and model selection.


βš™οΈ VS Code settings reference

The tables below list the VS Code settings that control GitHub Copilot’s customization features. All of them are configured in settings.json and can be set at the user, workspace, or folder level.

Prompt file settings

| Setting | Description | Default |
| --- | --- | --- |
| `chat.promptFiles` | Enable or disable prompt file support | `true` |
| `chat.promptFilesLocations` | Additional folders to search for `.prompt.md` files | `[]` |
| `chat.promptFilesRecommendations` | Show prompts as recommended actions when starting a new chat | `true` |
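
For illustration, a workspace settings.json that keeps the defaults and adds one extra prompt folder might look like the following sketch (the team-prompts path is a hypothetical example):

```jsonc
{
  // Prompt file support (on by default).
  "chat.promptFiles": true,

  // Hypothetical extra folder scanned for .prompt.md files.
  "chat.promptFilesLocations": ["./team-prompts"],

  // Surface prompts as suggested actions when a new chat starts.
  "chat.promptFilesRecommendations": true
}
```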

Instructions settings

| Setting | Description | Default |
| --- | --- | --- |
| `chat.instructionsFilesLocations` | Additional folders to search for `.instructions.md` files | `[]` |
| `github.copilot.chat.reviewSelection.instructions` | Custom instructions for code review | `[]` |
| `github.copilot.chat.commitMessageGeneration.instructions` | Instructions for commit message generation | `[]` |
| `github.copilot.chat.pullRequestDescriptionGeneration.instructions` | Instructions for PR description generation | `[]` |
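
As a sketch, these settings take an array of instruction entries; the example below assumes the `{ "text": … }` and `{ "file": … }` entry shapes, and the file path is hypothetical:

```jsonc
{
  "github.copilot.chat.commitMessageGeneration.instructions": [
    // Inline instruction text.
    { "text": "Use Conventional Commits prefixes such as feat, fix, and docs." },
    // A file reference as an alternative entry shape; the path is illustrative.
    { "file": "docs/commit-guidelines.md" }
  ]
}
```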

Agent settings

| Setting | Description | Default |
| --- | --- | --- |
| `chat.useNestedAgentsMdFiles` | Experimental: enable AGENTS.md files in subfolders | `false` |
| `chat.useAgentSkills` | Preview: enable Agent Skills from `.github/skills/` | `false` |

MCP settings

| Setting | Description | Default |
| --- | --- | --- |
| `chat.mcp.discovery.enabled` | Auto-discover MCP servers from `mcp.json` files | `true` |
| `chat.mcp.gallery.enabled` | Enable the MCP server gallery in the Extensions view | `true` |
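
With discovery enabled, servers declared in mcp.json files are picked up automatically. A minimal sketch of a workspace-level .vscode/mcp.json follows; the server name, package, and command are hypothetical:

```jsonc
{
  "servers": {
    "docs-server": {
      // Launch the server as a local stdio process.
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@example/docs-mcp-server"]
    }
  }
}
```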

Settings Sync

You can synchronize prompt files and instructions across devices with VS Code’s Settings Sync. To enable it, run the Settings Sync: Configure command and check Prompts and Instructions.

This ensures .prompt.md files and .instructions.md rules follow your VS Code profile across machines.


⚠️ Deprecated VS Code settings

The following settings are deprecated as of VS Code 1.102. Use .instructions.md files instead.

| Deprecated setting | Migration |
| --- | --- |
| `github.copilot.chat.codeGeneration.instructions` | Use `.github/instructions/*.instructions.md` with `applyTo` patterns |
| `github.copilot.chat.testGeneration.instructions` | Use `.github/instructions/*.instructions.md` with `applyTo` patterns |

Recommendation: Remove these deprecated settings from your VS Code configuration and migrate to instruction files for better maintainability and version control. Instruction files are scoped with applyTo glob patterns, live in source control, and work across VS Code, Visual Studio, and SDK apps. For details on structuring instruction files, see How to structure content for Copilot instruction files.
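
As a migration sketch, a rule that previously lived in codeGeneration.instructions becomes a scoped instruction file, for example .github/instructions/typescript.instructions.md (the filename, glob, and rules here are illustrative):

```markdown
---
applyTo: "**/*.ts"
---

Use strict TypeScript compiler options and avoid the any type.
Prefer async/await over raw promise chains.
```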


🔧 Language Models Editor and BYOK reference

VS Code 1.107 introduced the Language Models Editor and expanded bring-your-own-key (BYOK) support. For the conceptual explanation of model selection strategies, token economics, and when to choose which model, see Understanding LLM models and model selection. The tables below cover the practical configuration surface.

Language Models Editor features

Access via Settings → GitHub Copilot → Language Models, or Command Palette → “GitHub Copilot: Manage Language Models”.

| Feature | Description |
| --- | --- |
| Model Visibility | Hide unused models from the picker for a cleaner interface |
| Capability Filtering | Filter by vision support, tool calling, and context window size |
| Provider Management | Add models from installed providers without leaving the editor |
| Search with Highlighting | Find models by name, provider, or capability |

Filter syntax

| Filter | Description |
| --- | --- |
| `@provider:openai` | Show only OpenAI models |
| `@capability:vision` | Models with vision or image support |
| `@capability:tools` | Models with tool or function calling |
| `@visible:true/false` | Show visible or hidden models |

BYOK providers

Bring-your-own-key lets you use models from external providers with your own API keys:

| Provider | Models available | Notes |
| --- | --- | --- |
| Cerebras | Llama 3.3, DeepSeek v3.2, GLM-4.6 | Extremely fast inference |
| OpenRouter | 100+ models | Unified API for multiple providers |
| Ollama | Local models | Fully local, no API calls |
| Azure OpenAI | GPT-4o, GPT-4 Turbo | Enterprise deployment |
| Anthropic (direct) | Claude models | Direct API access |

Adding a BYOK provider

  1. Open Command Palette → “GitHub Copilot: Add API Key”
  2. Select provider (Cerebras, OpenRouter, etc.)
  3. Enter your API key
  4. New models appear in the model picker

Model capabilities display

The model picker shows capability indicators:

┌────────────────────────────────────────────────────────────┐
│  Select Model                                              │
├────────────────────────────────────────────────────────────┤
│  Claude Sonnet 4        [200K] [👁️] [🔧]                   │
│  GPT-4o                 [128K] [👁️] [🔧]                   │
│  DeepSeek v3.2          [64K]  [🔧]                        │
│  Llama 3.3 70B          [128K] [🔧]                        │
│  o3                     [200K] [🔧] [💭]                   │
├────────────────────────────────────────────────────────────┤
│  Legend: [context] [👁️ vision] [🔧 tools] [💭 reasoning]   │
└────────────────────────────────────────────────────────────┘

HuggingFace integration

The HuggingFace Inference Provider extension enables access to open-weights models:

| Feature | Description |
| --- | --- |
| Multiple Inference Providers | HuggingFace Inference API, Nebius, SambaNova, Together AI, and more |
| Fastest/Cheapest Mode | Automatic routing to the optimal provider based on cost or speed |
| Open-Weights Models | Access to Llama, Mistral, DeepSeek, Qwen, and other open models |

Installation

  1. Install the “HuggingFace for VS Code” extension from the marketplace
  2. Sign in with your HuggingFace account
  3. Models appear in Copilot’s model picker
  4. Select fastest or cheapest routing based on your preference

Extension ID: huggingface.huggingface-vscode

Quota considerations

Important: BYOK models don’t consume GitHub Copilot quota, but:

  • An active Copilot subscription is still required
  • Background query refinement (using GPT-4o Mini) doesn’t count against quota
  • BYOK costs are billed directly by the provider
  • Full prompt logging is available in the output channel for debugging and optimization

Claude skills support

VS Code 1.107 added support for Claude skills, reusable capability definitions that extend Claude’s abilities:

  • Skills can be referenced in agent files via the tools field
  • Skills provide specialized behaviors (web browsing, file analysis, etc.)
  • Growing ecosystem of community and official skills

🔄 Legacy .chatmode.md migration

Prior to VS Code 1.106, custom agents were called “chat modes” and used different file conventions.

| Legacy | Current |
| --- | --- |
| `.chatmode.md` extension | `.agent.md` extension |
| `.github/chatmodes/` folder | `.github/agents/` folder |

Migration steps

  1. VS Code automatically recognizes legacy .chatmode.md files
  2. A Quick Fix action appears to migrate files to the new format
  3. Rename files from *.chatmode.md to *.agent.md
  4. Move files from .github/chatmodes/ to .github/agents/
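
For repositories with several chat modes, steps 3 and 4 can be scripted. A minimal sketch (the demo file it creates is purely illustrative):

```shell
# Demo setup: create one hypothetical legacy chat mode file.
mkdir -p .github/chatmodes
printf -- '---\ndescription: Code review mode\n---\n' > .github/chatmodes/review.chatmode.md

# Migration: rename *.chatmode.md to *.agent.md and move into .github/agents/.
mkdir -p .github/agents
for f in .github/chatmodes/*.chatmode.md; do
  [ -e "$f" ] || continue               # skip if the glob matched nothing
  base=$(basename "$f" .chatmode.md)
  mv "$f" ".github/agents/${base}.agent.md"
done
```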

Recommendation: Migrate to the new format for consistency with current documentation and to ensure future compatibility. For details on creating agents with the current format, see How to structure content for Copilot agent files.


🖥️ JetBrains IDE support

JetBrains IDEs (IntelliJ IDEA, PyCharm, WebStorm, etc.) support GitHub Copilot with their own conventions. The tables below summarize what’s supported, where global instructions live, and how JetBrains differs from VS Code.

Supported files

| File type | Location | Notes |
| --- | --- | --- |
| Workspace instructions | `.github/copilot-instructions.md` | Same as VS Code and Visual Studio |
| Global instructions | OS-specific path (see below) | Personal instructions across all projects |
| Prompt files | `.github/prompts/*.prompt.md` | Invoked via `/promptName` commands |

Global instructions location

| OS | Path |
| --- | --- |
| Windows | `%APPDATA%\JetBrains\<product><version>\copilot\global-copilot-instructions.md` |
| macOS | `~/Library/Application Support/JetBrains/<product><version>/copilot/global-copilot-instructions.md` |
| Linux | `~/.config/JetBrains/<product><version>/copilot/global-copilot-instructions.md` |

Replace `<product><version>` with your IDE, for example `IntelliJIdea2024.3` or `PyCharm2024.3`.
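
On Linux, for example, creating the global instructions file can be sketched as follows (assuming IntelliJ IDEA 2024.3; the instruction text is illustrative):

```shell
# Resolve the JetBrains config dir; <product><version> is assumed to be IntelliJIdea2024.3.
CONFIG_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/JetBrains/IntelliJIdea2024.3/copilot"
mkdir -p "$CONFIG_DIR"

# Write personal instructions that apply across all projects in this IDE.
cat > "$CONFIG_DIR/global-copilot-instructions.md" <<'EOF'
Prefer descriptive names and add unit tests for any new code.
EOF
```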

Differences from VS Code

  • JetBrains doesn’t support .agent.md files (custom agents)
  • JetBrains doesn’t support MCP server configuration
  • Path-specific .instructions.md files work the same as VS Code
  • Prompt files use /promptName syntax (same as VS Code)

🎯 Conclusion

Key takeaways

  • VS Code settings control prompt files, instructions, agents, and MCP discovery β€” all defaulting to sensible values
  • Deprecated settings (codeGeneration.instructions, testGeneration.instructions) should be migrated to .instructions.md files
  • BYOK providers (Cerebras, OpenRouter, Ollama, Azure OpenAI, Anthropic) let you bring external models into the Copilot interface without consuming subscription quota
  • Legacy .chatmode.md files are auto-detected and can be migrated to .agent.md with a Quick Fix
  • JetBrains IDEs support workspace instructions and prompt files but don’t support custom agents or MCP configuration


📚 References

VS Code Copilot Customization Overview [📘 Official] Microsoft’s comprehensive guide to customizing GitHub Copilot in VS Code. Covers custom agents, instructions, prompt files, and MCP configuration. The authoritative source for current settings and defaults.

VS Code v1.107 Release Notes [📘 Official] Release introducing Agent HQ, the Language Models Editor, BYOK support, and Claude skills. Primary source for the model management and BYOK reference tables in this article.

VS Code v1.102 Release Notes [📘 Official] Release that deprecated the codeGeneration.instructions and testGeneration.instructions settings in favor of .instructions.md files. Primary source for the deprecated settings migration table.

JetBrains Copilot Documentation [📘 Official] JetBrains documentation for GitHub Copilot integration. Covers supported file types, global instruction paths, and feature comparison with VS Code.

HuggingFace VS Code Extension [📘 Official] VS Code Marketplace page for the HuggingFace Inference Provider extension. Provides access to open-weights models through multiple inference providers.