GitHub Copilot
Code AI by GitHub / Microsoft
AI-powered code completion and chat assistant integrated into IDEs. Supports extensions via MCP (Model Context Protocol).
Registration & API Key Steps
Enable Copilot Free or Subscribe
Free tier available with limited features. Pro at $10/mo, Pro+ at $39/mo.
Install IDE Extension
Install Copilot extension in VS Code, JetBrains, Vim, Neovim, or Visual Studio.
Build Extensions with MCP
Build custom extensions using the Model Context Protocol (MCP). GitHub Apps-based extensions were deprecated in November 2025.
Pricing
| Tier | Price | Features |
|---|---|---|
| Copilot Free | Free | Limited completions, 50 premium requests/month, Select features |
| Copilot Pro | $10/month ($100/year) | Unlimited completions, 300 premium requests/month, All models, Copilot agent |
| Copilot Pro+ | $39/month | 1,500 premium requests/month, All Pro features, Priority access |
| Copilot Business | $19/user/month | Organization management, Policy controls, Audit logs |
| Copilot Enterprise | $39/user/month | All Business features, Knowledge bases, Fine-tuning |
Application Tips
Free Tier Available
GitHub Copilot Free provides limited but functional access at no cost. Great for trying it out.
MCP Replaces Extensions
GitHub Apps-based Extensions were deprecated in November 2025. Use MCP servers for all new integrations.
Premium Requests
Premium requests power Chat, agent mode, code reviews, and model selection. Extra requests cost $0.04 each.
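The billing above is simple to model: each tier includes a monthly allowance of premium requests, and anything beyond it is charged at $0.04 per request. A minimal sketch (the helper name and tier keys are illustrative, not part of any GitHub API):

```typescript
// Hypothetical helper illustrating Copilot premium-request overage billing.
// Each tier's monthly allowance comes from the pricing table above.
const OVERAGE_RATE = 0.04; // USD per premium request beyond the allowance

const includedRequests: Record<string, number> = {
  free: 50,
  pro: 300,
  'pro+': 1500,
};

function overageCost(tier: string, requestsUsed: number): number {
  const included = includedRequests[tier] ?? 0;
  // Only requests beyond the included allowance are billed.
  return Math.max(0, requestsUsed - included) * OVERAGE_RATE;
}

// A Pro user making 400 premium requests pays for the 100 extra:
console.log(overageCost('pro', 400)); // → 4 (USD)
```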
Students Get Free Pro
Verified students and open-source maintainers can get Copilot Pro for free.
China Access Solutions
Direct Access
GitHub is generally accessible in China. Copilot works through the IDE extension.
GitHub Proxy
If GitHub access is slow, use a proxy or mirror for better connectivity.
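If a local proxy is available, Git itself can be pointed at it so clones and Copilot-related Git traffic route through it. The address and port below are placeholders for whatever proxy you actually run:

```shell
# Route Git's HTTP(S) traffic through a local proxy (placeholder address).
git config --global http.proxy http://127.0.0.1:7890

# Verify the setting took effect:
git config --global --get http.proxy

# Remove it when no longer needed:
git config --global --unset http.proxy
```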
Code Example
// GitHub Copilot is used via IDE extensions, not direct API calls.
// Custom integrations are built as MCP servers.
// 1. Install the MCP SDK (tool parameters are declared with zod):
// npm install @modelcontextprotocol/sdk zod
// 2. Create an MCP server
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { z } from 'zod';

const server = new McpServer({
  name: 'my-copilot-extension',
  version: '1.0.0',
});

// Add a tool; the parameter schema uses zod validators
server.tool('get-weather', { city: z.string() }, async ({ city }) => ({
  content: [{ type: 'text', text: `Weather in ${city}: Sunny, 25°C` }],
}));

// Connect via stdio so an MCP host can launch and talk to the server
const transport = new StdioServerTransport();
await server.connect(transport);

Rate Limits
| Tier | Limits |
|---|---|
| Free | 50 premium requests/month |
| Pro | 300 premium requests/month + unlimited completions |
| Pro+ | 1,500 premium requests/month |
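For context on what the stdio transport in the code example actually exchanges: MCP is layered on JSON-RPC 2.0, so invoking a tool is an ordinary `tools/call` request. A minimal sketch of the message shape (the field values are illustrative):

```typescript
// The stdio transport carries newline-delimited JSON-RPC 2.0 messages.
// A tools/call request invoking the get-weather tool from the example above:
const request = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'get-weather',
    arguments: { city: 'Tokyo' },
  },
};

// Serialized form, as it would appear on the server's stdin:
const wire = JSON.stringify(request);
console.log(wire);
```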
Related API Guides
OpenAI GPT-4o / GPT-4.1 / o3
OpenAI
OpenAI's flagship LLM family including GPT-4o for multimodal tasks, GPT-4.1 for long-context coding, and o3 for advanced reasoning. Industry-leading models with the largest developer ecosystem.
Anthropic Claude (Sonnet 4.5 / Opus 4.5)
Anthropic
Anthropic's Claude model family excels in nuanced reasoning, safety, and long-context tasks. Claude Sonnet 4.5 offers the best balance of cost and performance, while Opus 4.5 delivers frontier intelligence.
Google Gemini (2.5 Pro / 2.5 Flash)
Google
Google's Gemini models offer a generous free tier, 1M token context window, and strong multimodal capabilities. Gemini 2.5 Pro leads in reasoning, while Flash models provide cost-effective alternatives.
Meta Llama 4 (Scout / Maverick)
Meta
Meta's open-source Llama 4 models are free to use and available through multiple cloud providers. Llama 4 Scout and Maverick offer competitive performance at extremely low cost through partner APIs.