This package provides an OpenAI-compatible interface for the GitHub Copilot API, designed to work seamlessly with Vercel AI SDK v6.
- Full TypeScript support
- Seamless integration with Vercel AI SDK v6
- Easy to use API matching other AI SDK providers
- Automatic endpoint switching: uses the `/responses` endpoint for Codex models and `/chat/completions` for all others
- Smart request formatting: automatically converts the `messages` array to the OpenAI Responses API `input` format for Codex models
```bash
npm install @opeoginni/github-copilot-openai-compatible
```

If you need AI SDK v5 support, use the `ai-v5` tag:

```bash
npm install @opeoginni/github-copilot-openai-compatible@ai-v5
```

To get your GitHub Copilot API token, check out opencode-copilot-auth.
GitHub Copilot requires specific headers for authentication. While the provider handles your API key, you may need to configure additional headers:
```typescript
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';

const githubCopilot = createGithubCopilotOpenAICompatible({
  apiKey: process.env.COPILOT_TOKEN,
  headers: {
    "Copilot-Integration-Id": "vscode-chat",
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
});
```

These headers identify your application to GitHub Copilot. You may need to update the version numbers to match your integration.
```typescript
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';
import { generateText } from 'ai';

const githubCopilot = createGithubCopilotOpenAICompatible({
  apiKey: process.env.COPILOT_TOKEN,
  headers: {
    "Copilot-Integration-Id": "vscode-chat",
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
});

// Use a Codex model (recommended)
const { text } = await generateText({
  model: githubCopilot('gpt-5.1-codex'),
  prompt: 'Write a Python function to sort a list',
});

console.log(text);
```

This package fully supports OpenAI's Codex models, which use the OpenAI Responses API:
- `gpt-5-codex` - GPT-5 Codex
- `gpt-5.1-codex` - GPT-5.1 Codex
- `gpt-5.1-codex-mini` - GPT-5.1 Codex Mini
- `gpt-5.1-codex-max` - GPT-5.1 Codex Max
For non-Codex models (standard chat completions), check your GitHub Copilot settings to see which models are available to you. You can use any model ID that Copilot supports; it will automatically route to the `/chat/completions` endpoint.
Note: GitHub Copilot may provide access to various models (Claude, GPT, Gemini, etc.) based on your subscription. Check your Copilot settings for the full list of available models.
The provider automatically routes requests to the correct endpoint based on the model ID:
- Codex Models (`gpt-5-codex` or any model ID containing `codex`):
  - Uses the `/responses` endpoint
  - Converts the `messages` array to the OpenAI Responses API `input` format
  - Transforms message content parts (e.g., `text` → `input_text`, `image_url` → `input_image`)
  - System messages become `developer` role messages
- All Other Models:
  - Uses the standard `/chat/completions` endpoint
  - Uses the standard OpenAI-compatible `messages` array format
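The routing rule can be sketched roughly like this (a simplified illustration, not the package's actual internals):

```typescript
// Simplified sketch of the routing rule: any model ID containing "codex"
// targets the Responses API; everything else targets chat completions.
function resolveEndpoint(modelId: string): '/responses' | '/chat/completions' {
  return modelId.includes('codex') ? '/responses' : '/chat/completions';
}

console.log(resolveEndpoint('gpt-5.1-codex')); // → /responses
console.log(resolveEndpoint('gpt-4o'));        // → /chat/completions
```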
This means you don't need to worry about the underlying API differences - the provider handles it automatically!
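The Codex message conversion can likewise be pictured as a small mapping step. The names below are illustrative only (the package's internal helpers are not exposed), and the sketch covers just the system-role and plain-text cases described above:

```typescript
// Illustrative sketch of the messages → Responses API "input" conversion.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

function toResponsesInput(messages: ChatMessage[]) {
  return messages.map((m) => ({
    // System messages become "developer" role messages.
    role: m.role === 'system' ? 'developer' : m.role,
    // Plain text content becomes "input_text" content parts.
    content: [{ type: 'input_text', text: m.content }],
  }));
}

const input = toResponsesInput([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Sort a list in Python.' },
]);
console.log(input[0].role); // → developer
```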
Creates a new GitHub Copilot provider instance.
Options:
- `apiKey?`: Your GitHub Copilot API token
- `baseURL?`: Base URL for API calls (default: `https://api.githubcopilot.com`)
- `name?`: Provider name (default: `githubcopilot`)
- `headers?`: Custom headers to include in requests
- `fetch?`: Custom fetch implementation
Returns: A provider instance.
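For example, the optional fields can be combined, say to point at a proxy and log each outgoing request. The `baseURL` value here is a hypothetical placeholder, not a real endpoint:

```typescript
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';

// Hypothetical configuration: a proxy baseURL and a logging fetch wrapper.
const copilot = createGithubCopilotOpenAICompatible({
  apiKey: process.env.COPILOT_TOKEN,
  baseURL: 'https://copilot-proxy.internal.example.com', // assumption: your own proxy
  headers: { "Copilot-Integration-Id": "vscode-chat" },
  fetch: async (input, init) => {
    console.log('Copilot request:', String(input));
    return fetch(input, init);
  },
});
```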
Pre-configured default provider instance. For production use, create your own instance with proper headers:

```typescript
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';
import { generateText } from 'ai';

// Create your own configured instance
const copilot = createGithubCopilotOpenAICompatible({
  apiKey: process.env.COPILOT_TOKEN,
  headers: {
    "Copilot-Integration-Id": "vscode-chat",
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
});

const { text } = await generateText({
  model: copilot('gpt-5.1-codex'),
  prompt: 'Hello, world!',
});
```

MIT