# GitHub Copilot Router

OpenAI & Anthropic compatible API router for the GitHub Copilot SDK. Use Copilot with Claude Code, the OpenAI Codex CLI, and more.

One command. No extra API keys. Just your existing GitHub Copilot plan.
## Quick Start

1. **Install GitHub Copilot CLI**

   ```sh
   # macOS/Linux
   brew install copilot-cli

   # Windows
   winget install GitHub.Copilot

   # npm (macOS, Linux, and Windows)
   npm install -g @github/copilot
   ```

2. **Authenticate**

   ```sh
   copilot
   # Inside the CLI, type: /login
   ```

3. **Install and run**

   ```sh
   npm install -g github-copilot-router

   gcr cc   # Launch Claude Code
   gcr cx   # Launch OpenAI Codex
   gcr      # Start the router server only
   ```

The server will start at `http://localhost:7318`.

> **Tip:** You can also authenticate via the `GITHUB_TOKEN` environment variable with a PAT that has the "Copilot Requests" permission.
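For example, the token-based route looks like this (`<your-pat>` is a placeholder for a fine-grained personal access token with the "Copilot Requests" permission):

```sh
# <your-pat> is a placeholder; substitute a GitHub PAT
# that has the "Copilot Requests" permission.
GITHUB_TOKEN="<your-pat>" gcr
```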
## Claude Code Integration

Claude Code can be configured to use this router as its backend, allowing you to use GitHub Copilot models through Claude Code's interface.

### Quick Launch (Recommended)

The easiest way to use Claude Code with the router, with no configuration needed:

```sh
gcr cc
# or: gcr claude-code
```

This starts the router, launches Claude Code with the correct environment variables, and cleans up when you exit. All arguments are passed through:

```sh
gcr cc --resume
gcr cc --dangerously-skip-permissions
```

### Manual Setup

If you prefer to run the router separately:

1. Start the router (keep it running in a terminal):

   ```sh
   gcr
   ```

2. Configure Claude Code by creating or editing `.claude/settings.json` in your project:

   ```json
   {
     "env": {
       "ANTHROPIC_BASE_URL": "http://localhost:7318",
       "ANTHROPIC_AUTH_TOKEN": "not-required",
       "ANTHROPIC_API_KEY": "",
       "ANTHROPIC_DEFAULT_HAIKU_MODEL": "github-copilot/claude-haiku-4.5",
       "ANTHROPIC_DEFAULT_SONNET_MODEL": "github-copilot/claude-sonnet-4.5",
       "ANTHROPIC_DEFAULT_OPUS_MODEL": "github-copilot/claude-opus-4.5"
     }
   }
   ```

3. Restart Claude Code to pick up the new configuration.
### Notes

- `ANTHROPIC_AUTH_TOKEN` can be any non-empty string (authentication is handled by GitHub Copilot)
- `ANTHROPIC_API_KEY` should be empty or omitted
- Model names in the config should match models available in GitHub Copilot
## OpenAI Codex Integration

The OpenAI Codex CLI can be configured to use this router as a custom model provider.

### Quick Launch (Recommended)

The easiest way to use Codex with the router, with no configuration needed:

```sh
gcr cx
# or: gcr codex
```

This starts the router, launches Codex with the correct provider configuration, and cleans up when you exit. All arguments are passed through:

```sh
gcr cx --model gpt-4o
gcr cx --full-auto "fix the tests"
```

### Manual Setup

If you prefer to run the router separately:

1. Start the router (keep it running in a terminal):

   ```sh
   gcr
   ```

2. Configure the Codex CLI by creating or editing `~/.codex/config.toml`:

   ```toml
   model = "gpt-5.2-codex"
   model_provider = "proxy"

   [model_providers.proxy]
   name = "OpenAI using GitHub Copilot Router"
   base_url = "http://localhost:7318/v1"
   wire_api = "responses"
   ```

3. Run Codex as normal:

   ```sh
   codex
   ```

   It will now route requests through GitHub Copilot.
### Notes

- The model name should match a model available in GitHub Copilot (e.g., `gpt-5.2-codex`, `gpt-4o`, `claude-sonnet-4.5`)
- No API key configuration is needed; authentication is handled by GitHub Copilot
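To see which model names your Copilot plan actually exposes, you can query the router's `/v1/models` endpoint. A minimal sketch using only the Python standard library, assuming the router is running on the default port and returns an OpenAI-style model list (`{"data": [{"id": ...}, ...]}`):

```python
import json
import urllib.error
import urllib.request

def list_copilot_models(base_url="http://localhost:7318"):
    """Fetch model IDs from the router's /v1/models endpoint.

    Returns a list of model ID strings, or None if the router
    is not reachable (e.g., it isn't running yet).
    """
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=2) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError, ValueError):
        return None

print(list_copilot_models())
```

If this prints `None`, start the router with `gcr` first and retry.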
## API Endpoints
| Endpoint | Method | Format | Description |
|----------|--------|--------|-------------|
| /v1/responses | POST | OpenAI | Responses API (recommended) |
| /v1/responses/input_tokens | POST | OpenAI | Token counting |
| /v1/chat/completions | POST | OpenAI | Chat completions (legacy) |
| /v1/models | GET | OpenAI | List available models |
| /v1/messages | POST | Anthropic | Messages API |
| /v1/messages/count_tokens | POST | Anthropic | Token counting |
| /health | GET | - | Health check |
> **Note:** The `/v1/responses` endpoint uses the newer OpenAI Responses API format, which is recommended over `/v1/chat/completions`. Some clients, such as the OpenAI Codex CLI, use the `wire_api = "responses"` configuration.
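For clients that speak the Responses API directly, the main structural difference in the request body is that the Chat Completions `messages` array is replaced by an `input` field. A small sketch of the two payload shapes (field names follow OpenAI's public API; whether the router accepts every Responses API option is an assumption here):

```python
import json

# Chat Completions payload: the legacy /v1/chat/completions shape
chat_payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Responses API payload: the /v1/responses shape, where
# "input" replaces "messages"
responses_payload = {
    "model": "gpt-4o",
    "input": [{"role": "user", "content": "Hello!"}],
}

print(json.dumps(responses_payload, indent=2))
```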
## Usage Examples

### With curl (OpenAI format)

```sh
# Non-streaming
curl http://localhost:7318/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

# Streaming
curl http://localhost:7318/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```

### With curl (Anthropic format)

```sh
curl http://localhost:7318/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4.5",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

### With the OpenAI Python SDK

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:7318/v1",
    api_key="not-required",
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

### With the Anthropic Python SDK

```python
from anthropic import Anthropic

client = Anthropic(
    base_url="http://localhost:7318",
    api_key="not-required",
)

response = client.messages.create(
    model="claude-sonnet-4.5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.content[0].text)
```

## Configuration
### CLI Commands
| Command | Description |
|---------|-------------|
| gcr | Start the router server |
| gcr claude-code | Launch Claude Code through the router |
| gcr cc | Alias for claude-code |
| gcr codex | Launch OpenAI Codex through the router |
| gcr cx | Alias for codex |
> **Note:** `copilot-router` is an alias for `gcr` (e.g., `copilot-router cc` works too).
Options:

- `--port, -p <port>` - Port for the router (default: 7318)
- `--help, -h` - Show help
- `--version, -v` - Show version
### Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| PORT | 7318 | Server port |
| GITHUB_TOKEN | - | GitHub PAT for authentication |
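For example, either of the following starts the router on port 8080 instead of the default 7318:

```sh
# via environment variable
PORT=8080 gcr

# or via the CLI flag
gcr --port 8080
```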
## Troubleshooting

### "AUTHENTICATION REQUIRED" error

You're not authenticated with GitHub Copilot. Follow the authentication steps above.
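Before re-running authentication, it can help to confirm the router itself is reachable. A minimal probe of the `/health` endpoint with the Python standard library, assuming the default port:

```python
import urllib.error
import urllib.request

def router_is_healthy(base_url="http://localhost:7318"):
    """Return True if the router's /health endpoint responds, else False."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(router_is_healthy())
```

If this prints `False`, the router isn't running; start it with `gcr` before troubleshooting authentication.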
### "copilot" command shows AWS Copilot

You have AWS Copilot installed, which conflicts with the GitHub Copilot CLI. Either:

- Uninstall both GitHub Copilot and AWS Copilot (`brew uninstall copilot-cli`), then install GitHub Copilot again, or
- Ensure the GitHub Copilot CLI comes first in your `PATH`
## License

MIT
