copilot-api-node20
v0.20.4
Turn GitHub Copilot into OpenAI/Anthropic API compatible server. Usable with Claude Code! (Node v20+ fork)
copilot-api
A proxy server that turns a GitHub Copilot subscription into fully compatible OpenAI and Anthropic API endpoints. Use Claude Code, Codex CLI, and any tool that speaks the API -- all through the Copilot plan you already pay for, no separate API keys required.
Key Features
| Feature | Description |
|:--|:--|
| Full Claude Code support | All Claude models, streaming, extended thinking, and the complete 1M token context window |
| Full Codex CLI support | Complete OpenAI Responses API implementation purpose-built for Codex CLI |
| Web search | Optional Tavily-powered web search available to both Claude Code and Codex sessions |
| Dual API compatibility | OpenAI Chat Completions, Anthropic Messages, and OpenAI Responses APIs side by side |
| Zero configuration | Single command to start -- authenticates via GitHub device-code OAuth |
Quick Start
Prerequisites: Node.js 20+ (or Bun) and an active GitHub Copilot subscription.
Claude Code
Start the proxy with Claude Code mode:
```sh
npx copilot-api-node20@latest start --claude-code
```

On first run, you authenticate via GitHub device-code OAuth. Once authenticated, the proxy starts on http://localhost:4141 and prints a ready-to-paste command. In another terminal:

```sh
ANTHROPIC_BASE_URL=http://localhost:4141 claude
```

To skip the interactive model picker, specify models upfront:

```sh
npx copilot-api-node20@latest start \
  --claude-code \
  --model claude-opus-4.6-1m \
  --small-model claude-haiku-4.5
```

Codex CLI
Start the proxy with Codex mode:
```sh
npx copilot-api-node20@latest start --codex --model gpt-5.3-codex
```

Configure ~/.codex/config.toml:

```toml
model = "gpt-5.3-codex"
model_provider = "local"

[model_providers.local]
name = "Local Server"
base_url = "http://localhost:4141/v1"
env_key = "OPENAI_API_KEY"
```

Then run:

```sh
OPENAI_API_KEY=dummy codex
```

Web Search
Provide a Tavily API key to enable web search tool calls in both Claude Code and Codex sessions:
```sh
npx copilot-api-node20@latest start --claude-code --tavily-api-key <key>
```

Alternatively, set TAVILY_API_KEY as an environment variable. The proxy operates normally when no key is provided -- web search is simply unavailable.
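Since the key can also come from the environment, an equivalent invocation (key value is a placeholder) looks like:

```sh
export TAVILY_API_KEY=<key>
npx copilot-api-node20@latest start --claude-code
```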
API Endpoints
All endpoints are served at http://localhost:4141 by default.
| Method | Path | Compatibility |
|:--|:--|:--|
| POST | /v1/chat/completions | OpenAI Chat Completions |
| POST | /v1/responses | OpenAI Responses API |
| POST | /v1/messages | Anthropic Messages |
| POST | /v1/messages/count_tokens | Anthropic Token Counting |
| POST | /v1/embeddings | OpenAI Embeddings |
| GET | /v1/models | OpenAI Models |
All OpenAI-compatible routes are also available without the /v1/ prefix.
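As a quick smoke test with the proxy running, the endpoints above can be exercised with curl. The request body below assumes the standard OpenAI Chat Completions shape; the model name is illustrative, so pick one from your own GET /v1/models listing:

```sh
# List the models available through your Copilot subscription
curl http://localhost:4141/v1/models

# Minimal, non-streaming chat completion
curl http://localhost:4141/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5.3-codex", "messages": [{"role": "user", "content": "Hello"}]}'
```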
CLI Reference
Commands
| Command | Description |
|:--|:--|
| copilot-api start | Start the proxy server |
| copilot-api auth | Authenticate with GitHub |
| copilot-api check-usage | Display Copilot usage quota |
| copilot-api debug | Print diagnostic information |
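Each subcommand runs through the same entry point, so for example the quota check can be invoked as:

```sh
npx copilot-api-node20@latest check-usage
```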
start Flags
--claude-code and --codex are mutually exclusive.
| Flag | Alias | Default | Description |
|:--|:--|:--|:--|
| --port | -p | 4141 | Port to listen on |
| --claude-code | -c | | Enable Claude Code compatibility mode |
| --codex | | | Enable Codex CLI compatibility mode |
| --model | -m | | Primary model to use |
| --small-model | -s | | Lightweight model for fast tasks (Claude Code) |
| --rate-limit | -r | | Minimum seconds between requests |
| --timeout | -t | 600000 | Request timeout in ms (default 10 min) |
| --account-type | -a | individual | individual, business, or enterprise |
| --github-token | -g | | Provide a GitHub token directly (skips OAuth) |
| --show-token | | | Display tokens on fetch and refresh |
| --tavily-api-key | | | Tavily API key for web search |
| --verbose | -v | | Enable debug-level logging |
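Putting several of these flags together, a typical invocation might look like the following (the model name is illustrative):

```sh
npx copilot-api-node20@latest start \
  --claude-code \
  --model claude-opus-4.6-1m \
  --port 8080 \
  --rate-limit 2 \
  --verbose
```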
Telemetry
This tool collects anonymous usage telemetry via OpenTelemetry. Only operational metrics are collected -- request counts, latency, model usage, and error rates. No prompts, completions, or personal information is ever collected.
By using copilot-api, you agree to this telemetry data collection.
Closed Source
The source code is not publicly available.
This fork represents ~9 months of sustained engineering work -- bug fixes, new features, and hardening that go well beyond the original project. Keeping it closed-source protects that investment and ensures a single, well-maintained distribution.
The original open-source project is available at ericc-ch/copilot-api.
Notices
This project relies on reverse-engineered, undocumented GitHub APIs. It is not affiliated with or endorsed by GitHub, Microsoft, Anthropic, or OpenAI and may break without notice.
- Use --rate-limit to throttle requests and reduce abuse-detection risk.
- Review the GitHub Acceptable Use Policies and GitHub Copilot Terms before use.
- This software is provided as-is with no warranty.
Attribution
Originally created by Erick Christian -- ericc-ch/copilot-api. His work built the foundation: authentication flow, API translation layer, and streaming implementation.
This fork has diverged significantly over ~9 months with extensive bug fixes and major new features -- Anthropic Messages API, OpenAI Responses API, Claude Code mode, Codex CLI support, web search integration, usage telemetry, and more. It is maintained as a separate project by johnib.
