copilot-cursor-proxy

v1.2.1

Proxy that bridges GitHub Copilot API to Cursor IDE — translates Anthropic format, bridges Responses API for GPT 5.x, and more

🚀 Copilot Proxy for Cursor

Forked from jacksonkasi1/copilot-for-cursor with full Anthropic → OpenAI conversion + Responses API bridge.

Unlock the full power of GitHub Copilot in Cursor IDE.

Use all Copilot models (GPT-5.4, Claude Opus 4.6, Gemini 3.1, etc.) in Cursor — including Plan mode, Agent mode, and tool calls.


⚡ Quick Start

One Command (npm)

npx copilot-cursor-proxy

Requires Bun to be installed. The first run will prompt for GitHub authentication.

This starts both copilot-api (port 4141) and the proxy (port 4142) in a single terminal.

Or from source

git clone https://github.com/CharlesYWL/copilot-for-cursor.git
cd copilot-for-cursor
bun run start.ts

Enable Max Mode (auto-compact long conversations)

bun run start.ts --max

Max mode automatically compacts conversation history when the estimated token count exceeds 80% of the model's input token limit. It summarizes older messages into a structured summary while keeping the most recent messages intact — letting you have much longer coding sessions without hitting token limits.
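The trigger-and-compact flow described above can be sketched roughly as follows. This is an illustrative TypeScript sketch, not the actual `max-mode.ts` code: the function names, the 6-message keep window, and the 4-chars-per-token estimate are assumptions (the 80% threshold and 15-message minimum come from this README's Troubleshooting section).

```typescript
// Hypothetical sketch of Max mode's compaction trigger; names are illustrative.
interface Msg { role: string; content: string }

// Rough token estimate using the common ~4 characters-per-token heuristic.
function estimateTokens(messages: Msg[]): number {
  return Math.ceil(messages.reduce((n, m) => n + m.content.length, 0) / 4);
}

const COMPACT_THRESHOLD = 0.8; // 80% of the model's input limit
const MIN_MESSAGES = 15;       // minimum noted in Troubleshooting below

function shouldCompact(messages: Msg[], modelInputLimit: number): boolean {
  return (
    messages.length >= MIN_MESSAGES &&
    estimateTokens(messages) > modelInputLimit * COMPACT_THRESHOLD
  );
}

function compact(messages: Msg[], keepRecent = 6): Msg[] {
  // Fold everything but the most recent messages into one structured
  // summary message (the actual summarization step is elided here).
  const older = messages.slice(0, -keepRecent);
  const recent = messages.slice(-keepRecent);
  const summary: Msg = {
    role: "system",
    content: `[Summary of ${older.length} earlier messages]`,
  };
  return [summary, ...recent];
}
```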

Then start an HTTPS tunnel

Cursor requires HTTPS. In a second terminal:

# Cloudflare (free, no signup)
cloudflared tunnel --url http://localhost:4142

# Or ngrok
ngrok http 4142

Copy the HTTPS URL (e.g., https://xxxxx.trycloudflare.com).


🏗 Architecture

Cursor → (HTTPS tunnel) → proxy-router (:4142) → copilot-api (:4141) → GitHub Copilot
  • Port 4141 (copilot-api): Authenticates with GitHub, provides the OpenAI-compatible API, and natively handles the Responses API for GPT-5.x models.
  • Port 4142 (proxy-router): Converts Anthropic-format messages to OpenAI format, bridges Responses API for GPT-5.x models, handles the cus- prefix, and serves the dashboard.
  • HTTPS tunnel: Cursor requires HTTPS — a tunnel exposes the local proxy.

Proxy Router Modules

| File | Responsibility |
|---|---|
| proxy-router.ts | Entrypoint — Bun.serve, routing, CORS, dashboard, model list |
| anthropic-transforms.ts | Anthropic → OpenAI normalization (fields, tools, messages) |
| responses-bridge.ts | Chat Completions → Responses API bridge for GPT-5.x / goldeneye |
| responses-converters.ts | Responses API → Chat Completions format (sync & streaming SSE) |
| stream-proxy.ts | Streaming passthrough with chunk logging and error detection |
| debug-logger.ts | Request/response debug logging helpers |
| start.ts | One-command launcher for copilot-api + proxy-router |
| max-mode.ts | Auto-compaction for long conversations (--max flag) |
| usage-db.ts | Persistent request/token usage tracking |
| auth-config.ts | API key generation, validation, and config persistence |
| upstream-auth.ts | Upstream copilot-api authentication and key management |


⚙️ Cursor Configuration

  1. Go to Settings (Gear Icon) → Models.
  2. Add a new OpenAI Compatible model:
    • Base URL: https://your-tunnel-url.trycloudflare.com/v1
    • API Key: dummy (any value works)
    • Model Name: Use a prefixed name — e.g., cus-gpt-5.4, cus-claude-opus-4.6

⚠️ Important: You must use the cus- prefix. Without it, Cursor routes the request to its own backend.
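The prefix rule above amounts to a simple routing check on the proxy side. A minimal sketch (the function name is illustrative; the real handling lives in proxy-router.ts):

```typescript
// Hypothetical sketch: map a Cursor-facing model name to the upstream
// copilot-api model id. Unprefixed names never reach the proxy, because
// Cursor routes them to its own backend.
const PREFIX = "cus-";

function resolveModel(name: string): string | null {
  if (!name.startsWith(PREFIX)) return null; // not ours to handle
  return name.slice(PREFIX.length);          // e.g. "cus-gpt-5.4" -> "gpt-5.4"
}
```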

💡 Tip: Visit the Dashboard to see all available models and copy their IDs.

Tested Models (19/20 passing)

| Cursor Model Name | Actual Model | Status |
|---|---|---|
| cus-gpt-4o | GPT-4o | ✅ |
| cus-gpt-4.1 | GPT-4.1 | ✅ |
| cus-gpt-41-copilot | GPT-4.1 Copilot | ❌ Not supported by GitHub |
| cus-gpt-5-mini | GPT-5 Mini | ✅ |
| cus-gpt-5.1 | GPT-5.1 | ✅ (deprecating 2026-04-15) |
| cus-gpt-5.2 | GPT-5.2 | ✅ |
| cus-gpt-5.2-codex | GPT-5.2 Codex | ✅ |
| cus-gpt-5.3-codex | GPT-5.3 Codex | ✅ |
| cus-gpt-5.4 | GPT-5.4 | ✅ |
| cus-gpt-5.4-mini | GPT-5.4 Mini | ✅ |
| cus-claude-haiku-4.5 | Claude Haiku 4.5 | ✅ |
| cus-claude-sonnet-4 | Claude Sonnet 4 | ✅ |
| cus-claude-sonnet-4.5 | Claude Sonnet 4.5 | ✅ |
| cus-claude-sonnet-4.6 | Claude Sonnet 4.6 | ✅ |
| cus-claude-opus-4.5 | Claude Opus 4.5 | ✅ |
| cus-claude-opus-4.6 | Claude Opus 4.6 | ✅ |
| cus-gemini-2.5-pro | Gemini 2.5 Pro | ✅ |
| cus-gemini-3-flash-preview | Gemini 3 Flash | ✅ |
| cus-gemini-3.1-pro-preview | Gemini 3.1 Pro | ✅ |
| cus-text-embedding-3-small | Text Embedding 3 Small | N/A (embedding model) |

All GPT-5.x models now work thanks to the switch to @jeffreycao/copilot-api, which natively supports the Responses API. The proxy also includes its own Responses API bridge as a fallback.

(Screenshot: Cursor Settings Configuration)


✨ Features

What the proxy handles

| Cursor sends (Anthropic format) | Proxy converts to (OpenAI format) |
|---|---|
| system as top-level field | System message |
| tool_use blocks in assistant messages | tool_calls array |
| tool_result blocks in user messages | tool role messages |
| input_schema on tools | parameters (cleaned) |
| tool_choice objects (auto/any/tool) | OpenAI format (auto/required/function) |
| stop_sequences | stop |
| thinking / cache_control blocks | Stripped |
| metadata / anthropic_version | Stripped |
| Images in Claude requests | [Image Omitted] placeholder |
| GPT-5.x max_tokens | Converted to max_completion_tokens |
| GPT-5.x Responses API | Bridge built in (needs copilot-api support) |
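Two of the simpler conversions in that table (the top-level `system` field and `stop_sequences`) can be sketched like this. The types and function name below are illustrative; the real anthropic-transforms.ts covers many more cases (tools, tool_use/tool_result blocks, stripped fields, etc.):

```typescript
// Hypothetical sketch of the Anthropic -> OpenAI request normalization.
interface AnthropicReq {
  system?: string;
  messages: { role: string; content: string }[];
  stop_sequences?: string[];
}

interface OpenAIReq {
  messages: { role: string; content: string }[];
  stop?: string[];
}

function toOpenAI(req: AnthropicReq): OpenAIReq {
  const messages = [...req.messages];
  // Anthropic's top-level `system` becomes a leading system message.
  if (req.system) messages.unshift({ role: "system", content: req.system });
  const out: OpenAIReq = { messages };
  // `stop_sequences` is renamed to `stop`.
  if (req.stop_sequences) out.stop = req.stop_sequences;
  return out;
}
```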

Supported Workflows

  • 💬 Chat & Reasoning: Full conversation context with all models
  • 📋 Plan Mode: Works with tool calls and multi-turn conversations
  • 🤖 Agent Mode: File editing, terminal, search, MCP tools
  • 📂 File System: Read, Write, StrReplace, Delete
  • 💻 Terminal: Shell (run commands)
  • 🔍 Search: Grep, Glob, SemanticSearch
  • 🔌 MCP Tools: External tools (Neon, Playwright, etc.)
  • 🗜️ Max Mode: Auto-compact long conversations to stay within token limits (--max)

🔒 Security

Dashboard Password

The dashboard is password-protected. On first visit, set a password to prevent unauthorized access.

API Key Management

Manage API keys directly from the Endpoint tab in the dashboard:

  1. Toggle "Require API Key" to enable authentication
  2. Click "+ Create Key" to generate a new cpk-xxx key
  3. Copy the key (shown only once!) and paste it into Cursor's API Key field
  4. Enable/disable or delete keys as needed

When enabled, all /v1/* requests must include Authorization: Bearer <your-key>.
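The bearer check described above is conceptually simple; a minimal sketch, assuming keys are held in memory (the real auth-config.ts also generates and persists keys, which is elided here):

```typescript
// Hypothetical sketch of the /v1/* bearer-key check; names are illustrative.
function extractBearer(header: string | null): string | null {
  if (!header?.startsWith("Bearer ")) return null;
  return header.slice("Bearer ".length);
}

function isAuthorized(header: string | null, validKeys: Set<string>): boolean {
  const key = extractBearer(header);
  // Keys generated by the dashboard carry the cpk- prefix.
  return key !== null && key.startsWith("cpk-") && validKeys.has(key);
}
```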


📊 Dashboard

Access the dashboard at http://localhost:4142.

Three tabs:

  • Endpoint — Proxy URL, API key management, model list
  • Usage — Request stats, token counts, per-model breakdown, recent requests
  • Console Log — Real-time proxy logs with color-coded levels

⚠️ Known Limitations

| Feature | Status |
|---|---|
| Basic chat & tool calling | ✅ Works |
| Streaming | ✅ Works |
| Plan mode | ✅ Works |
| Agent mode | ✅ Works |
| All GPT-5.x models | ✅ Works |
| Max mode (long session compaction) | ✅ Works (--max flag) |
| Extended thinking (chain-of-thought) | ❌ Stripped |
| Prompt caching (cache_control) | ❌ Stripped |
| Claude Vision | ❌ Not supported via Copilot |
| Tunnel URL changes on restart | ⚠️ Use paid plan for fixed subdomain |


📝 Troubleshooting

"Model name is not valid" in Cursor: Make sure you're using the cus- prefix (e.g., cus-gpt-5.4, not gpt-5.4).

Plan mode response cuts off: Ensure idleTimeout: 255 is set in proxy-router.ts (already configured). Slow models like Opus need longer timeouts.

GPT-5.x returns "use /v1/responses": The proxy auto-routes these. Make sure you're running the latest version.

"connection refused": Ensure both services are running (bun run start.ts) and that http://localhost:4142 responds.

Max mode not compacting: Compaction only triggers when estimated tokens exceed 80% of the model's limit and there are at least 15 messages. Check the console log for 🗜️ Max mode messages.


⚠️ DISCLAIMER: This project is unofficial and for educational purposes only. It interacts with undocumented internal APIs of GitHub Copilot and Cursor. Use at your own risk. The authors are not affiliated with GitHub, Microsoft, or Anysphere (Cursor). Please use your API credits responsibly and in accordance with the provider's Terms of Service.