# cursor-oauth-opencode

v0.1.8

OpenCode plugin that connects Cursor's API to OpenCode via OAuth, model discovery, and a local OpenAI-compatible proxy.
OpenCode plugin that connects to Cursor's API, giving you access to Cursor models inside OpenCode with full tool-calling support.
Install
npx cursor-oauth-opencode setup --global
opencode auth login --provider cursorFor project-local setup:
npx cursor-oauth-opencode setup --project
opencode auth login --provider cursorThe setup command is idempotent. It adds the npm plugin entry and a fallback
provider.cursor model registry to opencode.json, preserving existing user
model overrides.
## Manual config

If you do not want to use the setup command, add this to `~/.config/opencode/opencode.json`:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": [
    "cursor-oauth-opencode@latest"
  ],
  "provider": {
    "cursor": {
      "name": "Cursor",
      "npm": "@ai-sdk/openai-compatible",
      "api": "http://127.0.0.1:65535/v1",
      "models": {
        "composer-2-fast": {
          "name": "Composer 2 Fast",
          "reasoning": true,
          "temperature": true,
          "attachment": false,
          "tool_call": true,
          "limit": {
            "context": 200000,
            "output": 64000
          }
        }
      }
    }
  }
}
```

The fallback API URL is only a placeholder so OpenCode can list the provider before login. After auth, the plugin starts a local proxy and replaces the fallback model list with live Cursor model discovery.
## Authenticate

```shell
opencode auth login --provider cursor
```

This opens Cursor OAuth in the browser. Tokens are stored in `~/.local/share/opencode/auth.json` and refreshed automatically.
## Use

Start OpenCode and select any Cursor model. The plugin starts a local OpenAI-compatible proxy on demand and routes requests through Cursor's gRPC API.
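Once the proxy is up, any OpenAI-compatible client can talk to it. A minimal sketch of the request shape, assuming the fallback port is rewritten to the real proxy port at runtime; the `read_file` tool definition is a hypothetical example, not the plugin's actual tool surface:

```typescript
// Sketch only: the OpenAI-style body a client sends to the local proxy.
// The model id comes from the provider config; `read_file` is a
// hypothetical OpenCode tool surfaced to the model.
const body = {
  model: "composer-2-fast",
  stream: true,
  messages: [{ role: "user", content: "Summarize README.md" }],
  tools: [
    {
      type: "function",
      function: {
        name: "read_file", // hypothetical tool name
        description: "Read a file from the workspace",
        parameters: {
          type: "object",
          properties: { path: { type: "string" } },
          required: ["path"],
        },
      },
    },
  ],
};

// In practice this is POSTed to http://127.0.0.1:<proxy-port>/v1/chat/completions.
console.log(JSON.stringify(body));
```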
## How it works

- **OAuth**: browser-based login to Cursor via PKCE.
- **Model discovery**: queries Cursor's gRPC API for all available models.
- **Local proxy**: translates `POST /v1/chat/completions` into Cursor's protobuf/HTTP/2 Connect protocol.
- **Native tool routing**: rejects Cursor's built-in filesystem/shell tools and exposes OpenCode's tool surface via Cursor MCP instead.
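The native-tool routing decision can be sketched as a lookup from native tool fields to typed rejections. The field names (`readArgs`, `shellArgs`) and error names (`ReadRejected`, `ShellRejected`) are the ones this README mentions; the function shape is illustrative, not the plugin's actual API:

```typescript
// Sketch: map each native Cursor tool attempt to a typed rejection so the
// model retries through MCP. Unknown fields (e.g. mcpArgs) pass through.
const nativeToolRejections: Record<string, string> = {
  readArgs: "ReadRejected",
  shellArgs: "ShellRejected",
};

// Returns a rejection code for native tools, or null to let the call through.
function routeToolAttempt(field: string): string | null {
  return nativeToolRejections[field] ?? null;
}

console.log(routeToolAttempt("shellArgs")); // ShellRejected
console.log(routeToolAttempt("mcpArgs")); // null
```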
HTTP/2 transport runs through a Node child process (`h2-bridge.mjs`) because Bun's `node:http2` support is not reliable against Cursor's API.
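The bridge pattern itself is simple to sketch: the parent process shuttles JSON frames over stdio to a Node child, which owns the HTTP/2 connection. This toy version just echoes a frame back; the real `h2-bridge.mjs` speaks Cursor's Connect protocol, and every name here is illustrative:

```typescript
import { spawnSync } from "node:child_process";

// Toy stdio bridge: the parent sends one JSON frame on stdin and the Node
// child replies on stdout. The real bridge keeps a long-lived child with an
// HTTP/2 Connect stream to Cursor's API instead of echoing.
const childScript = `
  const chunks = [];
  process.stdin.on("data", (c) => chunks.push(c));
  process.stdin.on("end", () => {
    const frame = JSON.parse(Buffer.concat(chunks).toString());
    // Real bridge: open an HTTP/2 stream here and relay response frames back.
    process.stdout.write(JSON.stringify({ ok: true, path: frame.path }));
  });
`;

const result = spawnSync("node", ["-e", childScript], {
  input: JSON.stringify({ path: "/v1/chat/completions" }),
});
const reply = JSON.parse(result.stdout.toString());
console.log(reply.path); // /v1/chat/completions
```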
Architecture
OpenCode --> /v1/chat/completions --> Bun.serve (proxy)
|
Node child process (h2-bridge.mjs)
|
HTTP/2 Connect stream
|
api2.cursor.sh gRPC
/agent.v1.AgentService/RunTool call flow
1. Cursor model receives OpenAI tools via `RequestContext` (as MCP tool defs).
2. Model tries native tools (`readArgs`, `shellArgs`, etc.).
3. Proxy rejects each with a typed error (`ReadRejected`, `ShellRejected`, etc.).
4. Model falls back to the MCP tool -> `mcpArgs` exec message.
5. Proxy emits an OpenAI `tool_calls` SSE chunk and pauses the H2 stream.
6. OpenCode executes the tool and sends the result in a follow-up request.
7. Proxy resumes the H2 stream with `mcpResult` and streams the continuation.
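Step 5 above can be sketched as pure formatting: wrap the MCP invocation in an OpenAI streaming `tool_calls` delta and frame it as a server-sent event. The chunk fields follow the standard OpenAI streaming format; the Cursor-side input type is illustrative:

```typescript
// Sketch of step 5: shape an MCP tool invocation into an OpenAI streaming
// tool_calls chunk with SSE framing. `McpCall` is an illustrative input
// type, not the plugin's actual internal representation.
type McpCall = { id: string; name: string; args: unknown };

function toToolCallChunk(call: McpCall, model: string): string {
  const chunk = {
    id: "chatcmpl-" + call.id,
    object: "chat.completion.chunk",
    model,
    choices: [
      {
        index: 0,
        delta: {
          tool_calls: [
            {
              index: 0,
              id: call.id,
              type: "function",
              function: { name: call.name, arguments: JSON.stringify(call.args) },
            },
          ],
        },
        finish_reason: null,
      },
    ],
  };
  // SSE framing: one "data:" line per chunk, blank line as terminator.
  return "data: " + JSON.stringify(chunk) + "\n\n";
}

console.log(
  toToolCallChunk({ id: "call_1", name: "read_file", args: { path: "README.md" } }, "composer-2-fast"),
);
```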
## Develop locally

```shell
bun install
bun run build
bun test/smoke.ts
```