# Cursor SDK Gateway
The Cursor Agent SDK is great, but every request routes through Cursor's backend and needs a Cursor API key. This is a small library that lets you point it at any provider you want, like DeepSeek, Kimi, OpenRouter, Vercel AI Gateway, LiteLLM, vLLM, or any OpenAI-compatible endpoint.
Your code keeps using @cursor/sdk, the local executor still runs all the tools the same way, and only the model call goes somewhere else.
```ts
import { configureCursorGateway } from "cursor-sdk-gateway"

await configureCursorGateway({
  provider: "ai-gateway",
  apiKey: process.env.AI_GATEWAY_API_KEY!,
})

const { Agent } = await import("@cursor/sdk")

const agent = await Agent.create({
  model: { id: "deepseek/deepseek-v4-flash" },
  local: { cwd: process.cwd() },
})

const run = await agent.send("Summarize this repository")
for await (const event of run.stream()) {
  console.log(event)
}
```

## Install
```bash
npm install cursor-sdk-gateway @cursor/sdk
```

## Quick start
### 1. Configure once, before @cursor/sdk is loaded
```ts
import { configureCursorGateway } from "cursor-sdk-gateway"

await configureCursorGateway({
  provider: "ai-gateway",
  apiKey: process.env.AI_GATEWAY_API_KEY!,
})

const { Agent } = await import("@cursor/sdk")
```

### 2. Use Cursor SDK normally
```ts
const agent = await Agent.create({
  model: { id: "deepseek/deepseek-v4-flash" },
  local: { cwd: process.cwd() },
})

const run = await agent.send("Create a README section for local setup")
await run.wait()
```

### 3. Remove it cleanly
Delete the `configureCursorGateway()` call and switch back to a Cursor model id. That's it.
If a file in your project already imports `@cursor/sdk`, put the gateway setup in your entrypoint before that import. The setup has to run first.
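For example, a minimal entrypoint might look like this, with `./app.js` standing in for your own module that imports the SDK:

```ts
// entrypoint.mjs — run the gateway setup before anything loads @cursor/sdk.
import { configureCursorGateway } from "cursor-sdk-gateway"

await configureCursorGateway({
  provider: "ai-gateway",
  apiKey: process.env.AI_GATEWAY_API_KEY!,
})

// "./app.js" is a placeholder for your own code. Importing it dynamically here
// guarantees its static `import { Agent } from "@cursor/sdk"` only evaluates
// after the gateway is configured.
await import("./app.js")
```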
## Vercel AI Gateway

```bash
export AI_GATEWAY_API_KEY="vck_..."
```

```ts
await configureCursorGateway({
  provider: "ai-gateway",
  apiKey: process.env.AI_GATEWAY_API_KEY!,
})
```

Use any provider model id your gateway supports. The examples default to `deepseek/deepseek-v4-flash` because it's cheap.
## OpenAI-compatible endpoints
For OpenRouter, LiteLLM, vLLM, LocalAI, or your own gateway. The endpoint needs to support OpenAI-compatible chat completions with streaming and tool/function calls, since plain chat-only endpoints won't work for Cursor's agent loop.
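If you're not sure your endpoint qualifies, a quick probe like this can tell you. This is a sketch assuming the endpoint follows the standard OpenAI chat completions spec; the model id and base URL are placeholders for your own:

```ts
// Probe an OpenAI-compatible endpoint for streaming + tool-call support.
const baseURL = "http://localhost:4000/v1" // placeholder
const res = await fetch(`${baseURL}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_COMPATIBLE_API_KEY}`,
  },
  body: JSON.stringify({
    model: "your-model-id", // placeholder
    stream: true,
    messages: [{ role: "user", content: "ping" }],
    // A trivial tool definition: if the endpoint rejects this request,
    // it likely can't drive Cursor's agent loop.
    tools: [{
      type: "function",
      function: { name: "noop", parameters: { type: "object", properties: {} } },
    }],
  }),
})
console.log(res.ok ? "endpoint accepts streaming + tools" : `rejected: ${res.status}`)
```

Once that checks out, point the gateway at the endpoint: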
```ts
await configureCursorGateway({
  provider: "openai-compatible",
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY!,
})
```

Local endpoint:
```ts
await configureCursorGateway({
  provider: "openai-compatible",
  baseURL: "http://localhost:4000/v1",
  apiKey: "local-key",
})
```

A few extra provider options pass through too:
- AI Gateway: `baseURL`, `headers`, `metadataCacheRefreshMillis`, `imageModel`, `image`
- OpenAI-compatible: `headers`, `queryParams`, `includeUsage`, `imageModel`, `image`
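For example, passing custom headers through on the OpenAI-compatible provider; the header name and value here are illustrative, not required:

```ts
await configureCursorGateway({
  provider: "openai-compatible",
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY!,
  // Illustrative extras — use whatever your gateway actually expects.
  headers: { "HTTP-Referer": "https://your-app.example" },
  includeUsage: true,
})
```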
For image generation:
```ts
await configureCursorGateway({
  provider: "ai-gateway",
  apiKey: process.env.AI_GATEWAY_API_KEY!,
  image: { model: "openai/gpt-image-1", size: "1024x1024" },
})
```

Without an image model, `generateImage` calls return a Cursor-shaped error result. The lifecycle event still fires correctly.
## Migrating an existing Cursor SDK app
Before:
```ts
import { Agent } from "@cursor/sdk"

const agent = await Agent.create({
  apiKey: process.env.CURSOR_API_KEY!,
  model: { id: "composer-2" },
  local: { cwd: process.cwd() },
})
```

After:
```ts
import { configureCursorGateway } from "cursor-sdk-gateway"

await configureCursorGateway({
  provider: "openai-compatible",
  baseURL: process.env.OPENAI_COMPATIBLE_BASE_URL!,
  apiKey: process.env.OPENAI_COMPATIBLE_API_KEY!,
})

const { Agent } = await import("@cursor/sdk")

const agent = await Agent.create({
  model: { id: "deepseek/deepseek-v4-flash" },
  local: { cwd: process.cwd() },
})
```

Existing code that passes `apiKey: process.env.CURSOR_API_KEY` keeps working. For local gateway runs, the gateway supplies the placeholder Cursor key the SDK expects.
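So a half-migrated call site like this still runs; the leftover `apiKey` line is harmless:

```ts
const agent = await Agent.create({
  // Leftover from the pre-gateway code — effectively ignored for local gateway
  // runs, since the gateway supplies the placeholder key the SDK expects.
  apiKey: process.env.CURSOR_API_KEY!,
  model: { id: "deepseek/deepseek-v4-flash" },
  local: { cwd: process.cwd() },
})
```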
## Features
- Use any provider you already have a key for, including Vercel AI Gateway, OpenRouter, LiteLLM, vLLM, LocalAI, or any OpenAI-compatible endpoint.
- Drop into any existing `@cursor/sdk` app with one config call before the SDK import.
- All Cursor local file and shell tools keep working: `write`, `edit`, `read`, `ls`, `grep`, `glob`, `delete`, `shell`.
- MCP servers, project hooks, subagents, and background shell with `write_shell_stdin` all run as normal.
- Optional image generation by passing an image model in the config.
- `npm test` runs an offline parity check against every public Cursor tool, no API key needed.
## Scope
**Local agents only.**

This doesn't replace Cursor's cloud features like VMs, hosted artifacts, the web Agents Window, or PR automation. For those, use Cursor's own runtime. `agent.listArtifacts()` returns an empty list in local mode, matching Cursor's own local SDK.
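For example, assuming an `agent` created as in the quick start:

```ts
// Local mode: cloud artifact APIs are stubbed, mirroring Cursor's local SDK.
const artifacts = await agent.listArtifacts()
console.log(artifacts) // => []
```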
## Examples
Runnable examples live in `examples/`. Each one is a normal `@cursor/sdk` local agent script:
```bash
node examples/basic/run.mjs
node examples/local-tools/run.mjs
node examples/mcp/run.mjs
node examples/hooks/run.mjs
node examples/subagents/run.mjs
node examples/background-shell/run.mjs
node examples/resume-generator/run.mjs "Person Name"
```

`resume-generator` is adapted from Anthropic's Claude Agent SDK demos, with the same workflow swapped to `@cursor/sdk` and `cursor-sdk-gateway`.
See examples/README.md for setup and a one-line summary of each.
## How it works
`configureCursorGateway()` starts a local endpoint that speaks Cursor SDK's protocol, points `@cursor/sdk` at it via `CURSOR_BACKEND_URL`, and lets Cursor's own local executor keep handling files, shell, MCP, subagents, and hooks. Only the model call is rerouted.
```
@cursor/sdk local agent
-> cursor-sdk-gateway local endpoint
-> Cursor local executor (files, shell, MCP, subagents)
-> gateway hook runner (.cursor/hooks.json)
-> Vercel AI SDK streamText
-> Vercel AI Gateway or @ai-sdk/openai-compatible
```

Stack:

- `@cursor/sdk` for the public API your app imports
- `ai` for streaming via `streamText` and Vercel AI Gateway
- `@ai-sdk/openai-compatible` for OpenRouter, LiteLLM, vLLM, LocalAI, and private endpoints
- `zod` for model-facing tool schemas
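As a rough mental model (not the library's real internals), the configuration step boils down to something like this; `startLocalGatewayServer` is a hypothetical stand-in for the internal server:

```ts
// Conceptual sketch only — startLocalGatewayServer is hypothetical, standing
// in for the library's internal Cursor-protocol server.
declare function startLocalGatewayServer(opts: unknown): Promise<{ port: number }>

async function sketchOfConfigure(opts: unknown) {
  const server = await startLocalGatewayServer(opts)
  // @cursor/sdk reads CURSOR_BACKEND_URL to locate its backend, so every
  // model call lands on the local gateway instead of Cursor's servers.
  process.env.CURSOR_BACKEND_URL = `http://127.0.0.1:${server.port}`
}
```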
## Disclaimer
This isn't an official Cursor or Anysphere project, just something I built on top of their SDK.
## License
MIT
Built with pattrns.ai by @divyaranjan_.
