# Claude Responses Bridge

Claude Responses Bridge is a local CLI that lets Claude-style `/v1/messages`
clients talk to an upstream OpenAI-compatible `/v1/responses` server.
This project now ships with:

- an interactive startup console
- direct provider and token configuration inside the CLI
- multi-provider management with create, switch, update, and delete flows
- a local OpenAI-compatible `/v1/chat/completions` bridge for Cursor and Cline
- a local OpenAI-compatible `/v1/responses` passthrough for tools that already use the Responses API
- a Cursor integration command that can detect Cursor, install Continue, and write a bridge-backed Continue config
- backward compatibility for older single-provider `config.local.json` files
- a three-section terminal layout with Header, Status Table, and Interactive Menu
- Chinese UI copy, ANSI colors, box borders, and keyboard navigation
- local-first smart routing with `single`, `failover`, and `round-robin` modes
- live provider telemetry from `/bridge/status`
## Install

```
npm install -g claude-responses-bridge
```

## Quick Start
Open the interactive console:
```
crb
```

Or run the local file:

```
node .\cli.js
```

The console shows the current bridge state and gives you guided actions for:
- starting the bridge
- launching Claude through the bridge
- editing bridge settings
- managing providers
- running diagnostics
The interactive console now uses:
- a branded ASCII Art header
- a status dashboard with provider and bridge state
- an arrow-key menu instead of typing `1`, `2`, `3`
- a short loader before high-impact actions such as starting the bridge
## Direct CLI Configuration
You can also configure everything without opening the console:
```
crb configure --name Main --base-url https://your-upstream.example.com --api-key sk-xxxx
```

That command creates or updates the active provider and writes the config file.
You can override bridge settings at the same time:
```
crb configure `
  --name Main `
  --base-url https://your-upstream.example.com `
  --api-key sk-xxxx `
  --port 3456 `
  --host 127.0.0.1 `
  --timeout 600000 `
  --map-default gpt-5.1-codex
```

## Cursor Plugin Setup
If Cursor Free blocks the native BYOK model picker, use the bridge through an official third-party extension instead of trying to unlock Cursor's built-in named models.
Guided Cursor integration:
```
crb cursor
```

Non-interactive install and config write:

```
crb cursor --install --write-config
```

Choose a specific model from the current provider when writing the config:

```
crb cursor --write-config --model gpt-5.2
```

This flow:
- detects the local Cursor command path
- checks whether the official `Continue` extension is installed
- optionally installs `Continue` into Cursor
- optionally writes `~/.continue/config.yaml` so Continue uses the local bridge
- can select a model from the current provider's live `/v1/models` list and persist it for future runs
This flow does not:
- unlock Cursor's built-in named model picker
- remove Cursor's native free-plan restriction
- modify your upstream provider key inside Cursor's native account system
## Provider Management

List providers:

```
crb provider list
```

Add a provider:

```
crb provider add --name Backup --base-url https://backup.example.com --api-key sk-backup
```

Add and activate it immediately:

```
crb provider add --name Backup --base-url https://backup.example.com --api-key sk-backup --activate
```

Switch the active provider:

```
crb provider use backup
```

Update or replace a provider:

```
crb provider update backup --base-url https://new-upstream.example.com --api-key sk-new
```

Delete a provider:

```
crb provider remove backup
```

The CLI prevents deleting the last remaining provider, so the bridge cannot fall into an unusable state.
## Start the Bridge

Start the local bridge with the active provider:

```
crb serve
```

Start it with a specific provider:

```
crb serve --provider backup
```

On startup the CLI now prints a short session overview that shows the selected provider, upstream, token preview, and local listen address.
## Launch Claude Through the Bridge

```
crb claude
```

Choose a provider for one run:

```
crb claude --provider backup
```

Pass Claude CLI arguments after `--`:

```
crb claude --provider backup -- -p "Reply with just OK."
```

## Diagnostics
Human-readable doctor output:
```
crb doctor
```

JSON doctor output:

```
crb doctor --json
```

The doctor report includes:
- config path
- active provider
- upstream base URL
- masked token state
- Claude CLI detection
- local listen URL
- warnings
## Cursor and VSCode / Cline

Start the bridge:

```
node .\cli.js serve
```

Print IDE-ready local settings:

```
node .\cli.js ide
node .\cli.js ide --json
```

Or use the guided Cursor integration flow:

```
node .\cli.js cursor
```

The bridge now exposes an OpenAI-compatible local endpoint:

```
http://127.0.0.1:3456/v1
```

Use this local endpoint inside Cursor or Cline instead of your upstream proxy:

- Base URL: `http://127.0.0.1:3456/v1`
- API Key: `bridge-local`
- Recommended model: `gpt-5.2-codex`
### Cursor

In Settings -> Models -> API Keys:

- enable `OpenAI API Key`
- set the key to `bridge-local`
- enable `Override OpenAI Base URL`
- set the base URL to `http://127.0.0.1:3456/v1`
- choose `gpt-5.2-codex`
If your Cursor plan blocks native BYOK flows, install Cline inside Cursor and use
the Cline setup below, or run `crb cursor` to set up Continue automatically.
### VSCode / Cursor + Cline

In Cline settings:

- API Provider: `OpenAI Compatible`
- Base URL: `http://127.0.0.1:3456/v1`
- API Key: `bridge-local`
- Model ID: `gpt-5.2-codex`
- Native Tool Call: optional, but supported by the bridge
## Smart Routing

This bridge is no longer just a static relay. It now supports:

- `single`: always use the selected provider
- `failover`: use the selected provider first, then automatically retry other enabled providers
- `round-robin`: distribute requests across enabled providers
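As a rough illustration (not the bridge's actual implementation, which also tracks live provider health), the three modes boil down to a provider selection policy like this, using hypothetical provider names:

```python
# Illustrative provider selection for the three route modes.
# Provider names are hypothetical; the real bridge also consults
# health data from /bridge/status and retries failed upstreams.
from itertools import cycle

ENABLED = ["main", "backup"]   # enabled providers
_rr = cycle(ENABLED)           # round-robin rotation state

def candidates(mode: str, selected: str = "main") -> list[str]:
    """Return the providers to try, in order, for one request."""
    if mode == "single":
        return [selected]
    if mode == "failover":
        # selected provider first, the rest as fallbacks
        return [selected] + [p for p in ENABLED if p != selected]
    if mode == "round-robin":
        return [next(_rr)]
    raise ValueError(f"unknown route mode: {mode}")
```

In `failover` mode every enabled provider ends up in the candidate list, so a request only fails once all upstreams have been tried.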
Show the current route mode:
```
crb route show
```

Switch to automatic failover:

```
crb route set failover
```

Inspect live provider health:

```
crb status
```

Local status endpoint:

```
GET /bridge/status
```

## Config File
The config format now supports multiple providers. A simplified example:

```json
{
  "schemaVersion": 2,
  "port": 3456,
  "listenHost": "127.0.0.1",
  "upstreamBaseUrl": "https://your-upstream.example.com",
  "apiKey": "<YOUR_ACTIVE_PROVIDER_TOKEN>",
  "requestTimeoutMs": 600000,
  "selectedProviderId": "main",
  "providers": [
    {
      "id": "main",
      "name": "Main Provider",
      "baseUrl": "https://your-upstream.example.com",
      "apiKey": "<YOUR_ACTIVE_PROVIDER_TOKEN>"
    }
  ],
  "modelMap": {
    "default": "gpt-5.1-codex",
    "opus": "gpt-5.1-codex-max",
    "sonnet": "gpt-5.1-codex",
    "haiku": "gpt-5.1-codex-mini"
  }
}
```

Older configs with only `upstreamBaseUrl` and `apiKey` still work. They are
normalized into the new provider model when the CLI loads them.
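The normalization step can be pictured with a small sketch. This only mirrors the behavior described above; the CLI's actual field handling may differ:

```python
# Sketch of upgrading a legacy single-provider config into the
# multi-provider shape. Field handling here is illustrative, not
# the CLI's actual code.
def normalize(config: dict) -> dict:
    if "providers" in config:
        return config  # already in the new format
    upgraded = dict(config, schemaVersion=2, selectedProviderId="main")
    upgraded["providers"] = [{
        "id": "main",
        "name": "Main Provider",
        "baseUrl": config["upstreamBaseUrl"],
        "apiKey": config["apiKey"],
    }]
    return upgraded

legacy = {"upstreamBaseUrl": "https://your-upstream.example.com",
          "apiKey": "sk-xxxx"}
new = normalize(legacy)
```

Either way, the bridge always sees a `providers` list with a `selectedProviderId` once the config is loaded.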
## Endpoints

- `GET /health`
- `GET /models`
- `GET /v1/models`
- `GET /v1/models/:id`
- `POST /chat/completions`
- `POST /v1/chat/completions`
- `POST /responses`
- `POST /v1/responses`
- `POST /v1/messages`
- `POST /v1/messages/count_tokens`
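For example, a Claude-style request to the local `/v1/messages` endpoint can be assembled with nothing but the Python standard library. The body follows the standard Anthropic Messages shape; `urlopen` is left commented out because it needs a running `crb serve`:

```python
# Build (but do not send) a Claude-style request against the local bridge.
# The bridge translates this body into an upstream /v1/responses call.
import json
import urllib.request

BRIDGE = "http://127.0.0.1:3456"

body = {
    "model": "sonnet",  # alias, remapped via modelMap (e.g. to gpt-5.1-codex)
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Reply with just OK."}],
}
req = urllib.request.Request(
    f"{BRIDGE}/v1/messages",
    data=json.dumps(body).encode(),
    headers={"content-type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # dispatch once `crb serve` is running
```

The same pattern works for the OpenAI-compatible routes: point the URL at `/v1/chat/completions` and use a Chat Completions body instead.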
## Notes

- `config.local.json` remains ignored by git.
- Keep real domains and tokens out of screenshots and logs.
- Use `config.example.json` as a safe public example.
