# codex-cursor
Local OpenAI-compatible HTTP proxy that lets Cursor (or anything else that speaks the OpenAI Chat Completions API) consume your ChatGPT/Codex subscription instead of a metered OpenAI API key.
It reuses the access tokens that the `codex` CLI stores in `~/.codex/auth.json` and translates Cursor's `/v1/chat/completions` traffic into the ChatGPT backend's Responses API.

It calls a private ChatGPT/Codex backend with the same credentials and limits as the `codex` CLI. Personal use only.
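For intuition, here is roughly what that translation looks like. The field names below follow the two public API shapes (Chat Completions in, Responses out); they illustrate the mapping, not the proxy's exact payloads.

```jsonc
// Incoming from Cursor (Chat Completions shape):
{ "model": "gpt-5.3-codex", "messages": [{ "role": "user", "content": "ping" }] }

// Forwarded upstream (Responses API shape, illustrative):
{ "model": "gpt-5.3-codex", "input": [{ "role": "user", "content": "ping" }] }
```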
## Prerequisites

- `bun` installed.
- The `codex` CLI installed and signed in (`codex login`), so `~/.codex/auth.json` exists.
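A quick sanity check of those prerequisites might look like this (a sketch; it only reports what is missing, and resolves the auth path the same way the proxy's `CODEX_SUB_AUTH_PATH` override does):

```sh
# Report any missing prerequisite; exits 0 either way.
command -v bun >/dev/null   || echo "missing: bun"
command -v codex >/dev/null || echo "missing: codex CLI"

# Default auth path, overridable via CODEX_SUB_AUTH_PATH:
AUTH="${CODEX_SUB_AUTH_PATH:-$HOME/.codex/auth.json}"
[ -f "$AUTH" ] || echo "missing: $AUTH (run 'codex login')"
```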
## Run it

```sh
# one-shot, no install:
bunx codex-cursor --api-key "$(openssl rand -hex 16)"

# or from a clone:
bun install
bun run src/index.ts --api-key "$(openssl rand -hex 16)"
```

Flags / env vars:
| Flag | Env var | Default |
| --- | --- | --- |
| `--host <addr>` | `CODEX_SUB_HOST` | `127.0.0.1` |
| `--port <n>` | `CODEX_SUB_PORT` | `4141` |
| `--api-key <secret>` | `CODEX_SUB_API_KEY` | no auth required |
| `--auth-path <path>` | `CODEX_SUB_AUTH_PATH` | `~/.codex/auth.json` |
| `--reasoning-effort <lvl>` | `CODEX_SUB_REASONING_EFFORT` | `xhigh` (`minimal`, `low`, `medium`, `high`, `xhigh`) |
| `--quiet` / `--verbose` / `--log-level <lvl>` | `CODEX_SUB_LOG_LEVEL` | `info` (`quiet`, `info`, `verbose`) |
Always set `--api-key` when exposing the proxy via a tunnel; otherwise the public URL is an open faucet for your Codex subscription.
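With the proxy running, you can smoke-test it from another terminal (a sketch; it assumes the default host/port and that your key is exported as `CODEX_SUB_API_KEY`):

```sh
# One-shot chat completion against the local proxy.
KEY="${CODEX_SUB_API_KEY:-changeme}"   # the hex string you passed to --api-key
BODY='{"model":"gpt-5.3-codex","messages":[{"role":"user","content":"ping"}]}'
RESP=$(curl -sS --connect-timeout 2 http://127.0.0.1:4141/v1/chat/completions \
  -H "Authorization: Bearer $KEY" \
  -H "Content-Type: application/json" \
  -d "$BODY" 2>/dev/null) || RESP="proxy not reachable"
echo "$RESP"
```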
## Expose it to Cursor

Cursor's chat runs on Cursor's cloud backend, which calls your custom base URL. It refuses private addresses, so `http://127.0.0.1:4141/v1` will fail with *Access to private networks is forbidden*.
Use a Cloudflare quick tunnel:

```sh
brew install cloudflared

# terminal A:
bunx codex-cursor --api-key "$(openssl rand -hex 16)"

# terminal B:
cloudflared tunnel --url http://127.0.0.1:4141
```

cloudflared prints a `https://random-words.trycloudflare.com` URL.
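Before wiring up Cursor, you can sanity-check the tunnel from anywhere (a sketch; substitute the URL cloudflared actually printed and your own key):

```sh
TUNNEL="https://random-words.trycloudflare.com"   # <- URL printed by cloudflared
KEY="${CODEX_SUB_API_KEY:-changeme}"
curl -sS --connect-timeout 5 "$TUNNEL/v1/models" \
  -H "Authorization: Bearer $KEY" || echo "tunnel not reachable yet"
```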
## Point Cursor at it

- Cursor → Settings → Models → "OpenAI API Key" panel.
- Toggle **Override OpenAI Base URL** and set:
  - Base URL: `https://<your-tunnel>.trycloudflare.com/v1`
  - API Key: the hex string you passed to `--api-key`.
- Click **Verify**.
- In the model picker, add a custom model. Working slugs: `gpt-5.5`, `gpt-5.4`, `gpt-5.4-mini`, `gpt-5.3-codex`, `gpt-5.3-codex-spark`. The exact list is returned by `GET /v1/models`.
## Caveats

- The proxy issues stateless single-turn calls; Cursor's full history is sent every time.
- If `codex logout` invalidates your refresh token, the proxy fails with `refresh_token_expired` until you `codex login` again.
