@nzpr/codex-responses-api-proxy
v0.0.0-20260324.75c7f8518
Local proxy for Codex CLI with auth.json/API-key auth and explicit Unix-socket secret redaction.
@nzpr/codex-responses-api-proxy
@nzpr/codex-responses-api-proxy is a modified fork of OpenAI's Codex responses proxy. It is meant to be paired with the normal Codex CLI and can authenticate using your usual ~/.codex/auth.json login, not only an API key.
It runs as a local proxy in front of Codex CLI and, before forwarding each request upstream, redacts only those secret values that another local process has explicitly sent over a Unix socket.
This package distributes the prebuilt Codex Responses API proxy binary for macOS and Linux.
What This Is For
Use this package if you want:
- Codex CLI to keep using your normal ChatGPT or Codex CLI login from auth.json
- a local proxy layer between Codex CLI and the upstream responses endpoint
- explicit secret redaction before requests leave your machine
- an optional Unix socket where another local process can push extra secrets to redact
This package does not replace Codex CLI. You install Codex separately and point it at this proxy.
Quickstart
Install the package globally:
```
npm i -g @nzpr/codex-responses-api-proxy
```
Confirm the binary is available:
```
codex-responses-api-proxy --help
```
Use Your Existing Codex Login
If you already use Codex CLI with auth.json, start the proxy like this:
```
codex-responses-api-proxy --auth-json --http-shutdown --server-info /tmp/server-info.json
```
This reads auth from CODEX_HOME/auth.json (default ~/.codex/auth.json).
If the auth in auth.json is a ChatGPT login, the proxy automatically:
- uses https://chatgpt.com/backend-api/codex/responses as the upstream
- forwards ChatGPT-Account-ID when present
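You can check which kind of credential your auth.json holds before starting the proxy. A minimal sketch, assuming a ChatGPT login is stored under a "tokens" object and an API-key login under an "OPENAI_API_KEY" field (these field names are an assumption about the auth.json layout, not something this package guarantees):

```python
import json
import os


def describe_auth(path="~/.codex/auth.json"):
    """Report which kind of credential an auth.json appears to hold.

    Assumes a ChatGPT login stores a "tokens" object and an API-key
    login stores an "OPENAI_API_KEY" field -- an assumption about the
    file layout, not a spec.
    """
    path = os.path.expanduser(path)
    if not os.path.exists(path):
        return "missing"
    with open(path) as f:
        data = json.load(f)
    if data.get("tokens"):
        return "chatgpt-login"
    if data.get("OPENAI_API_KEY"):
        return "api-key"
    return "unknown"
```

A "chatgpt-login" result is the case where the proxy switches to the chatgpt.com upstream described above.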
Push Extra Secrets Over A Unix Socket
If you want another local process to supply additional secrets for redaction, start the proxy with --secret-socket /tmp/codex-secrets.sock. Only the values sent over that socket are filtered:
```
codex-responses-api-proxy \
  --auth-json \
  --secret-socket /tmp/codex-secrets.sock \
  --http-shutdown \
  --server-info /tmp/server-info.json
```
Then connect to that Unix socket and send any of:
- a JSON array of strings
- a JSON object of NAME: value pairs
- newline-delimited strings
- newline-delimited NAME=value or NAME: value entries
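The four accepted shapes can be illustrated with a small normalizer. This is a sketch of how such payloads could be reduced to a flat list of values to redact; it is not the proxy's actual parser:

```python
import json


def parse_secret_payload(raw: bytes) -> list:
    """Normalize a secret payload into a flat list of values to redact.

    Mirrors the documented input shapes: a JSON array, a JSON object of
    NAME: value pairs, or newline-delimited strings / NAME=value /
    NAME: value entries. Illustrative only, not the proxy's real code.
    """
    text = raw.decode().strip()
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        data = None
    if isinstance(data, list):
        return [str(v) for v in data]
    if isinstance(data, dict):
        # Only the values are kept, so the NAMEs stay visible.
        return [str(v) for v in data.values()]
    values = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if "=" in line:
            values.append(line.split("=", 1)[1])
        elif ": " in line:
            values.append(line.split(": ", 1)[1])
        else:
            values.append(line)
    return values
```

Note that in every shape the names are discarded and only the values feed redaction, matching the behavior described below.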
Example:
```
python3 - <<'PY'
import json
import socket

# Send a JSON array of secret values to the proxy's redaction socket.
payload = json.dumps(["internal-token-1", "db-password-2"]).encode()
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.connect("/tmp/codex-secrets.sock")
sock.sendall(payload)
sock.close()
PY
```
For NAME=value / object input, the proxy uses only the values for redaction, so env var names stay visible. Each socket write replaces the previous socket-provided list for subsequent requests. If you never send any secrets, nothing is redacted.
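Conceptually, redaction is a substitution over the outgoing request body before it leaves your machine. A sketch of the idea; the proxy's actual replacement marker and matching rules are not specified here, and `***` is a made-up placeholder:

```python
def redact(body: str, secrets: list, marker: str = "***") -> str:
    """Replace every occurrence of each secret value in a request body.

    Longer secrets are replaced first so a secret that is a substring
    of another does not leave fragments behind. The marker is
    illustrative; the real proxy's output may differ.
    """
    for secret in sorted(secrets, key=len, reverse=True):
        if secret:
            body = body.replace(secret, marker)
    return body
```

Because only socket-supplied values are in the list, anything you never send over the socket passes through untouched.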
Use An API Key
If you want to start the proxy with an API key instead:
```
printenv OPENAI_API_KEY | env -u OPENAI_API_KEY \
  codex-responses-api-proxy --http-shutdown --server-info /tmp/server-info.json
```
Point Codex At The Proxy
Read the port from the startup file:
```
PROXY_PORT=$(jq .port /tmp/server-info.json)
PROXY_BASE_URL="http://127.0.0.1:${PROXY_PORT}"
```
Run Codex through the proxy:
```
codex exec \
  -c "model_providers.openai_proxy={ name='OpenAI Proxy', base_url='${PROXY_BASE_URL}/v1', wire_api='responses' }" \
  -c "model_provider='openai_proxy'" \
  "Your prompt here"
```
You can use the same -c settings with interactive codex as well.
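If you prefer not to repeat the -c overrides, the same provider can be declared in Codex's config file. A sketch, assuming the standard Codex CLI config at ~/.codex/config.toml; note that config.toml does not expand environment variables, so the port must be written literally (12345 below is a placeholder for the value read from /tmp/server-info.json):

```toml
# ~/.codex/config.toml -- sketch; replace 12345 with the port from
# /tmp/server-info.json (config.toml does not expand env vars)
model_provider = "openai_proxy"

[model_providers.openai_proxy]
name = "OpenAI Proxy"
base_url = "http://127.0.0.1:12345/v1"
wire_api = "responses"
```

Since the proxy may pick a different port on each start, the -c override form above is more convenient when the port is dynamic.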
When finished, stop the proxy:
```
curl --fail --silent --show-error "${PROXY_BASE_URL}/shutdown"
```
More Docs
For the full CLI reference and behavior details, see:
Notes
- macOS and Linux vendor binaries are included in the npm package.
- --auth-json is the easiest option if you already use Codex CLI with ChatGPT sign-in.
- --server-info is the easiest way to discover the local port that was selected.
- --secret-socket is the only source of redacted secret values.
- The main use case is Codex CLI with normal auth.json auth plus explicit socket-fed redaction.
