@nzpr/codex-responses-api-proxy

v0.0.0-20260324.75c7f8518

Local proxy for Codex CLI with auth.json/API-key auth and explicit Unix-socket secret redaction.

@nzpr/codex-responses-api-proxy

@nzpr/codex-responses-api-proxy is a modified fork of OpenAI's Codex responses proxy. It is designed to pair with the standard Codex CLI and can authenticate with your usual ~/.codex/auth.json login rather than requiring an API key.

It runs as a local proxy in front of Codex CLI and redacts only the secret values that another local process explicitly sends over a Unix socket before forwarding requests upstream.

This package distributes the prebuilt Codex Responses API proxy binary for macOS and Linux.

What This Is For

Use this package if you want:

  • Codex CLI to keep using your normal ChatGPT or Codex CLI login from auth.json
  • a local proxy layer between Codex CLI and the upstream responses endpoint
  • explicit secret redaction before requests leave your machine
  • an optional Unix socket where another local process can push extra secrets to redact

This package does not replace Codex CLI. You install Codex separately and point it at this proxy.

Quickstart

Install the package globally:

npm i -g @nzpr/codex-responses-api-proxy

Confirm the binary is available:

codex-responses-api-proxy --help

Use Your Existing Codex Login

If you already use Codex CLI with auth.json, start the proxy like this:

codex-responses-api-proxy --auth-json --http-shutdown --server-info /tmp/server-info.json

This reads auth from CODEX_HOME/auth.json (default ~/.codex/auth.json).

If the auth in auth.json is a ChatGPT login, the proxy automatically:

  • uses https://chatgpt.com/backend-api/codex/responses as the upstream
  • forwards ChatGPT-Account-ID when present

Push Extra Secrets Over A Unix Socket

If you want another local process to supply additional secrets for redaction, start the proxy with --secret-socket /tmp/codex-secrets.sock. Only the values sent over that socket are filtered:

codex-responses-api-proxy \
  --auth-json \
  --secret-socket /tmp/codex-secrets.sock \
  --http-shutdown \
  --server-info /tmp/server-info.json

Then connect to that Unix socket and send one of:

  • a JSON array of strings
  • a JSON object of NAME: value pairs
  • newline-delimited strings
  • newline-delimited NAME=value or NAME: value entries

Example:

python3 - <<'PY'
import json
import socket

payload = json.dumps(["internal-token-1", "db-password-2"]).encode()
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.connect("/tmp/codex-secrets.sock")
sock.sendall(payload)
sock.close()
PY

For NAME=value / object input, the proxy redacts only the values, so environment-variable names stay visible in forwarded requests. Each write to the socket replaces the previously supplied list for subsequent requests. If you never send any secrets, nothing is redacted.
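The NAME=value form works the same way as the JSON-array example above. As a sketch (the secret names and values here are placeholders), the payload is just newline-delimited entries:

```python
import os
import socket

# Placeholder path; must match the --secret-socket flag you started the proxy with.
SOCKET_PATH = "/tmp/codex-secrets.sock"

def build_name_value_payload(secrets):
    # Newline-delimited NAME=value entries. The proxy redacts only the
    # values, so the names stay visible in forwarded requests.
    return "\n".join(f"{name}={value}" for name, value in secrets.items()).encode()

def send_payload(socket_path, payload):
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        sock.connect(socket_path)
        sock.sendall(payload)
    finally:
        sock.close()

payload = build_name_value_payload({
    "INTERNAL_TOKEN": "internal-token-1",  # placeholder secrets
    "DB_PASSWORD": "db-password-2",
})

# Only attempt the send when the proxy is actually listening.
if os.path.exists(SOCKET_PATH):
    send_payload(SOCKET_PATH, payload)
```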

Use An API Key

If you want to run the proxy with an API key instead, pipe the key in on stdin while removing it from the environment so it is never exposed to child processes:

printenv OPENAI_API_KEY | env -u OPENAI_API_KEY \
  codex-responses-api-proxy --http-shutdown --server-info /tmp/server-info.json

Point Codex At The Proxy

Read the selected port from the server-info file written at startup:

PROXY_PORT=$(jq .port /tmp/server-info.json)
PROXY_BASE_URL="http://127.0.0.1:${PROXY_PORT}"

Run Codex through the proxy:

codex exec \
  -c "model_providers.openai_proxy={ name='OpenAI Proxy', base_url='${PROXY_BASE_URL}/v1', wire_api='responses' }" \
  -c "model_provider='openai_proxy'" \
  "Your prompt here"

You can pass the same -c settings to an interactive codex session as well.

When finished, stop the proxy:

curl --fail --silent --show-error "${PROXY_BASE_URL}/shutdown"
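The same port discovery and shutdown can be scripted. A minimal Python sketch, assuming only the port field in the server-info file that the jq example above already relies on:

```python
import json
import urllib.request

def base_url_from_server_info(path):
    # The file written via --server-info records the port the proxy chose.
    with open(path) as fh:
        info = json.load(fh)
    return f"http://127.0.0.1:{info['port']}"

def shutdown_proxy(base_url):
    # Calls the /shutdown endpoint enabled by --http-shutdown.
    with urllib.request.urlopen(f"{base_url}/shutdown") as resp:
        return resp.status

# With a running proxy:
# base = base_url_from_server_info("/tmp/server-info.json")
# shutdown_proxy(base)
```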

More Docs

For the full CLI reference and behavior details, see:

Notes

  • macOS and Linux vendor binaries are included in the npm package.
  • --auth-json is the easiest option if you already use Codex CLI with ChatGPT sign-in.
  • --server-info is the easiest way to discover the local port that was selected.
  • --secret-socket is the only source of redacted secret values.
  • The main use case is Codex CLI with normal auth.json auth plus explicit socket-fed redaction.