# @yadimon/codex-to-llm-server

v1.0.0

Use Codex as a simple OpenAI-compatible Responses server.
OpenAI-compatible Responses server on top of the raw prompt core in `@yadimon/codex-to-llm`.
## Install
```sh
npm install -g @yadimon/codex-to-llm-server
```

Or run it without installing globally:

```sh
npx @yadimon/codex-to-llm-server
```

Requirements:

- Node.js >= 20
- an installed `codex` CLI in `PATH` or `CODEX_TO_LLM_CLI_PATH`
- valid Codex auth in `~/.codex/auth.json` or `CODEX_TO_LLM_AUTH_PATH`
## Endpoints

- `POST /v1/responses`
- `GET /v1/models`
- `GET /healthz`
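Besides curl, the Responses endpoint can be called from Node.js 18+ with the built-in `fetch`/`Request` API. A minimal sketch: host, port, and model name assume the server defaults, and the request is only constructed here, not sent.

```typescript
// Build (but do not send) a Responses API request.
// Host/port and model name assume the server defaults.
const req = new Request("http://127.0.0.1:3000/v1/responses", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "gpt-5.3-codex-spark",
    input: "Say hello in one short sentence.",
  }),
});

// With the server running, sending it is:
//   const res = await fetch(req);
//   console.log(await res.json());
console.log(req.method, req.url);
```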
## Start

```sh
npx @yadimon/codex-to-llm-server
```

Then call:

```sh
curl http://127.0.0.1:3000/healthz
curl http://127.0.0.1:3000/v1/models
```

Example response request:

```sh
curl http://127.0.0.1:3000/v1/responses \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.3-codex-spark",
    "input": "Say hello in one short sentence."
  }'
```

Streaming example:

```sh
curl http://127.0.0.1:3000/v1/responses \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.3-codex-spark",
    "stream": true,
    "input": "Count from 1 to 3."
  }'
```

Local development commands:

```sh
npm run start --workspace @yadimon/codex-to-llm-server
npm run start:mock --workspace @yadimon/codex-to-llm-server
```

## Authentication
If you set `CODEX_TO_LLM_SERVER_API_KEY`, only `POST /v1/responses` requires a bearer token; `GET /healthz` and `GET /v1/models` stay public.
Example:

```sh
curl http://127.0.0.1:3000/v1/responses \
  -H "Authorization: Bearer your-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.3-codex-spark",
    "input": "Hello"
  }'
```

## Runtime Configuration
| Variable | Default | Description |
|---|---|---|
| `CODEX_TO_LLM_SERVER_HOST` | `127.0.0.1` | HTTP bind host. |
| `CODEX_TO_LLM_SERVER_PORT` | `3000` | HTTP bind port. |
| `CODEX_TO_LLM_SERVER_DEFAULT_MODEL` | `gpt-5.3-codex-spark` | Fallback model when the request omits `model`. |
| `CODEX_TO_LLM_SERVER_MODELS` | default model | Comma-separated allowlist of accepted models. |
| `CODEX_TO_LLM_SERVER_API_KEY` | - | Bearer token accepted for `POST /v1/responses`. |
| `CODEX_TO_LLM_SERVER_MOCK_MODE` | - | Enables the mock runner for local testing. |
| `CODEX_TO_LLM_SERVER_MOCK_RESPONSE` | `mock response` | Mock response text returned by the mock runner. |
| `CODEX_TO_LLM_AUTH_PATH` | `~/.codex/auth.json` | Path to the Codex auth file. |
| `CODEX_TO_LLM_CLI_PATH` | `codex` | Path to the Codex CLI binary. |
| `CODEX_TO_LLM_CONFIG_HOME` | temp dir | Temporary Codex config directory for a run. |
| `CODEX_TO_LLM_WORKSPACE` | temp dir | Workspace directory used for Codex execution. |
| `CODEX_TO_LLM_WEB_SEARCH` | disabled | Web search mode forwarded to the core runner. |
| `CODEX_TO_LLM_IGNORE_RULES` | `false` | When truthy, passes `--ignore-rules` to the core runner. |
| `CODEX_TO_LLM_IGNORE_USER_CONFIG` | `false` | When truthy, passes `--ignore-user-config` to the core runner. |
| `CODEX_TO_LLM_REASONING_EFFORT` | `low` | Default reasoning effort passed to the core runner. |
| `CODEX_TO_LLM_SANDBOX` | `read-only` | Default sandbox mode passed to the core runner. |
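These variables compose on the command line. An illustrative invocation that changes the port, restricts the model allowlist, and enables bearer auth (all values are placeholders):

```sh
CODEX_TO_LLM_SERVER_PORT=8080 \
CODEX_TO_LLM_SERVER_MODELS=gpt-5.3-codex-spark \
CODEX_TO_LLM_SERVER_API_KEY=your-token \
npx @yadimon/codex-to-llm-server
```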
## Behavior Notes

- `GET /healthz` and `GET /v1/models` stay public even when bearer auth is configured
- `POST /v1/responses` validates requested models against `CODEX_TO_LLM_SERVER_MODELS`
- `max_output_tokens` and `reasoning.effort` are forwarded to the core runner
- the server CLI supports `--search`, `--web-search`, `--ignore-rules`, and `--ignore-user-config`
- unsupported request fields such as `tools`, `tool_choice`, or `input_image` return `400`
- the server owns prompt adaptation for `instructions` and multi-message dialog input before calling the raw core runner
- streaming emits one `response.output_text.delta` per Codex `agent_message`, not per token; clients expecting token-level deltas will see one large delta followed by `response.completed`
- multi-message dialog input is flattened into a single text prompt with `### role` headers; user-supplied content is not escaped, so a message that mimics those headers is observable in the prompt the model receives. Validate untrusted input upstream before forwarding it
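The flattening described in the last note can be pictured with a small sketch. This is illustrative only — `flattenDialog` and the `Message` shape are hypothetical, not the package's API — but it shows why unescaped user content containing `### role` headers blends into the prompt:

```typescript
type Message = { role: string; content: string };

// Hypothetical illustration of the documented behavior: dialog messages
// are joined into one text prompt under "### role" headers, with no
// escaping of user-supplied content.
function flattenDialog(messages: Message[]): string {
  return messages.map((m) => `### ${m.role}\n${m.content}`).join("\n\n");
}

const prompt = flattenDialog([
  { role: "system", content: "Be brief." },
  { role: "user", content: "Say hello." },
]);
console.log(prompt);
```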
## Docker

Build from the repository root (the `-t` tag lets the run command below find the image):

```sh
docker build -t codex-to-llm-server -f packages/codex-to-llm-server/Dockerfile .
```

Run with the Codex auth file mounted read-only:

```sh
docker run -p 3000:3000 -v ~/.codex/auth.json:/run/secrets/codex-auth.json:ro \
  -e CODEX_TO_LLM_AUTH_PATH=/run/secrets/codex-auth.json codex-to-llm-server
```

## Development
```sh
npm run build --workspace @yadimon/codex-to-llm-server
npm run lint --workspace @yadimon/codex-to-llm-server
npm run typecheck --workspace @yadimon/codex-to-llm-server
```