openai-claude v1.0.2
Anthropic-compatible proxy for Azure OpenAI so Claude Code can use GPT deployments.
# Claude Code Proxy

A TypeScript server that exposes an Anthropic-compatible API backed by Azure OpenAI, allowing Claude Code to communicate with Azure deployments via `localhost:9999`.
## Prerequisites

- Node.js 18+
- `~/.codex/config.toml` with an `azure` `model_provider` entry (the existing CLI config works)
- An environment variable matching the `env_key` defined in the config (e.g. `OPENAI_API_KEY`)
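For orientation, a minimal `~/.codex/config.toml` with an `azure` provider might look like the sketch below. The resource URL and deployment name are placeholders, not values this proxy requires; adapt them to your Azure setup.

```toml
# Hypothetical example config — substitute your own Azure resource and deployment.
model = "gpt-5-codex"
model_provider = "azure"
model_reasoning_effort = "medium"   # optional; forwarded as reasoning.effort

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://your-resource.openai.azure.com/openai"
env_key = "OPENAI_API_KEY"          # the proxy reads the key from this variable
wire_api = "responses"
```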
## Install

```bash
npm install
```

## Run

```bash
# optional: override port
export PORT=9999
npm run dev
```

Or run the compiled CLI directly:

```bash
npm run build
node dist/cli.js --port 9999
```

After publishing you can install it globally:

```bash
npm install -g openai-claude
openai-claude --port 9999 --host 0.0.0.0
```

Set Claude Code to use the proxy:

```bash
export ANTHROPIC_BASE_URL="http://0.0.0.0:9999"
export ANTHROPIC_AUTH_TOKEN="your-shared-secret"
export ANTHROPIC_MODEL="gpt-5-codex"
# optional: verbose proxy logging
# export DEBUG=true
```

By default the server requires `x-api-key` or `Authorization: Bearer` auth that matches `ANTHROPIC_AUTH_TOKEN`. If you omit that environment variable, the proxy skips authentication.
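The auth rule amounts to comparing the incoming `x-api-key` or `Authorization: Bearer` header against the configured token. A minimal sketch of that logic (the function name and header handling here are illustrative, not the proxy's actual source):

```typescript
// Illustrative sketch of the proxy's auth rule, not its real implementation.
// If no expected token is configured, every request is allowed through.
function isAuthorized(
  headers: Record<string, string | undefined>,
  expectedToken: string | undefined,
): boolean {
  if (!expectedToken) return true; // ANTHROPIC_AUTH_TOKEN unset: auth skipped

  if (headers["x-api-key"] === expectedToken) return true;

  const auth = headers["authorization"] ?? "";
  return auth === `Bearer ${expectedToken}`;
}

// Both header styles carry the same shared secret.
console.log(isAuthorized({ "x-api-key": "s3cret" }, "s3cret"));          // true
console.log(isAuthorized({ authorization: "Bearer s3cret" }, "s3cret")); // true
console.log(isAuthorized({}, undefined));                                // true (auth skipped)
console.log(isAuthorized({}, "s3cret"));                                 // false
```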
## Notes

- The proxy reads `~/.codex/config.toml` on startup to discover the Azure endpoint, wire API, and API key environment variable.
- If the configured `env_key` (e.g. `OPENAI_API_KEY`) is unset, the proxy falls back to `ANTHROPIC_AUTH_TOKEN` so you can reuse the same secret for both Claude Code and Azure.
- An optional `model_reasoning_effort` in `~/.codex/config.toml` is forwarded to Azure via the `reasoning.effort` field unless a request provides its own override.
- `/v1/messages` now supports both standard and streaming responses. Streaming is proxied as SSE using Anthropic-compatible events.
- Tool/function calls are translated between the Anthropic and Azure `responses` formats, so Claude Code can invoke tools and return results through the proxy.
- Requests are forwarded to the Azure `responses` API and the response is translated back to the Anthropic message format expected by Claude Code.
- Tool outputs should be returned to the assistant as Anthropic `tool_result` content blocks; the proxy relays them to Azure as `function_call_output` entries automatically.
- `npm run build` is executed automatically before `npm publish` so the generated `dist/` folder is packaged; the published module exposes a global `openai-claude` executable.
- Image content blocks from Claude Code (`type: "image"`) are converted to Azure `input_image` payloads, so vision prompts work through the proxy.
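As an illustration of the tool-output translation described above, here is a simplified sketch of mapping an Anthropic `tool_result` content block to an Azure `responses`-style `function_call_output` item. The field names follow the public Anthropic Messages API and the `responses` API; the proxy's actual mapping handles more cases (image results, multiple blocks) and may differ in detail.

```typescript
// Simplified sketch of the tool_result -> function_call_output mapping.
interface AnthropicToolResult {
  type: "tool_result";
  tool_use_id: string; // id of the tool_use block this result answers
  content: string | { type: "text"; text: string }[];
}

interface AzureFunctionCallOutput {
  type: "function_call_output";
  call_id: string; // matched back to the original function call
  output: string;
}

function toFunctionCallOutput(block: AnthropicToolResult): AzureFunctionCallOutput {
  // Anthropic allows a plain string or a list of text blocks; flatten both to one string.
  const output =
    typeof block.content === "string"
      ? block.content
      : block.content.map((c) => c.text).join("\n");
  return { type: "function_call_output", call_id: block.tool_use_id, output };
}

const translated = toFunctionCallOutput({
  type: "tool_result",
  tool_use_id: "toolu_123",
  content: [{ type: "text", text: "42" }],
});
console.log(translated.call_id, translated.output); // toolu_123 42
```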
