# Claude Code Proxy - Claude Code to GLM-4.5-Air

A format conversion proxy server that enables using the GLM-4.5-Air model with the Claude Code CLI, including full tool calling support.

## What It Does

This proxy acts as a bridge between Claude Code and the GLM-4.5-Air model, converting formats in both directions:
- ✅ Receives requests from Claude Code in Anthropic's Messages API format
- ✅ Converts to OpenAI Chat Completions format
- ✅ Forwards to GLM-4.5-Air API
- ✅ Converts responses back to Anthropic format
- ✅ Full tool calling support (Anthropic `tool_use` ↔ OpenAI `tool_calls`)
- ✅ Handles both streaming and non-streaming responses
- ✅ Comprehensive logging for debugging
**Result:** Use Claude Code with the free GLM-4.5-Air model while keeping full functionality!
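The heart of the proxy is the `tool_use` ↔ `tool_calls` translation. A minimal sketch of that mapping is shown below (hypothetical helper names, types trimmed to the relevant fields; the actual proxy handles many more cases). The key structural difference: Anthropic's `input` is a JSON object, while OpenAI's `arguments` is a JSON-encoded string.

```typescript
// Sketch of the tool-call translation the proxy performs in both directions.
// Hypothetical names; not the proxy's actual source.

interface AnthropicToolUse {
  type: "tool_use";
  id: string;
  name: string;
  input: Record<string, unknown>; // JSON object
}

interface OpenAIToolCall {
  id: string;
  type: "function";
  function: { name: string; arguments: string }; // arguments is a JSON string
}

function toolUseToToolCall(block: AnthropicToolUse): OpenAIToolCall {
  return {
    id: block.id,
    type: "function",
    function: { name: block.name, arguments: JSON.stringify(block.input) },
  };
}

function toolCallToToolUse(call: OpenAIToolCall): AnthropicToolUse {
  return {
    type: "tool_use",
    id: call.id,
    name: call.function.name,
    input: JSON.parse(call.function.arguments) as Record<string, unknown>,
  };
}
```

Because the mapping is lossless in both directions, the proxy can round-trip a tool call through the GLM backend without Claude Code noticing the format change.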
## Quick Start (NPM Installation)

Three simple steps to get started:

1. **Install globally via npm**

   ```bash
   npm install -g claude-code-proxy
   ```

2. **Start the proxy**

   ```bash
   claude-code-proxy              # Normal mode
   claude-code-proxy --debug      # Debug mode (verbose logging)
   claude-code-proxy --port 4000  # Custom port
   claude-code-proxy --help       # Show all options
   ```

3. **Configure Claude Code**

   In a new terminal:

   ```bash
   export ANTHROPIC_AUTH_TOKEN="dummy"
   export ANTHROPIC_BASE_URL="http://localhost:3333"
   claude
   ```

That's it! Start using Claude Code with free Chutes GLM models.
## Local Development Setup

1. **Install dependencies**

   ```bash
   npm install
   ```

2. **Configure environment variables**

   ```bash
   cp .env.example .env
   # Edit .env with your GLM-4.5-Air API credentials
   ```

3. **Start the proxy server**

   Normal mode (minimal logging):

   ```bash
   npm run proxy
   ```

   Debug mode (verbose logging):

   ```bash
   npm run proxy:debug
   ```
## Configure Claude Code

To use the proxy with the Claude Code CLI:

**Option 1: Environment variables (recommended)**

```bash
export ANTHROPIC_AUTH_TOKEN="dummy"
export ANTHROPIC_BASE_URL="http://localhost:3333"
claude
```

**Option 2: Update `~/.claude/settings.json`**

```json
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "dummy",
    "ANTHROPIC_BASE_URL": "http://localhost:3333"
  }
}
```

## Usage
Start the proxy server (in one terminal):

```bash
npm run proxy        # or, for debug mode with full logs: npm run proxy:debug
```

Use Claude Code normally (in another terminal):

```bash
export ANTHROPIC_AUTH_TOKEN="dummy"
export ANTHROPIC_BASE_URL="http://localhost:3333"
claude
```

Claude Code will now use the GLM-4.5-Air model with full tool calling support!

To switch back to regular Claude:

```bash
unset ANTHROPIC_AUTH_TOKEN ANTHROPIC_BASE_URL
claude
```
## Configuration

The proxy uses the following environment variables (in `.env`):

- `GLM_API_TOKEN`: Your GLM-4.5-Air API token (required)
- `GLM_API_URL`: GLM-4.5-Air API endpoint (default: `https://llm.chutes.ai/v1/chat/completions`)
- `GLM_MODEL`: Model to use (default: `zai-org/GLM-4.5-Air`)
- `PORT`: Proxy server port (default: `3333`)
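A `.env` file using these variables might look like the following (the token value is a placeholder; the other values are the defaults listed above):

```bash
# .env - GLM-4.5-Air proxy configuration
# Required: your Chutes API token
GLM_API_TOKEN=your-chutes-api-token-here
# Optional overrides (defaults shown)
GLM_API_URL=https://llm.chutes.ai/v1/chat/completions
GLM_MODEL=zai-org/GLM-4.5-Air
PORT=3333
```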
## Features

- ✅ **Full tool calling support** - Anthropic `tool_use` ↔ OpenAI `tool_calls` conversion
- ✅ **Streaming responses** - Real-time token-by-token output
- ✅ **Comprehensive logging** - Color-coded debug logs for troubleshooting
- ✅ **Format conversion** - Seamless Anthropic ↔ OpenAI translation
- ✅ **Free GLM-4.5-Air API** - Use GLM-4.5-Air at no cost
- ✅ **Multiple models** - Maps the Haiku, Sonnet, and Opus tiers to GLM
- ✅ **TypeScript support** - Full type definitions and IntelliSense support
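The "multiple models" behavior can be pictured as a small mapping function (a hypothetical sketch under the assumption that all three Claude tiers resolve to the single configured GLM model; the actual mapping lives in the proxy's source):

```typescript
// Hypothetical sketch: map incoming Anthropic model IDs to the GLM backend model.
// Assumes every Claude tier resolves to the one configured GLM model.

const GLM_MODEL = "zai-org/GLM-4.5-Air";

function mapModel(anthropicModel: string): string {
  // Haiku, Sonnet, and Opus requests all resolve to the GLM backend model.
  if (/haiku|sonnet|opus/i.test(anthropicModel)) {
    return GLM_MODEL;
  }
  // Unrecognized model IDs pass through unchanged.
  return anthropicModel;
}
```

With a mapping like this, Claude Code's built-in model switching keeps working even though every request ultimately reaches the same backend.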
## Logging

The proxy provides color-coded logs for easy debugging:
- 📥 Blue - Incoming requests from Claude Code
- 🔄 Magenta - Format conversion operations
- 📤 Cyan - Requests forwarded to Chutes API
- ✅ Green - Successful operations
- ❌ Red - Errors and failures
- 🌊 Cyan - Streaming responses
- **Normal mode:** Clean, minimal logs
- **Debug mode:** Full request/response bodies and conversion details
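In sketch form, a logger like this needs little more than raw ANSI escape codes (a simplified, hypothetical illustration; the proxy's real logger is assumed to be more elaborate):

```typescript
// Simplified sketch of a color-coded logger using ANSI escape codes.
// Hypothetical helper; not the proxy's actual implementation.

const COLORS = {
  blue: "\x1b[34m",    // incoming requests
  magenta: "\x1b[35m", // format conversion
  cyan: "\x1b[36m",    // forwarded requests / streaming
  green: "\x1b[32m",   // success
  red: "\x1b[31m",     // errors
  reset: "\x1b[0m",
} as const;

type Color = Exclude<keyof typeof COLORS, "reset">;

let debugMode = false; // toggled by the --debug flag in the real CLI

function log(color: Color, label: string, message: string, body?: unknown): void {
  console.log(`${COLORS[color]}${label}${COLORS.reset} ${message}`);
  // Full request/response bodies are printed only in debug mode.
  if (debugMode && body !== undefined) {
    console.log(JSON.stringify(body, null, 2));
  }
}
```

Usage might look like `log("blue", "[IN]", "POST /v1/messages")`, keeping normal-mode output to one colored line per event.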
## Documentation

- `RESEARCH.md` - Complete research findings and solution approaches
- `TEST-RESULTS.md` - GLM-4.5-Air API test results
- `TESTING-GUIDE.md` - Comprehensive guide for testing and debugging the proxy
- `TYPESCRIPT.md` - TypeScript usage guide and type definitions

## Support

For issues or inquiries, contact: [email protected]

## License

MIT
