claude-code-gemini-proxy
v2.0.1
🚀 Claude Code Gemini Proxy
Use Google Gemini models in Claude Code via Vertex AI
This proxy server transparently translates Anthropic API calls to Google Vertex AI, allowing you to use powerful Gemini models (Flash, Pro, and Pro Preview) directly in Claude Code.
✨ Features
- 🔄 Transparent API Translation - Claude Code thinks it's talking to Claude, but it's actually using Gemini
- ⚡ Multiple Models - Choose between Gemini 2.5 Flash, 2.5 Pro, or 3 Pro Preview
- 🌊 Streaming Support - Real-time streaming responses just like Claude
- 🔐 Flexible Auth - Supports Service Account, gcloud CLI, or Application Default Credentials
- 🎨 Interactive Setup - Simple CLI wizard to configure everything
- 🌍 Multi-Region - Deploy in any GCP region
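To illustrate the streaming feature, here is a rough sketch of how a Gemini streaming chunk could be mapped to an Anthropic-style delta event. The function name, interfaces, and event shapes are illustrative assumptions, not this package's actual internals (the real proxy must also emit `message_start`, `content_block_stop`, and other event types):

```typescript
// Illustrative sketch only: convert one Gemini streaming chunk into an
// Anthropic-style content_block_delta event. Shapes are simplified.

interface GeminiChunk {
  candidates?: { content?: { parts?: { text?: string }[] } }[];
}

interface AnthropicDelta {
  type: "content_block_delta";
  index: number;
  delta: { type: "text_delta"; text: string };
}

function toAnthropicDelta(chunk: GeminiChunk): AnthropicDelta | null {
  const text = chunk.candidates?.[0]?.content?.parts?.[0]?.text;
  if (!text) return null; // nothing to forward for this chunk
  return {
    type: "content_block_delta",
    index: 0,
    delta: { type: "text_delta", text },
  };
}
```

Each translated event would then be written to the client as a server-sent event, which is why streaming in Claude Code feels the same as with Anthropic's API.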
📦 Installation
npm install -g claude-code-gemini-proxy
Or install locally:
git clone https://github.com/stormdaemon/claude-code-gemini-proxy.git
cd claude-code-gemini-proxy
npm install
npm run build
npm link
🚀 Quick Start
1. Setup (One-time)
Run the interactive setup wizard:
gemini-proxy setup
You'll be prompted to:
- ✅ Choose a Gemini model (Flash / Pro / Pro Preview)
- ✅ Select authentication method
- ✅ Enter your GCP Project ID
- ✅ Choose a region
- ✅ Test the connection
2. Start the Proxy
gemini-proxy start
The proxy will start on http://localhost:8080 (or your configured port).
3. Configure Claude Code
In your terminal, set these environment variables:
export ANTHROPIC_BASE_URL=http://localhost:8080
export ANTHROPIC_API_KEY=dummy-key
Add them to your ~/.bashrc or ~/.zshrc to make them permanent:
echo 'export ANTHROPIC_BASE_URL=http://localhost:8080' >> ~/.zshrc
echo 'export ANTHROPIC_API_KEY=dummy-key' >> ~/.zshrc
source ~/.zshrc
4. Use Claude Code!
Now just use Claude Code normally - it will use Gemini instead:
claude "Explain this function"
🎮 CLI Commands
| Command | Description |
|---------|-------------|
| gemini-proxy setup | Interactive setup wizard |
| gemini-proxy start | Start the proxy server |
| gemini-proxy status | Show current configuration |
| gemini-proxy test | Test connection to Vertex AI |
| gemini-proxy reset | Clear all configuration |
🔐 Authentication
Option 1: Application Default Credentials (Easiest)
gcloud auth application-default login
Then choose "Application Default Credentials" in the setup.
Option 2: Service Account
- Create a service account in GCP Console
- Download the JSON key file
- Choose "Service Account JSON file" in the setup
- Provide the path to your key file
Option 3: gcloud CLI
If you're already authenticated with gcloud, just choose "gcloud" in the setup.
🤖 Available Models
| Model | Description | Context Window | Best For |
|-------|-------------|----------------|----------|
| gemini-2.5-flash | Best price-performance | 1M tokens | Large-scale processing, low-latency tasks |
| gemini-2.5-pro | Advanced thinking model | 2M tokens | Complex reasoning, large datasets |
| gemini-3-pro-preview | Most intelligent (preview) | 2M tokens | Multimodal understanding, agentic tasks |
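Whichever model you pick is interpolated into the Vertex AI endpoint path alongside your project ID and region. A minimal sketch, assuming the standard Vertex AI REST path for generateContent (the `buildEndpoint` helper and `ProxyConfig` shape are illustrative, not this package's actual code):

```typescript
// Sketch: build the Vertex AI generateContent URL from the stored config.
// The path format follows Vertex AI's publisher-model REST convention.

interface ProxyConfig {
  projectId: string;
  location: string;
  model: string;
}

function buildEndpoint(cfg: ProxyConfig): string {
  return (
    `https://${cfg.location}-aiplatform.googleapis.com/v1` +
    `/projects/${cfg.projectId}/locations/${cfg.location}` +
    `/publishers/google/models/${cfg.model}:generateContent`
  );
}
```

Switching models is therefore just a config change; no code changes are needed.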
🛠️ Configuration
Configuration is stored in ~/.config/gemini-proxy/config.json
Example configuration:
{
"projectId": "my-gcp-project",
"location": "us-central1",
"model": "gemini-2.5-flash",
"authMethod": "adc",
"port": 8080
}
🌍 Supported Regions
- us-central1 (Iowa)
- us-east4 (Virginia)
- europe-west1 (Belgium)
- europe-west4 (Netherlands)
- asia-southeast1 (Singapore)
🔧 Advanced Usage
Running as a Background Service
Linux (systemd)
Create /etc/systemd/system/gemini-proxy.service:
[Unit]
Description=Gemini Proxy Server
After=network.target
[Service]
Type=simple
User=yourusername
ExecStart=/usr/local/bin/gemini-proxy start
Restart=always
[Install]
WantedBy=multi-user.target
Enable and start:
sudo systemctl enable gemini-proxy
sudo systemctl start gemini-proxy
Mac (launchd)
Create ~/Library/LaunchAgents/com.gemini-proxy.plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>com.gemini-proxy</string>
<key>ProgramArguments</key>
<array>
<string>/usr/local/bin/gemini-proxy</string>
<string>start</string>
</array>
<key>RunAtLoad</key>
<true/>
<key>KeepAlive</key>
<true/>
</dict>
</plist>
Load:
launchctl load ~/Library/LaunchAgents/com.gemini-proxy.plist
Custom Port
# During setup, choose a different port
# Or edit ~/.config/gemini-proxy/config.json directly
Change Model
gemini-proxy reset
gemini-proxy setup
# Choose a different model
🐛 Troubleshooting
Authentication Errors
# Test your auth
gemini-proxy test
# Re-authenticate with gcloud
gcloud auth application-default login
# Verify your service account has permissions
gcloud projects get-iam-policy YOUR_PROJECT_ID
Port Already in Use
# Check what's using the port
lsof -i :8080
# Or change the port during setup
gemini-proxy reset
gemini-proxy setup
Claude Code Not Using Proxy
Make sure environment variables are set:
echo $ANTHROPIC_BASE_URL # Should be http://localhost:8080
echo $ANTHROPIC_API_KEY # Should be dummy-key or anything
# Re-export them
export ANTHROPIC_BASE_URL=http://localhost:8080
export ANTHROPIC_API_KEY=dummy-key
Vertex AI API Not Enabled
# Enable the Vertex AI API
gcloud services enable aiplatform.googleapis.com --project=YOUR_PROJECT_ID
📚 How It Works
┌─────────────────┐
│ Claude Code │ Thinks it's calling Anthropic API
└────────┬────────┘
│ POST /v1/messages
↓
┌─────────────────────┐
│ Gemini Proxy │ Translates request format
│ (localhost:8080) │ Routes to Vertex AI
└────────┬────────────┘
│ POST /v1/projects/.../models/gemini-...:generateContent
↓
┌─────────────────────┐
│ Vertex AI Gemini │ Processes with Gemini
│ (GCP) │ Returns response
└─────────────────────┘
The proxy:
- Accepts Anthropic API format requests
- Translates them to Gemini API format
- Calls Vertex AI
- Translates responses back to Anthropic format
- Returns to Claude Code
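The translation step above can be sketched roughly as follows. This is a simplified illustration, not the package's actual code — `toGeminiRequest` and both interfaces are assumptions, and the real proxy must also handle system prompts, tool use, images, and stop sequences:

```typescript
// Sketch: map a minimal Anthropic /v1/messages body to a Gemini
// generateContent body. Field coverage is deliberately incomplete.

interface AnthropicRequest {
  model: string;
  max_tokens: number;
  messages: { role: "user" | "assistant"; content: string }[];
}

interface GeminiRequest {
  contents: { role: "user" | "model"; parts: { text: string }[] }[];
  generationConfig: { maxOutputTokens: number };
}

function toGeminiRequest(req: AnthropicRequest): GeminiRequest {
  return {
    contents: req.messages.map((m) => ({
      // Gemini uses the role "model" where Anthropic uses "assistant"
      role: m.role === "assistant" ? "model" : "user",
      parts: [{ text: m.content }],
    })),
    generationConfig: { maxOutputTokens: req.max_tokens },
  };
}
```

The response path is the mirror image: Gemini candidates are wrapped back into Anthropic-style content blocks before being returned to Claude Code.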
🔒 Security
- Service account keys are stored locally only
- No data is logged or transmitted except to Vertex AI
- All authentication uses official Google libraries
- HTTPS is used for all Vertex AI communication
🤝 Contributing
Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
📝 License
MIT License - see LICENSE file for details
🙏 Credits
Built with:
- Fastify - Web server
- Google Cloud AI Platform - Gemini API
- Inquirer - Interactive CLI
- Chalk - Terminal styling
🆘 Support
⭐ Star History
If this project helps you, consider giving it a star! ⭐
Made with ❤️ by Droid AI
🆕 Changelog (v2.0.0 - December 2025)
Major Updates
- ✅ Node.js 22+ requirement (ESM modules)
- ✅ TypeScript 5.7 with latest features
- ✅ Fastify 5.2 latest web server
- ✅ Chalk 5.6 & Inquirer 13 (ESM)
- ✅ Google Cloud AI Platform 5.13 latest SDK
- ✅ Updated to current Gemini models (2.5 Flash, 2.5 Pro, 3 Pro Preview)
Breaking Changes
- Requires Node.js 22+ (previously 18+)
- All imports now use ESM format
- Some dependency API changes (handled internally)
