
@aistack/claude-code-proxy v0.1.2

A high-performance proxy server that translates Anthropic Claude API requests to LiteLLM format for multi-provider LLM access

Downloads: 34

@aistack/claude-code-proxy

A high-performance proxy server that translates Anthropic Claude API requests to LiteLLM format, enabling seamless integration with multiple LLM providers including OpenAI, Google Gemini, OpenRouter, and Anthropic.

Features

  • 🚀 Multi-Provider Support: Seamlessly switch between OpenAI, Gemini, OpenRouter, and Anthropic models
  • 🔄 Automatic Format Translation: Converts Anthropic API format to LiteLLM format and back
  • 🎯 Smart Model Mapping: Intelligently maps Claude models to equivalent models from other providers
  • 📡 Streaming Support: Full support for Server-Sent Events (SSE) streaming responses
  • 🛡️ Production Ready: Comprehensive error handling, logging, and graceful shutdown
  • ⚡ High Performance: Built with Go for optimal performance and low resource usage
  • 🔧 Configurable: Flexible configuration via environment variables

Installation

Install globally using npm:

npm install -g @aistack/claude-code-proxy

Quick Start

  1. Initialize configuration:
ccproxy init

This will prompt you to create a configuration file at either:

  • User level: $HOME/.ccproxy/settings.json
  • Project level: ./.ccproxy/settings.json
  2. Edit the configuration file:
{
  "server": {
    "port": 8082,
    "host": "0.0.0.0"
  },
  "providers": {
    "openai": {
      "apiKey": "sk-...",
      "baseURL": "https://api.openai.com/v1",
      "responseStyle": "openai"
    },
    "anthropic": {
      "apiKey": "sk-ant-...",
      "baseURL": "https://api.anthropic.com",
      "responseStyle": "anthropic"
    },
    "gemini": {
      "apiKey": "...",
      "baseURL": "https://generativelanguage.googleapis.com",
      "responseStyle": "openai"
    },
    "openrouter": {
      "apiKey": "sk-or-v1-...",
      "baseURL": "https://openrouter.ai/api/v1",
      "responseStyle": "openai"
    }
  },
  "models": {
    "bigModel": "gpt-4@openai",
    "smallModel": "gpt-4-mini@openai"
  },
  "logging": {
    "level": "info",
    "format": "json"
  }
}
  3. Start the proxy:
# Start proxy server and launch claude CLI (default behavior)
ccproxy

# Or start only the proxy server
ccproxy server

The server will start on http://localhost:8082 (or your configured port).
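Once the server is running, any client that speaks the Anthropic Messages API can point at it. The sketch below builds such a request in Python; it is illustrative only (the payload fields follow the public Anthropic Messages format, and the endpoint path and port are assumptions based on the configuration above):

```python
import json
import urllib.request

# A standard Anthropic Messages API request body. The proxy translates
# this to the configured provider behind the scenes.
payload = {
    "model": "claude-3-5-sonnet-20241022",  # remapped per your model config
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Hello from the proxy!"}
    ],
}

req = urllib.request.Request(
    "http://localhost:8082/v1/messages",  # assumed endpoint on the proxy
    data=json.dumps(payload).encode("utf-8"),
    headers={"content-type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once the proxy is up.
```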

Commands

Available Commands

  • ccproxy (default) - Start proxy server (if needed) and launch claude CLI
  • ccproxy server - Start only the proxy server
  • ccproxy init - Initialize configuration file interactively
    • --scope user - Create user-level configuration
    • --scope project - Create project-level configuration
  • ccproxy help - Show help message
  • ccproxy version - Show version information

Global Options

  • --debug - Enable debug logging (applies to all commands)
  • --port - Override server port (server command only)

Examples

# Start server and launch claude CLI (default behavior)
ccproxy

# Start with debug logging enabled
ccproxy --debug

# Start only the proxy server
ccproxy server

# Start server on custom port
ccproxy server --port 8080

# Start server with debug logging
ccproxy server --debug

# Initialize configuration interactively
ccproxy init

# Create user-level configuration
ccproxy init --scope user

# Create project-level configuration
ccproxy init --scope project

Usage

Configuration

Configuration Files

The proxy loads configuration from multiple sources in priority order:

  1. Command Line Arguments (highest priority)
    • --port: Override server port
  2. JSON Configuration Files:
    • /etc/ccproxy/managed-settings.json (managed/system-wide)
    • $CWD/.ccproxy/settings.json (project-level)
    • $HOME/.ccproxy/settings.json (user-level)
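One way to picture the priority order above is a last-writer-wins deep merge, applied from lowest to highest priority. This Python sketch is purely illustrative (the proxy's actual merge logic lives in its Go source):

```python
# Illustrative merge: sources are applied lowest-priority first, so a
# later source overrides earlier ones; nested sections deep-merge.
def merge_config(*sources: dict) -> dict:
    merged: dict = {}
    for source in sources:  # lowest priority first
        for key, value in source.items():
            if isinstance(value, dict) and isinstance(merged.get(key), dict):
                merged[key] = merge_config(merged[key], value)
            else:
                merged[key] = value
    return merged

user_level = {"server": {"port": 8082, "host": "0.0.0.0"}}
project_level = {"server": {"port": 9000}}
cli_args = {"server": {"port": 8080}}  # --port wins over both files

config = merge_config(user_level, project_level, cli_args)
# config["server"] == {"port": 8080, "host": "0.0.0.0"}
```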

Model Format

Models are specified in the format modelname@provider:

  • gpt-4@openai
  • claude-3-opus@anthropic
  • gemini-pro@gemini
  • gpt-4@openrouter
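The modelname@provider convention can be split on the last "@", since model names themselves may contain dashes but not "@". A hypothetical parser sketch (not the proxy's actual Go implementation):

```python
# Illustrative parser for the "modelname@provider" convention.
def parse_model_ref(ref: str) -> tuple[str, str]:
    model, sep, provider = ref.rpartition("@")
    if not sep or not model or not provider:
        raise ValueError(f"expected 'modelname@provider', got {ref!r}")
    return model, provider

print(parse_model_ref("claude-3-opus@anthropic"))  # ('claude-3-opus', 'anthropic')
```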

Response Styles

Providers can have different response styles:

  • "responseStyle": "anthropic" - Native Anthropic format (no transformation)
  • "responseStyle": "openai" - OpenAI format (requires transformation)

This allows the proxy to skip unnecessary transformations for providers that natively support Anthropic's API format.
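Conceptually, the decision reduces to a single check per provider. A minimal sketch of that idea (illustrative only; the default of "openai" is taken from the configuration table below):

```python
# Only providers whose responseStyle is "openai" need their responses
# translated back into Anthropic format; "anthropic" passes through.
def needs_transformation(provider_config: dict) -> bool:
    return provider_config.get("responseStyle", "openai") == "openai"

assert needs_transformation({"responseStyle": "openai"})
assert not needs_transformation({"responseStyle": "anthropic"})
```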

Model Mapping

The proxy automatically maps Claude models to configured models:

| Claude Model      | Mapped To    | Description                        |
|-------------------|--------------|------------------------------------|
| claude-*-haiku-*  | smallModel   | Maps to the configured small model |
| claude-*-sonnet-* | bigModel     | Maps to the configured big model   |
| Other models      | As specified | Uses the model@provider format     |
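The mapping rules above can be sketched as a simple substring match (a hypothetical illustration of the behavior, not the proxy's actual Go code; the model values mirror the example configuration):

```python
# Haiku-class requests route to smallModel, sonnet-class to bigModel;
# anything else is assumed to already be in "model@provider" form.
MODELS = {"bigModel": "gpt-4@openai", "smallModel": "gpt-4-mini@openai"}

def map_model(requested: str) -> str:
    if "haiku" in requested:
        return MODELS["smallModel"]
    if "sonnet" in requested:
        return MODELS["bigModel"]
    return requested

print(map_model("claude-3-5-haiku-20241022"))  # gpt-4-mini@openai
```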

Configuration Options

JSON Configuration

| Field                          | Description              | Default           |
|--------------------------------|--------------------------|-------------------|
| server.port                    | Server port              | 8082              |
| server.host                    | Server host              | 0.0.0.0           |
| providers.<name>.apiKey        | Provider API key         | -                 |
| providers.<name>.baseURL       | Provider base URL        | -                 |
| providers.<name>.responseStyle | Response format style    | openai            |
| models.bigModel                | Model for large requests | gpt-4@openai      |
| models.smallModel              | Model for small requests | gpt-4-mini@openai |
| logging.level                  | Log level                | info              |
| logging.format                 | Log format               | json              |

Default Behavior

When you run ccproxy without arguments, it will:

  1. Check if the proxy server is already running
  2. Start the server if it's not running
  3. Launch the claude CLI with ANTHROPIC_BASE_URL set to the proxy

This means you can use ccproxy as a drop-in replacement for the claude command while automatically getting multi-provider support.

Adding Custom Providers

You can add custom providers with native Anthropic support:

{
  "providers": {
    "custom-llm": {
      "baseURL": "https://your-llm-api.com",
      "apiKey": "your-api-key",
      "responseStyle": "anthropic"
    }
  }
}

Then use it with: custom-model@custom-llm

Platform Support

The CLI is available for the following platforms:

  • macOS (Intel & Apple Silicon)
  • Linux (x64 & ARM64)
  • Windows (x64)

Troubleshooting

Binary Not Found

If you see a "Binary not found" error, your platform is not supported. Please open an issue with your platform details.

Permission Denied

On Unix systems, if you get a permission error:

sudo npm install -g @aistack/claude-code-proxy

API Key Issues

Ensure your API keys are correctly set in your settings.json file and that they have the necessary permissions.

Development

For development or contributing, visit the GitHub repository.

License

MIT License - see the LICENSE file for details.

Changelog

See CHANGELOG.md for release history.