skalpel v2.0.24
Skalpel AI SDK — optimize your OpenAI and Anthropic API calls
# Skalpel
Optimize Claude Code prompts to reduce credit usage. Skalpel sits between Claude Code and the Anthropic API, optimizing prompts before they reach the provider — saving you money without changing how you work.
## How It Works
```
Claude Code --> Skalpel Proxy (local) --> Skalpel Backend (AWS) --> Anthropic API
                Intercepts requests       Optimizes prompts        Returns response
                Adds tracking headers     Reduces token usage
```

- A local proxy runs on your machine (port 18100)
- Claude Code is configured to send API requests to the proxy instead of api.anthropic.com
- The proxy forwards requests to the Skalpel backend, which optimizes prompts for lower token usage
- The optimized request is sent to Anthropic, and the response streams back to Claude Code
Your Anthropic API key stays in the request chain — Skalpel never stores it.
## Quick Start
One command to install, detect Claude Code, and start optimizing:

```bash
npx skalpel --api-key sk-skalpel-YOUR_KEY --auto
```

This will:

- Start the local proxy on ports 18100 (Anthropic) and 18101 (OpenAI)
- Detect coding agents on your machine
- Configure Claude Code and Codex agents automatically
- Set shell environment variables in your `.bashrc`/`.zshrc`
- Begin optimizing API traffic immediately
## Interactive Setup
For a guided setup with prompts:

```bash
npx skalpel
```

The wizard walks you through API key entry, agent detection, and proxy configuration.
## Manual Setup
```bash
# 1. Start the proxy
npx skalpel start

# 2. Run the setup wizard to configure Claude Code
npx skalpel setup

# 3. Verify everything is working
npx skalpel doctor
```

## What Gets Configured
### Claude Code
Claude Code is auto-configured during setup. `ANTHROPIC_BASE_URL` is set in `~/.claude/settings.json` to route API calls through the proxy. Only API endpoints (`/v1/messages`, `/v1/complete`) are routed through Skalpel for optimization; auth endpoints pass directly to Anthropic.
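The routing rule can be sketched as a small predicate. This is illustrative only (the function name and structure are not Skalpel internals); the path list mirrors the documented behavior.

```typescript
// Sketch of the routing decision: only the API endpoints /v1/messages and
// /v1/complete go through the Skalpel backend for optimization; everything
// else (e.g. auth endpoints) passes straight to Anthropic.
const OPTIMIZED_PATHS = ['/v1/messages', '/v1/complete'];

function routesThroughSkalpel(pathname: string): boolean {
  return OPTIMIZED_PATHS.some(
    (p) => pathname === p || pathname.startsWith(p + '/'),
  );
}
```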
### Shell Environment (`~/.bashrc`, `~/.zshrc`)
Environment variables are added for tools that read them directly:
```bash
# BEGIN SKALPEL PROXY - do not edit manually
export ANTHROPIC_BASE_URL="http://localhost:18100"
export OPENAI_BASE_URL="http://localhost:18101"
# END SKALPEL PROXY
```

### Proxy Config (`~/.skalpel/config.json`)
```json
{
  "apiKey": "sk-skalpel-YOUR_KEY",
  "remoteBaseUrl": "http://skalpel-production-554359744.us-west-2.elb.amazonaws.com",
  "anthropicPort": 18100,
  "openaiPort": 18101
}
```

## CLI Commands
| Command | Description |
|---|---|
| `npx skalpel` | Run the setup wizard |
| `npx skalpel start` | Start the proxy |
| `npx skalpel stop` | Stop the proxy |
| `npx skalpel status` | Show proxy status and uptime |
| `npx skalpel doctor` | Verify configuration and connectivity |
| `npx skalpel logs` | View proxy logs |
| `npx skalpel logs -f` | Follow proxy logs in real time |
| `npx skalpel uninstall` | Remove proxy, configs, and shell modifications |
## Flags
| Flag | Description |
|---|---|
| `--api-key <key>` | Provide Skalpel API key (skips interactive prompt) |
| `--auto` | Non-interactive mode (requires `--api-key`) |
## Streaming Support
Skalpel fully supports Anthropic's SSE streaming protocol. When Claude Code sends a streaming request (`"stream": true`), the proxy:

- Detects the streaming flag in the request body
- Forwards to the backend with all original headers
- Pipes SSE events (`message_start`, `content_block_delta`, `message_stop`) directly back to Claude Code
- Preserves upstream headers like `anthropic-request-id`
No buffering, no modification of the stream — chunks flow through in real time.
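For context on what flows through, here is a minimal sketch of splitting an SSE byte stream into events on the consumer side. The proxy itself does not parse the stream; this parser is illustrative, showing the frame shape the events above arrive in.

```typescript
// Minimal SSE frame splitter (consumer-side sketch; the proxy pipes chunks
// through unmodified). Frames are separated by a blank line, and each frame
// carries "event:" and "data:" fields.
function parseSSE(raw: string): { event: string; data: string }[] {
  return raw
    .split('\n\n')
    .filter((frame) => frame.trim().length > 0)
    .map((frame) => {
      let event = 'message';
      const dataLines: string[] = [];
      for (const line of frame.split('\n')) {
        if (line.startsWith('event:')) event = line.slice(6).trim();
        else if (line.startsWith('data:')) dataLines.push(line.slice(5).trim());
      }
      return { event, data: dataLines.join('\n') };
    });
}
```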
## Headers
The proxy adds these headers to every request forwarded to the Skalpel backend:
| Header | Value | Purpose |
|---|---|---|
| `X-Skalpel-API-Key` | Your Skalpel key | Backend authentication |
| `X-Skalpel-Source` | `claude-code` | Traffic source identification |
| `X-Skalpel-Agent-Type` | `claude-code` | Agent type for optimization routing |
| `X-Skalpel-SDK-Version` | `proxy-1.0.0` | SDK version tracking |
The original `x-api-key` header (your Anthropic key) is forwarded as-is.
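The header-merging step can be sketched as follows. The function name is hypothetical; the header names and values mirror the table above, including forwarding the original `x-api-key` untouched.

```typescript
// Sketch: merge the Skalpel tracking headers onto an outgoing request
// while leaving the original headers (including x-api-key) intact.
function withSkalpelHeaders(
  original: Record<string, string>,
  skalpelKey: string,
): Record<string, string> {
  return {
    ...original, // keeps x-api-key (your Anthropic key) as-is
    'X-Skalpel-API-Key': skalpelKey,
    'X-Skalpel-Source': 'claude-code',
    'X-Skalpel-Agent-Type': 'claude-code',
    'X-Skalpel-SDK-Version': 'proxy-1.0.0',
  };
}
```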
## Codex Support
Skalpel also supports OpenAI Codex on port 18101. The same `--auto` flow detects and configures both agents if present.
## SDK Integration
For programmatic use in your own applications:
```typescript
import { createSkalpelClient } from 'skalpel';
import Anthropic from '@anthropic-ai/sdk';

const client = createSkalpelClient(new Anthropic(), {
  apiKey: process.env.SKALPEL_API_KEY!,
});

const response = await client.messages.create({
  model: 'claude-sonnet-4-20250514',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello' }],
});
```

See the API Reference for all SDK options.
## API Reference
### `createSkalpelClient<T>(client: T, options: SkalpelClientOptions): T`
Wraps an existing OpenAI or Anthropic client. All API calls route through the Skalpel proxy with automatic fallback on errors.
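The retry-then-fallback behavior can be illustrated with a generic wrapper. This is a sketch under assumed semantics (try the proxied call, retry on error, then fall back to the direct provider call), not the SDK's actual implementation; all names here are illustrative.

```typescript
// Generic sketch of the documented fallback behavior: attempt the proxied
// call up to `retries` extra times, then fall back to the direct call if
// `fallbackOnError` is set, otherwise rethrow the last error.
function callWithFallback<T>(
  proxied: () => T,
  direct: () => T,
  retries = 2,
  fallbackOnError = true,
): T {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return proxied();
    } catch (err) {
      lastError = err;
    }
  }
  if (fallbackOnError) return direct();
  throw lastError;
}
```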
### `createSkalpelAnthropic(options): Promise<Anthropic>`
Creates a pre-configured Anthropic client pointing at the Skalpel proxy.
### `createSkalpelOpenAI(options): Promise<OpenAI>`
Creates a pre-configured OpenAI client pointing at the Skalpel proxy.
### Configuration Options
```typescript
interface SkalpelClientOptions {
  apiKey: string;            // Skalpel API key (sk-skalpel-*)
  workspace?: string;        // Workspace ID
  baseURL?: string;          // Proxy URL (default: production ALB)
  fallbackOnError?: boolean; // Fall back to direct provider (default: true)
  timeout?: number;          // Request timeout in ms (default: 30000)
  retries?: number;          // Max retries (default: 2)
}
```

## Uninstall
```bash
npx skalpel uninstall
```

This removes the proxy and cleans up shell environment variables. If a previous version of Skalpel modified `~/.claude/settings.json`, uninstall will also clean up those changes.
