pnkd-aiproxy
v2.0.1
Local proxy for LLM APIs. One endpoint for all providers. Logging, cost tracking, load balancing. PRO: 6 providers, auto-fallback, API translation, dashboard.
aiproxy
Local proxy for LLM APIs. One endpoint for all providers.
Installation
npm install -g pnkd-aiproxy
Quick Start
# Configure API keys
aiproxy config set openai-key sk-xxx
aiproxy config set anthropic-key sk-ant-xxx
# Start proxy
aiproxy start --port 8080
# Point your app to the proxy
# OpenAI: http://localhost:8080/v1
# Anthropic: http://localhost:8080/anthropic
Commands
FREE Commands
| Command | Description |
|---------|-------------|
| start | Start proxy server |
| stop | Stop proxy server |
| config <get\|set> [key] | Manage configuration |
| providers | List available providers |
| logs | View request logs |
| stats | Show statistics |
| cost | Show cost breakdown |
| limit <set\|clear> [rate] | Set rate limit |
| export | Export logs |
| license | Manage license |
PRO Commands
| Command | Description |
|---------|-------------|
| keys <add\|remove\|list> | Manage multiple API keys |
| rules | Custom routing rules |
| dashboard | Web monitoring dashboard |
Provider Endpoints
OpenAI: http://localhost:8080/v1
Anthropic: http://localhost:8080/anthropic
Google: http://localhost:8080/google [PRO]
Groq: http://localhost:8080/groq [PRO]
Ollama: http://localhost:8080/ollama [PRO]
Custom: http://localhost:8080/custom [PRO]
Options
Start Options
--port <port> - Proxy port (default: 8080)
Stats/Logs Options
--last <n> - Show last N entries
--today - Filter by today
--month - Filter by this month
--by-model - Group by model
--by-day - Group by day
--json - Output as JSON
FREE vs PRO
| Feature | FREE | PRO |
|---------|------|-----|
| Requests per day | 100 | Unlimited |
| Providers | 2 (OpenAI, Anthropic) | 6 (all) |
| Keys per provider | 1 | Unlimited |
| Load balancing | - | Yes |
| Auto-fallback | - | Yes |
| API translation | - | Yes |
| Web dashboard | - | Yes |
| Custom routing | - | Yes |
PRO Features
Load Balancing
Distribute requests across multiple API keys:
aiproxy keys add openai sk-key1
aiproxy keys add openai sk-key2
aiproxy keys add openai sk-key3
Supported strategies: round-robin, random, least-latency, weighted.
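Round-robin, the simplest of these strategies, can be sketched in a few lines of Python. This is an illustrative model of the behavior, not the proxy's actual internals; the class and key names are hypothetical:

```python
import itertools

class RoundRobinKeys:
    """Cycle through a pool of API keys, handing out one per request."""

    def __init__(self, keys):
        self._cycle = itertools.cycle(keys)

    def next_key(self):
        return next(self._cycle)

pool = RoundRobinKeys(["sk-key1", "sk-key2", "sk-key3"])
picks = [pool.next_key() for _ in range(4)]
# the fourth request wraps back around to the first key
```

The other strategies differ only in how the next key is chosen: `random` samples the pool, `least-latency` tracks a moving average per key, and `weighted` biases selection toward higher-capacity keys.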
Auto-Fallback
Automatically retry with another provider if one fails:
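A minimal sketch of this retry loop in Python (the provider callables below are hypothetical stand-ins for real provider clients, not the proxy's actual code):

```python
def complete_with_fallback(request, providers):
    """Try each (name, callable) provider in order; return the first success.

    A real proxy would only fall back on retryable errors (rate limits,
    timeouts, 5xx), but this sketch treats any exception as a failure.
    """
    errors = []
    for name, call in providers:
        try:
            return call(request)
        except Exception as exc:
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")
```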
OpenAI error → Try Anthropic → Try Groq
API Translation
Use OpenAI format with any provider:
# Send OpenAI format to Anthropic
curl http://localhost:8080/anthropic/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{"model": "claude-4.5-opus", "messages": [{"role": "user", "content": "Hi"}]}'
Web Dashboard
Monitor requests, costs, and errors in real-time:
aiproxy dashboard
# Opens http://localhost:8081
PRO License
$18.99 one-time payment
Purchase at: https://pnkd.dev/aiproxy
# Activate license
aiproxy license activate APX-XXXX-XXXX-XXXX-XXXX
# Check status
aiproxy license status
Example Usage
Python with OpenAI SDK
import openai
client = openai.OpenAI(
api_key="your-key",
base_url="http://localhost:8080/v1"
)
response = client.chat.completions.create(
model="gpt-5-turbo",
messages=[{"role": "user", "content": "Hello"}]
)
JavaScript/TypeScript
import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'your-key',
baseURL: 'http://localhost:8080/v1',
});
const response = await client.chat.completions.create({
model: 'gpt-5-turbo',
messages: [{ role: 'user', content: 'Hello' }],
});
cURL
curl http://localhost:8080/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer sk-xxx" \
-d '{
"model": "gpt-5-turbo",
"messages": [{"role": "user", "content": "Hello"}]
}'
Cost Tracking
Supported models and pricing (per 1M tokens):
| Model | Input | Output |
|-------|-------|--------|
| gpt-5 | $20 | $60 |
| gpt-5-turbo | $5 | $15 |
| gpt-4.5-turbo | $1 | $3 |
| o3 | $10 | $40 |
| claude-5 | $20 | $80 |
| claude-4.5-opus | $15 | $75 |
| claude-4.5-sonnet | $3 | $15 |
| claude-4.5-haiku | $0.25 | $1.25 |
| gemini-2.0-pro | $2.50 | $10 |
| llama-4 | $0.50 | $1.50 |
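Per-request cost follows directly from these per-1M-token rates. A quick sketch of the arithmetic (not the proxy's internal accounting):

```python
def request_cost(input_tokens, output_tokens, input_rate, output_rate):
    """Dollar cost of one request, given per-1M-token rates from the table."""
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# gpt-5-turbo ($5 input / $15 output): 10,000 prompt + 2,000 completion tokens
cost = request_cost(10_000, 2_000, 5, 15)
# → $0.08
```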
Data Storage
All data stored locally in ~/.aiproxy/:
config.json - Configuration
stats.json - Statistics
logs/ - Request logs (JSONL format)
license.json - License info
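Because the logs are plain JSONL (one JSON object per line), they are easy to post-process with standard tools. A sketch in Python — the field names below are hypothetical, so check your own files under ~/.aiproxy/logs/ for the actual schema:

```python
import json

# Hypothetical sample of two log lines in JSONL format.
sample = (
    '{"model": "gpt-5-turbo", "input_tokens": 120, "output_tokens": 40}\n'
    '{"model": "claude-4.5-sonnet", "input_tokens": 300, "output_tokens": 90}\n'
)

# Each non-empty line is an independent JSON document.
records = [json.loads(line) for line in sample.splitlines() if line.strip()]
total_output = sum(r["output_tokens"] for r in records)
# 40 + 90 = 130 output tokens across both requests
```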
More PRO Tools
- ctxstuff PRO - Pack code for LLMs ($14.99)
- llmcache PRO - Cache LLM responses ($18.99)
Support
- Issues: https://github.com/pnkd-dev/aiproxy/issues
- Website: https://pnkd.dev/aiproxy
License
MIT - pnkd.dev
