@khanglvm/llm-router
v2.3.1
LLM Router: single gateway endpoint for multi-provider LLMs with unified OpenAI+Anthropic format and seamless fallback
LLM Router
A unified LLM gateway that routes requests across multiple providers through a single endpoint. Supports both OpenAI and Anthropic-compatible formats. Manage everything via Web UI or CLI — optimized for AI agents.

Install
npm i -g @khanglvm/llm-router@latest
Quick Start
llr # open Web UI
llr start # start the local gateway
llr ai-help # agent-oriented setup brief
- Open the Web UI and add a provider (API key or OAuth login)
- Create model aliases with routing strategy
- Start the gateway and point your tools at the local endpoint
What You Can Do
- Add & manage providers — connect any OpenAI/Anthropic-compatible API endpoint, test connectivity, auto-discover models
- Unified endpoint — one local gateway that accepts both OpenAI and Anthropic request formats
- Model aliases with routing — group models into stable alias names with weighted round-robin, quota-aware balancing, and automatic fallback
- Rate limiting — set request caps per model or across all models over configurable time windows
- Coding tool routing — one-click routing config for Codex CLI, Claude Code, Factory Droid, and AMP
- Web search — built-in web search for AMP and other router-managed tools
- Deployable — run locally or deploy to Cloudflare Workers
- AI-agent friendly — full CLI parity with llr config --operation=... so agents can configure everything programmatically
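The rate-limiting feature above (request caps per model over configurable time windows) can be pictured with a small sliding-window counter. This is an illustrative sketch of the concept, not LLM Router's actual implementation:

```javascript
// Sliding-window rate limiter sketch: allow at most `cap` requests
// per model within the last `windowMs` milliseconds.
class SlidingWindowLimiter {
  constructor(cap, windowMs) {
    this.cap = cap;
    this.windowMs = windowMs;
    this.hits = new Map(); // model -> timestamps of recent requests
  }

  allow(model, now = Date.now()) {
    // Drop timestamps that have aged out of the window.
    const recent = (this.hits.get(model) || []).filter(
      (t) => now - t < this.windowMs
    );
    if (recent.length >= this.cap) {
      this.hits.set(model, recent);
      return false; // over the cap for this window
    }
    recent.push(now);
    this.hits.set(model, recent);
    return true;
  }
}

const limiter = new SlidingWindowLimiter(2, 60_000); // 2 requests per minute
console.log(limiter.allow("gpt-alias", 0));      // true
console.log(limiter.allow("gpt-alias", 1_000));  // true
console.log(limiter.allow("gpt-alias", 2_000));  // false (cap reached)
console.log(limiter.allow("gpt-alias", 61_000)); // true (window slid past t=0)
```

A cap "across all models" is the same idea with a single shared key instead of one per model.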
Web UI
Alias & Fallback
Create stable route names across multiple providers with balancing and failover.
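The balancing and failover described here can be sketched as weighted round-robin over an alias's targets, falling through to the next target when one is unavailable. The provider names and weights below are made up for illustration; the real router's strategy is configured per alias:

```javascript
// Weighted round-robin with fallback sketch (illustrative, not the
// router's actual code). An alias maps to weighted targets; picks cycle
// through a weight-proportional sequence, and an unhealthy target falls
// through to the next distinct one.
function makeAliasRouter(targets /* [{ name, weight }] */) {
  // Expand by weight: weights 2 and 1 -> [a, a, b]
  const sequence = targets.flatMap((t) => Array(t.weight).fill(t.name));
  let i = 0;
  return {
    pick() {
      const name = sequence[i % sequence.length];
      i += 1;
      return name;
    },
    // Try each distinct target in sequence order until one is healthy.
    pickWithFallback(isHealthy) {
      for (const name of [...new Set(sequence)]) {
        if (isHealthy(name)) return name;
      }
      return null; // every target is down
    },
  };
}

const router = makeAliasRouter([
  { name: "openai-primary", weight: 2 },
  { name: "anthropic-backup", weight: 1 },
]);
console.log(router.pick()); // "openai-primary"
console.log(router.pick()); // "openai-primary"
console.log(router.pick()); // "anthropic-backup"
console.log(router.pickWithFallback((n) => n !== "openai-primary"));
// "anthropic-backup"
```

Quota-aware balancing is the same loop with the health check replaced by a remaining-quota check.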

AMP (Beta)
Route AMP-compatible requests through LLM Router with custom model mapping.

Codex CLI
Route Codex CLI requests through the gateway with model override and thinking level.

Claude Code
Route Claude Code through the gateway with per-tier model bindings.

Factory Droid
Route Factory Droid through the gateway via a managed custom model entry with reasoning effort control.
Web Search
Configure search providers for AMP and other router-managed tools.

AMP (Beta)
AMP support is in beta. Features and API surface may change.
LLM Router can front AMP-compatible routes locally and proxy unresolved traffic upstream. Configure via the Web UI or CLI:
llr config --operation=set-amp-client-routing --enabled=true --amp-client-settings-scope=workspace
Subscription Providers
OAuth-backed subscription login is supported for ChatGPT.
Note: ChatGPT subscriptions are separate from the OpenAI API and intended for use within OpenAI's own apps. Using them here may violate OpenAI's terms of service.
