
@khanglvm/llm-router

v2.3.1

LLM Router: single gateway endpoint for multi-provider LLMs with unified OpenAI+Anthropic format and seamless fallback

Readme

LLM Router

A unified LLM gateway that routes requests across multiple providers through a single endpoint. Supports both OpenAI- and Anthropic-compatible request formats. Manage everything via the Web UI or CLI — optimized for AI agents.

LLM Router Web Console

Install

npm i -g @khanglvm/llm-router@latest

Quick Start

llr          # open Web UI
llr start    # start the local gateway
llr ai-help  # agent-oriented setup brief
  1. Open the Web UI and add a provider (API key or OAuth login)
  2. Create model aliases with routing strategy
  3. Start the gateway and point your tools at the local endpoint
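Step 3 amounts to pointing any OpenAI-compatible client at the gateway's local address. A minimal sketch in Python, using only the standard library — the base URL and port here are hypothetical, so substitute whatever address `llr start` reports:

```python
import json
import urllib.request

# Hypothetical local endpoint -- use the address printed by `llr start`.
BASE_URL = "http://localhost:3000"

def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build an OpenAI-format chat completion request aimed at the gateway."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("my-alias", [{"role": "user", "content": "Hello"}])
# urllib.request.urlopen(req)  # send once the gateway is running
print(req.full_url)
```

Because the gateway speaks the OpenAI wire format, existing SDKs work unchanged by overriding their base URL.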

What You Can Do

  • Add & manage providers — connect any OpenAI/Anthropic-compatible API endpoint, test connectivity, auto-discover models
  • Unified endpoint — one local gateway that accepts both OpenAI and Anthropic request formats
  • Model aliases with routing — group models into stable alias names with weighted round-robin, quota-aware balancing, and automatic fallback
  • Rate limiting — set request caps per model or across all models over configurable time windows
  • Coding tool routing — one-click routing config for Codex CLI, Claude Code, Factory Droid, and AMP
  • Web search — built-in web search for AMP and other router-managed tools
  • Deployable — run locally or deploy to Cloudflare Workers
  • AI-agent friendly — full CLI parity with llr config --operation=... so agents can configure everything programmatically
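The alias-routing idea above can be pictured with a small sketch — not the router's actual implementation, just an illustration of weighted round-robin with health-based fallback, where unhealthy models are skipped so traffic automatically shifts to the remaining ones:

```python
import itertools

class Alias:
    """Toy weighted round-robin router with health-based fallback."""

    def __init__(self, weighted_models: dict[str, int]):
        # Expand weights into a repeating order, e.g. {"a": 2, "b": 1} -> a, a, b, a, a, b, ...
        order = [m for m, w in weighted_models.items() for _ in range(w)]
        self._cycle = itertools.cycle(order)
        self._size = len(order)
        self.healthy = set(weighted_models)

    def pick(self) -> str:
        # Advance the cycle, skipping models marked unhealthy (fallback).
        for _ in range(self._size):
            model = next(self._cycle)
            if model in self.healthy:
                return model
        raise RuntimeError("no healthy model for alias")

alias = Alias({"provider-a/gpt": 2, "provider-b/claude": 1})
picks = [alias.pick() for _ in range(3)]   # 2:1 split across providers
alias.healthy.discard("provider-a/gpt")    # simulate a provider outage
fallback = alias.pick()                    # traffic falls back to provider-b
```

The real router adds quota awareness on top of this, but the core shape is the same: a stable alias name in front, interchangeable provider models behind it.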

Web UI

Alias & Fallback

Create stable route names across multiple providers with balancing and failover.

AMP (Beta)

Route AMP-compatible requests through LLM Router with custom model mapping.

Codex CLI

Route Codex CLI requests through the gateway with model override and thinking level.

Claude Code

Route Claude Code through the gateway with per-tier model bindings.

Factory Droid

Route Factory Droid through the gateway via a managed custom model entry with reasoning effort control.

Web Search

Configure search providers for AMP and other router-managed tools.

AMP (Beta)

AMP support is in beta. Features and API surface may change.

LLM Router can front AMP-compatible routes locally and proxy unresolved traffic upstream. Configure via the Web UI or CLI:

llr config --operation=set-amp-client-routing --enabled=true --amp-client-settings-scope=workspace
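The front-locally-or-proxy-upstream behavior can be sketched roughly as follows. Names, the route mapping, and the upstream URL are all illustrative, not the router's actual code:

```python
# Illustrative sketch: requests whose model has a local mapping are
# resolved by the gateway; everything else is proxied upstream unchanged.
LOCAL_ROUTES = {"amp-smart": "provider-a/gpt"}   # hypothetical mapping
UPSTREAM = "https://upstream.example/api"        # hypothetical upstream

def resolve(model: str) -> tuple[str, str]:
    """Return (target, model): a locally routed provider model, or upstream."""
    if model in LOCAL_ROUTES:
        return ("local", LOCAL_ROUTES[model])
    return ("upstream", model)

routed = resolve("amp-smart")          # handled by the local gateway
passthrough = resolve("other-model")   # forwarded to the upstream as-is
```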

Subscription Providers

OAuth-backed subscription login is supported for ChatGPT.

Note: ChatGPT subscriptions are separate from the OpenAI API and intended for use within OpenAI's own apps. Using them here may violate OpenAI's terms of service.
