
openrouter-mcp-server

v1.2.2


MCP server providing access to OpenRouter's unified API for 500+ AI models


⚡ OpenRouter MCP

Every AI model. One terminal. Zero context switching.

Getting Started · Features · Tools · Development

The Problem

You're in your AI coding assistant. You need a quick GPT-4 opinion. Or a Flux-generated image. Or a side-by-side comparison across three models. That means: leave your editor, open a browser, find the right API, copy-paste keys, lose your flow...

The Fix

Add it to your MCP config and go:

{
  "openrouter": {
    "command": "npx",
    "args": ["-y", "openrouter-mcp-server"],
    "env": {
      "OPENROUTER_API_KEY": "sk-or-v1-your-key-here"
    }
  }
}

Now every model on OpenRouter is one tool call away. Chat, image gen, model search, cost tracking — all inline.


⚡ Getting Started

Step 1 — Get an API key from openrouter.ai/keys

Step 2 — Add to your MCP config (~/.claude/settings.json or your app's MCP settings):

{
  "mcpServers": {
    "openrouter": {
      "command": "npx",
      "args": ["-y", "openrouter-mcp-server"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-your-key-here"
      }
    }
  }
}

Done. The server starts automatically when your MCP client connects.

Alternatively, to run from source:

git clone https://github.com/overtimepog/OpenrouterMCP.git
cd OpenrouterMCP
bash scripts/setup.sh

🎯 Features


📡 MCP Tools

Chat

| Parameter | Type | Required | Description |
|:----------|:-----|:---------|:------------|
| model | string | Yes | Model ID (e.g. openai/gpt-4) |
| messages | array | Yes | [{ role, content }] message array |
| session_id | string | | Continue an existing conversation |
| stream | boolean | | Stream response (default: true) |
| temperature | number | | Randomness 0–2 |
| max_tokens | number | | Max tokens to generate |
| tools | array | | OpenAI-compatible function definitions |
| tool_choice | string | | auto / none / required |
| top_p | number | | Nucleus sampling threshold |
| top_k | number | | Top-K sampling |
| frequency_penalty | number | | Frequency penalty (−2 to 2) |
| presence_penalty | number | | Presence penalty (−2 to 2) |
| reasoning | object | | Reasoning tokens ({ effort }) |
| provider | object | | Provider routing preferences |
| models | array | | Fallback model list for auto-routing |
| plugins | array | | OpenRouter plugins (e.g. web search) |
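For illustration, a minimal chat argument object following the parameters above might look like this (the model ID, message content, and option values are examples, not fixed choices):

```typescript
// Hypothetical chat arguments; field names follow the parameter table above.
interface ChatArgs {
  model: string;                                 // required, e.g. "openai/gpt-4"
  messages: { role: string; content: string }[]; // required message array
  stream?: boolean;                              // defaults to true per the table
  temperature?: number;                          // 0–2
  max_tokens?: number;
}

const chatArgs: ChatArgs = {
  model: "openai/gpt-4",
  messages: [{ role: "user", content: "Summarize MCP in one sentence." }],
  stream: false,
  temperature: 0.2,
};
```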

Model search

| Parameter | Type | Description |
|:----------|:-----|:------------|
| provider | string | Filter by provider (openai, anthropic, …) |
| keyword | string | Search in model names |
| min_context_length | number | Minimum context window |
| max_context_length | number | Maximum context window |
| modality | string | text, image, audio |
| min_price / max_price | number | Price range per token |
| supports_tools | boolean | Function calling support |
| supports_streaming | boolean | Streaming support |
| supports_temperature | boolean | Temperature parameter support |
| sort_by | string | price, context_length, provider |
| sort_order | string | asc or desc |
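As a sketch, a filter object combining the search parameters above (illustrative values only):

```typescript
// Hypothetical search filters: large-context Anthropic models with tool
// support, cheapest first. Field names follow the parameter table above.
const searchArgs = {
  provider: "anthropic",
  min_context_length: 100_000,
  supports_tools: true,
  sort_by: "price",
  sort_order: "asc",
};
```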

Model listing

| Parameter | Type | Description |
|:----------|:-----|:------------|
| provider | string | Filter by provider |
| keyword | string | Search in model names |
| min_context_length | number | Minimum context window |
| max_context_length | number | Maximum context window |
| modality | string | Filter by modality |
| min_price / max_price | number | Price range |

Image generation

| Parameter | Type | Required | Description |
|:----------|:-----|:---------|:------------|
| model | string | Yes | Image model ID |
| prompt | string | Yes | Image description |
| aspect_ratio | string | | 1:1, 16:9, 9:16, etc. |
| image_size | string | | 1K, 2K, or 4K |
| save_path | string | | Local save path (.png, .jpg, .webp) |
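An image-generation call might be shaped like this. The model ID below is a placeholder, not a real model; use the model-search tool to find a current image model:

```typescript
// Hypothetical image-generation arguments, following the table above.
// The model ID is a placeholder — look up a real one via model search.
const imageArgs = {
  model: "some-provider/some-image-model", // placeholder, not a real ID
  prompt: "A lighthouse at dusk, watercolor style",
  aspect_ratio: "16:9",
  image_size: "1K",
  save_path: "./lighthouse.png",
};
```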

Credits

No parameters. Returns credit limit, remaining balance, total usage, and daily/weekly/monthly breakdowns.

Cost summary

| Parameter | Type | Description |
|:----------|:-----|:------------|
| session_id | string | Costs for a specific session |
| recent_only | boolean | Only show recent entries |

Generation lookup

| Parameter | Type | Required | Description |
|:----------|:-----|:---------|:------------|
| generation_id | string | Yes | The generation ID to look up |

Returns tokens, cost, latency, model, and provider info.

Model endpoints

| Parameter | Type | Required | Description |
|:----------|:-----|:---------|:------------|
| model_slug | string | Yes | Model slug (e.g. openai/gpt-4) |

Returns all available providers with latency, uptime, pricing, and capabilities.


📁 Project Structure

OpenrouterMCP/
└── src/                    TypeScript MCP server
    ├── index.ts            Entry point
    ├── server/             Server bootstrap
    ├── api/                OpenRouter client, cache, rate limits
    ├── session/            Multi-turn conversation management
    ├── cost/               Cost tracking engine
    ├── schemas/            Shared Zod schemas
    ├── tools/              8 tool implementations
    │   ├── chat/
    │   ├── searchModels/
    │   ├── listModels/
    │   ├── imageGeneration/
    │   ├── credits/
    │   ├── costSummary/
    │   ├── generation/
    │   └── modelEndpoints/
    └── utils/              Logger, model validation

🛠 Development

npm install          # dependencies
npm run build        # compile
npm test             # 383 tests
npm run dev          # watch mode

🔍 Troubleshooting

If the key isn't set, export it in your shell:

echo 'export OPENROUTER_API_KEY=sk-or-v1-your-key' >> ~/.zshrc && source ~/.zshrc

If authentication fails, verify at openrouter.ai/keys that the key is correct and active.

If a model isn't found, check the ID format: model IDs use provider/model-name (e.g. openai/gpt-4). Use the openrouter_search_models tool to find current models — never hardcode IDs.
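As a sketch, the provider/model-name shape can be checked with a simple pattern (my own helper for illustration, not part of the server):

```typescript
// Illustrative helper: checks the provider/model-name shape described above.
// It only validates the format — whether the model exists is a separate question.
function looksLikeModelId(id: string): boolean {
  return /^[a-z0-9._-]+\/[a-z0-9._:-]+$/i.test(id);
}

looksLikeModelId("openai/gpt-4"); // true
looksLikeModelId("gpt-4");        // false — missing the provider segment
```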

If you're hitting rate limits: the server warns before you reach them. Upgrade your OpenRouter plan or space out requests.

If the server won't start:

  1. Check that the key is set: echo $OPENROUTER_API_KEY
  2. Verify your MCP config has the correct server entry
  3. Restart your MCP client
  4. Run npx -y openrouter-mcp-server directly to check for errors

MIT License