
model-scout-mcp

v0.2.1

MCP server for discovering, comparing, and selecting LLM models across providers with real-time pricing and capabilities

Downloads: 63

Model Scout MCP

An MCP (Model Context Protocol) server for discovering, comparing, and selecting LLM models across providers with real-time pricing and capabilities.

Supports both stdio and Streamable HTTP transports.

Problem Statement

Large Language Models often have outdated knowledge about:

  • Newly released models and their exact names/IDs
  • Current token pricing
  • Model capabilities and supported features

This MCP server solves these problems by providing real-time access to model catalogs and intelligent search/comparison tools.

Current Status

Phase: ✅ Implemented and Tested

Provider Support:

  • ✅ OpenRouter - currently supported (100+ models from 50+ providers)
  • 🔄 Future: additional providers may be added

Note: OpenRouter is currently the only supported provider because it provides comprehensive programmatic access to model pricing and capabilities. Not all model providers offer APIs for cost inspection, making OpenRouter an ideal aggregator for this use case.

Documentation

Tool Architecture (2 Tools)

1. get_model

Direct lookup of a specific model by ID. Returns complete details including API endpoint information.

2. consider_models

Flexible tool for exploring, comparing, and analyzing models based on user needs. Handles:

  • Filtering and searching
  • Model comparison
  • Cost analysis and projections
  • Recommendations with trade-offs

Design Philosophy: One tool for lookup, one tool for consideration. The consider_models tool adapts its behavior to the request: it can be as simple as listing free models or as complex as multi-factor decision analysis with cost projections.
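
The design above can be illustrated with a hypothetical MCP tools/call request. The JSON-RPC envelope is standard MCP protocol; the argument names shown are illustrative only and not the server's actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consider_models",
    "arguments": {
      "query": "cheap model with vision support",
      "max_prompt_price_per_million": 1.0,
      "sort": "price"
    }
  }
}
```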

Key Features

Model Discovery

  • Search by natural language ("cheap instructional model")
  • Filter by capabilities (vision, tools, reasoning)
  • Filter by pricing, context length, provider
  • Sort by various criteria (price, context, recency)
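
The filtering described above happens in memory against the cached catalog. A minimal sketch, assuming model objects follow the OpenRouter /models response shape (id, context_length, and pricing with per-token USD strings) — the filter options here are illustrative, not the server's actual parameters:

```javascript
// Sketch: in-memory filtering of a cached model catalog.
// Field names (id, context_length, pricing.prompt) follow the
// OpenRouter /models response; the option names are illustrative.
function filterModels(models, { maxPromptPrice, minContext } = {}) {
  return models.filter(
    (m) =>
      (maxPromptPrice === undefined || Number(m.pricing.prompt) <= maxPromptPrice) &&
      (minContext === undefined || m.context_length >= minContext)
  );
}

// Hypothetical two-model catalog for illustration
const catalog = [
  { id: "a/cheap-8k", context_length: 8192, pricing: { prompt: "0.0000005" } },
  { id: "b/pricey-200k", context_length: 200000, pricing: { prompt: "0.00001" } },
];

console.log(filterModels(catalog, { maxPromptPrice: 0.000001 }).map((m) => m.id));
// [ 'a/cheap-8k' ]
```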

Cost Optimization

  • Calculate estimated costs for workloads
  • Find cheapest models meeting requirements
  • Compare pricing across multiple models
  • Identify cost-effective alternatives
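
Cost estimation reduces to multiplying token counts by per-token rates. A sketch, assuming OpenRouter's pricing fields are USD-per-token strings (e.g. "0.000003"); the function name is illustrative:

```javascript
// Sketch: estimate a workload's cost from OpenRouter-style pricing,
// where pricing.prompt and pricing.completion are USD-per-token strings.
function estimateCost(pricing, promptTokens, completionTokens) {
  const promptCost = Number(pricing.prompt) * promptTokens;
  const completionCost = Number(pricing.completion) * completionTokens;
  return { promptCost, completionCost, total: promptCost + completionCost };
}

// Example: 100K prompt tokens + 20K completion tokens at hypothetical rates
const cost = estimateCost(
  { prompt: "0.000003", completion: "0.000015" },
  100_000,
  20_000
);
console.log(cost.total.toFixed(2)); // "0.60"
```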

Intelligent Recommendations

  • Use-case based recommendations
  • Trade-off analysis (cost vs capability)
  • Alternative suggestions
  • Personalized based on requirements

Use Cases

For LLM Applications

  • Grounding: Get current model names and IDs
  • Cost Planning: Estimate API costs before deployment
  • Optimization: Find cheaper alternatives with similar capabilities
  • Discovery: Find new models matching specific criteria

Example Queries

"Find me a cheap instructional model"
→ consider_models with a natural-language query and price filters

"What's the latest GPT-4 variant?"
→ consider_models filtered by provider, sorted by recency

"Compare Claude Opus vs Sonnet pricing"
→ consider_models comparing both model IDs

"I need a model with vision for under $1 per million tokens"
→ consider_models with modality and price filters

"What will it cost to process 100K tokens with GPT-4?"
→ consider_models cost projection with a model ID and token counts

"Is there a cheaper alternative to Claude Opus?"
→ consider_models recommendations optimized for cost

Technical Approach

Data Source

  • OpenRouter API: https://openrouter.ai/api/v1/models
  • Authenticated via bearer token
  • Returns comprehensive model catalog with pricing and capabilities

Caching Strategy

  • Cache model list for 10 minutes (configurable)
  • In-memory filtering and search
  • force_refresh parameter to bypass cache
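
The strategy above can be sketched as a simple TTL cache. This is a minimal illustration, not the server's actual implementation; fetchModels stands in for the real OpenRouter API call:

```javascript
// Sketch of the described caching strategy: keep the model list for a
// configurable TTL (10 minutes by default) and allow a force-refresh bypass.
const TTL_MS = 10 * 60 * 1000;
let cache = { models: null, fetchedAt: 0 };

async function getModels(fetchModels, { forceRefresh = false } = {}) {
  const stale = Date.now() - cache.fetchedAt > TTL_MS;
  if (forceRefresh || stale || !cache.models) {
    cache = { models: await fetchModels(), fetchedAt: Date.now() };
  }
  return cache.models;
}
```

Filtering and search then run against the cached array, so only the initial fetch (or an explicit refresh) touches the network.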

Implementation Phases

Phase 1 (MVP): Core tools

  • list_models, get_model, calculate_cost

Phase 2 (Enhanced): Intelligence

  • search_models, compare_models, find_cheapest

Phase 3 (Advanced): AI-powered

  • recommend_model, find_alternatives

The phase-specific tools listed above have since been consolidated into the two-tool architecture described earlier (get_model and consider_models).

API Key Setup

Create a .env file with:

OPENROUTER_API_KEY=sk-or-v1-...

Get your API key from: https://openrouter.ai/keys

Provider Details

OpenRouter

OpenRouter aggregates models from multiple providers:

  • OpenAI (GPT-4, GPT-3.5, etc.)
  • Anthropic (Claude models)
  • Meta (Llama models)
  • Google (Gemini, PaLM)
  • Mistral AI
  • And 50+ more providers

API Documentation: https://openrouter.ai/docs/api/reference/overview

Future Enhancements

  • Historical pricing tracking
  • Performance benchmarks integration
  • Model deprecation alerts
  • Additional provider support
  • Cost prediction based on usage patterns
  • Bulk comparison tools
  • Custom filtering DSL

Installation

Prerequisites

Option 1: Install via npm (Recommended)

Install globally or use with npx:

# Install globally
npm install -g model-scout-mcp

# Or use with npx (no installation needed)
npx model-scout-mcp

Option 2: Install from Source

# Clone the repository
git clone https://github.com/danielrosehill/Model-Scout-MCP.git
cd Model-Scout-MCP

# Install dependencies
npm install

# Link for local testing (optional)
npm link

# Test the server
npm test  # or: node test-server.js

# Test MCP protocol (optional)
./test-mcp-server.sh

Configuration

Claude Desktop (stdio transport)

Add to your MCP settings file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Using npx (Recommended)

{
  "mcpServers": {
    "model-scout": {
      "command": "npx",
      "args": ["-y", "model-scout-mcp"],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here"
      }
    }
  }
}

Using Global Install

{
  "mcpServers": {
    "model-scout": {
      "command": "model-scout-mcp",
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here"
      }
    }
  }
}

Using from Source

{
  "mcpServers": {
    "model-scout": {
      "command": "node",
      "args": ["/absolute/path/to/Model-Scout-MCP/index.js"],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here"
      }
    }
  }
}

Note: Replace your-api-key-here with your actual OpenRouter API key from https://openrouter.ai/keys

HTTP Transport (Streamable HTTP)

For clients that support HTTP-based MCP connections, you can run the server in HTTP mode:

# Start the HTTP server
MCP_TRANSPORT=http OPENROUTER_API_KEY=your-key node index.js

# Or with npm scripts
npm run start:http

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| OPENROUTER_API_KEY | Your OpenRouter API key (required) | - |
| MCP_TRANSPORT | Transport mode: stdio or http | stdio |
| MCP_PORT | HTTP server port (HTTP mode only) | 3000 |
| MCP_HOST | HTTP server host (HTTP mode only) | 127.0.0.1 |

HTTP Endpoints

When running in HTTP mode:

  • MCP Endpoint: POST/GET http://127.0.0.1:3000/mcp
  • Health Check: GET http://127.0.0.1:3000/health

MCP Client Configuration (HTTP)

For MCP clients that support streamable HTTP transport:

{
  "mcpServers": {
    "model-scout": {
      "url": "http://127.0.0.1:3000/mcp",
      "transport": "streamable-http"
    }
  }
}

Contributing

Contributions welcome! Please open an issue or PR on GitHub.

License

MIT License. See the LICENSE file for details.

Author

Daniel Rosehill