
litellm-mcp

v1.0.0

MCP server for managing LiteLLM proxy instances through Claude AI

LiteLLM MCP Server

A powerful Model Context Protocol (MCP) server written in TypeScript that enables seamless integration between Claude AI and LiteLLM proxy instances. Manage models, API keys, and monitoring directly through Claude's interface.

🚀 Features

  • Model Management: List and inspect all available models in your LiteLLM instance
  • API Key Generation: Create virtual API keys with custom aliases for rate limiting and monitoring
  • User Management: Organize and manage users and their associated API keys
  • Spend Tracking: Monitor API usage and costs per user
  • Docker Native: Runs as a containerized service with secure stdio communication
  • Seamless Integration: Works directly with VSCode and Claude through MCP protocol

📋 Prerequisites

  • Node.js 18+ or Docker & Docker Compose
  • Running LiteLLM proxy instance (v1.79.0+)
  • PostgreSQL database (for LiteLLM)
  • Redis instance (for caching/rate limiting)
  • VSCode with MCP extension support

🔧 Installation

Option 1: Install from npm (Recommended)

npm install -g litellm-mcp

This installs the pre-built executable globally on your system.

Option 2: Clone and Build from Source

git clone https://github.com/ArtemisAI/LiteLLM-MCP-Server.git
cd LiteLLM-MCP-Server
npm install
npm run build

3. Configure Environment

Copy the example configuration and update with your credentials:

cp .vscode/mcp.json.example .vscode/mcp.json

Edit .vscode/mcp.json with your LiteLLM proxy details:

For Docker deployment:

{
  "servers": {
    "litellm-manager": {
      "type": "stdio",
      "command": "docker",
      "args": ["exec", "-i", "litellm_mcp", "node", "dist/index.js"],
      "env": {
        "LITELLM_API_BASE": "http://localhost:4001",
        "LITELLM_MASTER_KEY": "sk-your-api-key",
        "DEBUG": "false"
      }
    }
  }
}

For local Node.js deployment:

{
  "servers": {
    "litellm-manager": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/LiteLLM-MCP-Server/dist/index.js"],
      "env": {
        "LITELLM_API_BASE": "http://localhost:4001",
        "LITELLM_MASTER_KEY": "sk-your-api-key",
        "DEBUG": "false"
      }
    }
  }
}

4. Build Docker Image (Optional)

docker build -t litellm_mcp:latest .

5. Run Container (Optional)

docker run -d --name litellm_mcp \
  --network litellm_litellm_network \
  -e LITELLM_API_BASE=http://litellm-llm-1:4000 \
  -e LITELLM_MASTER_KEY=sk-your-key \
  litellm_mcp:latest sleep infinity

6. Enable in VSCode

The MCP server connects automatically once it is configured in .vscode/mcp.json. On startup, VSCode discovers the server and registers the following tools, which then become available in Claude:

🛠️ Available Tools

list_models

List all available models in your LiteLLM instance.

Usage in Claude:

"List the models available in LiteLLM"

Returns: Array of model IDs with metadata
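
Under the hood, a natural-language request like the one above is turned into a tool invocation that travels to this server as a JSON-RPC message over stdio. A sketch of what the client would send, assuming the standard MCP tools/call framing:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_models",
    "arguments": {}
  }
}
```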


get_model_info

Retrieve detailed information about a specific model.

Parameters:

  • model (string): Model name/ID

Usage in Claude:

"Tell me about the Gem-2.5-flash model"

Returns: Model metadata including owner and creation info


create_virtual_key

Generate a new virtual API key for rate limiting and user management.

Parameters:

  • key_alias (string): Friendly name for the key
  • user_id (string): User identifier to associate

Usage in Claude:

"Create an API key called 'production-app' for user 'app-001'"

Returns: New API key with full configuration


get_spend

Monitor API usage and costs for a specific user.

Parameters:

  • user_id (string): User to check spend for

Usage in Claude:

"Show me the spend for user 'app-001'"

Returns: Usage statistics and cost breakdown
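
The four tools above are thin wrappers over LiteLLM's management REST API. As a rough illustration of that mapping (the endpoint paths are assumptions based on LiteLLM's public proxy docs, e.g. /v1/models and /key/generate, and this is not this package's actual source), a request builder might look like:

```typescript
// Hypothetical sketch of how the MCP tools could map onto LiteLLM's REST API.
// Endpoint paths like /v1/models and /key/generate are assumptions taken from
// LiteLLM's public proxy documentation, not code from this package.

interface LiteLLMConfig {
  apiBase: string;   // e.g. LITELLM_API_BASE
  masterKey: string; // e.g. LITELLM_MASTER_KEY
}

interface BuiltRequest {
  url: string;
  method: "GET" | "POST";
  headers: Record<string, string>;
  body?: string;
}

function buildRequest(cfg: LiteLLMConfig, path: string, payload?: object): BuiltRequest {
  return {
    url: `${cfg.apiBase}${path}`,
    // Management reads are GETs; key creation and similar actions POST a body.
    method: payload ? "POST" : "GET",
    headers: {
      Authorization: `Bearer ${cfg.masterKey}`,
      "Content-Type": "application/json",
    },
    ...(payload ? { body: JSON.stringify(payload) } : {}),
  };
}

// create_virtual_key would roughly translate to POST /key/generate:
const cfg: LiteLLMConfig = {
  apiBase: "http://localhost:4001",
  masterKey: "sk-your-api-key",
};
const req = buildRequest(cfg, "/key/generate", {
  key_alias: "production-app",
  user_id: "app-001",
});
console.log(req.method, req.url); // POST http://localhost:4001/key/generate
```

The master key never appears in the URL, only in the Authorization header, which keeps it out of proxy and server access logs.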


📁 Project Structure

LiteLLM-MCP-Server/
├── src/
│   └── index.ts                 # Main TypeScript MCP server implementation
├── dist/                        # Compiled JavaScript (generated)
│   └── index.js
├── .vscode/
│   ├── mcp.json.example         # Configuration template
│   ├── mcp.docker.json          # Docker configuration template
│   └── mcp.json                 # User config (gitignored)
├── .github/
│   └── FUNDING.yml              # GitHub sponsorship config
├── tests/                       # Test plans and documentation
│   ├── TEST_PLAN.md            # Comprehensive testing strategy
│   └── DOCKER_VS_NPM.md        # Architecture decision docs
├── README.md                    # This file
├── CONTRIBUTING.md              # Contribution guidelines
├── SECURITY.md                  # Security policy
├── DEPLOYMENT.md                # Deployment guide
├── LICENSE                      # MIT License
├── .gitignore                   # Git ignore rules
├── package.json                 # Node.js package metadata
├── tsconfig.json                # TypeScript configuration
└── Dockerfile                   # Container build config

🔐 Security

This project handles sensitive information including API keys and database credentials. Security is paramount.

Key Guidelines

  1. Never commit secrets - Use .vscode/mcp.json.example as a template
  2. Use environment variables - All sensitive data via env vars only
  3. Rotate credentials regularly - Update keys and passwords periodically
  4. Restrict network access - Run on secured networks only
  5. Disable DEBUG mode - Set DEBUG=false in production
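
Guideline 2 (all sensitive data via env vars) is easiest to enforce with a fail-fast check at server startup, so a missing key aborts immediately instead of causing unauthenticated calls later. A minimal sketch, where requireEnv and loadConfig are illustrative helpers and not part of this package's actual source:

```typescript
// Fail fast at startup if a required secret is missing. requireEnv and
// loadConfig are hypothetical helpers for illustration, not litellm-mcp's
// actual implementation.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value || value.length === 0) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

function loadConfig() {
  return {
    apiBase: requireEnv("LITELLM_API_BASE"),
    masterKey: requireEnv("LITELLM_MASTER_KEY"),
    // DEBUG defaults to "false" so production stays quiet unless opted in.
    debug: (process.env.DEBUG ?? "false") === "true",
  };
}
```

Because the defaults lean safe (DEBUG off unless explicitly "true"), forgetting a variable produces a loud startup error rather than a silently misconfigured server.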

Reporting Security Issues

If you discover a security vulnerability, do not open a public issue. Instead, email: [email protected]

See SECURITY.md for detailed security policies and procedures.

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for:

  • Code style and standards
  • Testing requirements
  • Pull request process
  • Issue reporting

Quick Start for Contributors

# Fork and clone
git clone https://github.com/YOUR_USERNAME/LiteLLM-MCP-Server.git
cd LiteLLM-MCP-Server

# Create feature branch
git checkout -b feature/your-feature

# Make changes and test
# ... your code ...

# Submit pull request

💰 Support the Project

If you find this project useful, consider supporting it:

  • Star on GitHub - Help others discover the project
  • 💬 Contribute - Submit issues and pull requests
  • 🤝 Sponsor - Support ongoing development

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

MIT License Summary

✅ Commercial use
✅ Modification
✅ Distribution
✅ Private use
⚠️ No liability
⚠️ No warranty

🙏 Acknowledgments

  • LiteLLM - Excellent proxy server for LLM APIs
  • Anthropic - Claude AI and MCP protocol
  • OpenAI - ChatGPT and foundation models

Made with ❤️ by ArtemisAI

Last Updated: November 6, 2025