
LLM MCP Server

A Model Context Protocol (MCP) server that provides tools to query Large Language Models (LLMs) using the llm-querier library. It currently supports OpenAI and Google Gemini.

Installation

Option A: Global Installation

npm install -g @edjl/llm-mcp

Option B: Use with npx (no installation required)

npx -y @edjl/llm-mcp
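
The Cursor configuration below assumes a global install exposes an llm-mcp executable on your PATH; you can sanity-check that with:

which llm-mcp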

Configuration

Set the following environment variables:

OpenAI Configuration

  • OPENAI_API_KEY: Your OpenAI API key (required for OpenAI tool)
  • OPENAI_MODEL: OpenAI model to use (default: o3)

Google Gemini Configuration

  • GEMINI_API_KEY: Your Google Gemini API key (required for Gemini tool)
  • GEMINI_MODEL: Gemini model to use (default: gemini-2.5-pro)

Note: You can configure just one provider or both. The server will only enable tools for configured providers.
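
For example, if you launch the server from a POSIX shell, you might export the variables up front (placeholder keys; both providers are shown, but either pair can be omitted):

export OPENAI_API_KEY="your-openai-api-key"
export OPENAI_MODEL="o3"                # optional; o3 is the default
export GEMINI_API_KEY="your-gemini-api-key"
export GEMINI_MODEL="gemini-2.5-pro"    # optional; gemini-2.5-pro is the default
npx -y @edjl/llm-mcp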

Available Tools

llm_ask_openai

Ask OpenAI a single query prompt, providing as much context as possible. This is a single call; no conversation state is maintained between invocations.

Parameters:

  • prompt (required): The query prompt to send to OpenAI
  • context (optional): Array of additional context strings to enhance the prompt
  • examples (optional): Array of examples to guide the response
  • images (optional): Array of image URLs or base64 encoded images
  • scrapeUrls (optional): Array of URLs to scrape and include as context
  • fileUrls (optional): Array of file URLs to download and include as context

llm_ask_gemini

Ask Google Gemini a single query prompt, providing as much context as possible. This is a single call; no conversation state is maintained between invocations.

Parameters:

  • prompt (required): The query prompt to send to Google Gemini
  • context (optional): Array of additional context strings to enhance the prompt
  • examples (optional): Array of examples to guide the response
  • images (optional): Array of image URLs or base64 encoded images
  • videos (optional): Array of video URLs (Google AI supports video input)
  • scrapeUrls (optional): Array of URLs to scrape and include as context
  • fileUrls (optional): Array of file URLs to download and include as context

Usage with Cursor

Add to your Cursor settings:

Option A: Global Installation

{
  "mcpServers": {
    "llm-mcp": {
      "command": "llm-mcp",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "OPENAI_MODEL": "o3",
        "GEMINI_API_KEY": "your-gemini-api-key",
        "GEMINI_MODEL": "gemini-2.5-pro"
      }
    }
  }
}

Option B: Using npx

{
  "mcpServers": {
    "llm-mcp": {
      "command": "npx",
      "args": ["-y", "@edjl/llm-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "OPENAI_MODEL": "o3",
        "GEMINI_API_KEY": "your-gemini-api-key",
        "GEMINI_MODEL": "gemini-2.5-pro"
      }
    }
  }
}
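
Per the note under Configuration, you can also configure just one provider, and the server will register only that provider's tool. A Gemini-only variant of the npx setup would look like this sketch:

{
  "mcpServers": {
    "llm-mcp": {
      "command": "npx",
      "args": ["-y", "@edjl/llm-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key",
        "GEMINI_MODEL": "gemini-2.5-pro"
      }
    }
  }
}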

Examples

Basic Query to OpenAI

const result = await use_mcp_tool({
  server_name: "llm-mcp",
  tool_name: "llm_ask_openai",
  arguments: {
    prompt: "Explain the concept of quantum computing in simple terms"
  }
});

Query with Context and Examples

const result = await use_mcp_tool({
  server_name: "llm-mcp",
  tool_name: "llm_ask_gemini",
  arguments: {
    prompt: "Write a haiku about programming",
    context: ["Focus on the debugging process", "Make it humorous"],
    examples: ["Bugs hide in the code / Like ninjas in the shadows / Coffee is my sword"]
  }
});

Query with Web Scraping

const result = await use_mcp_tool({
  server_name: "llm-mcp",
  tool_name: "llm_ask_openai",
  arguments: {
    prompt: "Summarize the main points from this article",
    scrapeUrls: ["https://example.com/article"]
  }
});

Query with Images (Vision Models)

const result = await use_mcp_tool({
  server_name: "llm-mcp",
  tool_name: "llm_ask_gemini",
  arguments: {
    prompt: "What's in this image?",
    images: ["https://example.com/image.jpg"]
  }
});
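
Query with Videos and Files

The videos parameter is accepted only by llm_ask_gemini (see the parameter lists above), while fileUrls is available on both tools; the URLs here are placeholders:

const result = await use_mcp_tool({
  server_name: "llm-mcp",
  tool_name: "llm_ask_gemini",
  arguments: {
    prompt: "Summarize this clip and compare it against the attached document",
    videos: ["https://example.com/demo.mp4"],
    fileUrls: ["https://example.com/document.pdf"]
  }
});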

Notes

  • This MCP server uses the llm-querier library under the hood
  • Each query is independent - no conversation history is maintained
  • The server only loads tools for providers that have API keys configured
  • For more advanced usage, refer to the llm-querier documentation

License

MIT