@supermodeltools/mcp-server

v0.10.0

MCP server for Supermodel API - code graph generation for AI agents

Supermodel MCP Server

MCP server that gives AI agents instant codebase understanding via the Supermodel API. Pre-computed code graphs enable sub-second responses for symbol lookups, call-graph traversal, and cross-subsystem analysis.

Install

Quick Setup (Recommended)

curl -sSL https://raw.githubusercontent.com/supermodeltools/mcp/main/setup.sh | bash

Download, review, then execute:

curl -sSL https://raw.githubusercontent.com/supermodeltools/mcp/main/setup.sh -o setup.sh
cat setup.sh
chmod +x setup.sh
./setup.sh

Or clone the entire repo:

git clone https://github.com/supermodeltools/mcp.git
cd mcp
./setup.sh

Manual Install

npm install -g @supermodeltools/mcp-server

Or run directly:

npx @supermodeltools/mcp-server

Configuration

Get your API key from the Supermodel Dashboard.

| Variable | Description |
|----------|-------------|
| SUPERMODEL_API_KEY | Your Supermodel API key (required) |
| SUPERMODEL_BASE_URL | Override API base URL (optional) |
| SUPERMODEL_CACHE_DIR | Directory for pre-computed graph cache files (optional) |
| SUPERMODEL_TIMEOUT_MS | API request timeout in ms (default: 900000 / 15 min) |
| SUPERMODEL_NO_API_FALLBACK | Set to disable on-demand API calls; cache-only mode (optional) |
| SUPERMODEL_EXPERIMENT | Experiment mode. Set to graphrag to enable GraphRAG tools (optional) |
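As a sketch of how these variables combine, a cache-only setup (no on-demand API calls) might export the following; all values are placeholders:

```shell
# Hypothetical cache-only setup; values are placeholders, adjust for your machine.
export SUPERMODEL_API_KEY="your-api-key"
export SUPERMODEL_CACHE_DIR="$HOME/.supermodel-cache"
export SUPERMODEL_NO_API_FALLBACK=1   # serve only from pre-computed graphs
```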

Global Setup (Recommended)

Set your API key globally in your shell profile so it's available to all MCP clients:

# Add to ~/.zshrc (macOS) or ~/.bashrc (Linux)
export SUPERMODEL_API_KEY="your-api-key"

With the API key set globally, you can omit the env block from your MCP configs:

{
  "mcpServers": {
    "supermodel": {
      "command": "npx",
      "args": ["-y", "@supermodeltools/mcp-server"]
    }
  }
}

Usage

Claude Code CLI

claude mcp add supermodel --env SUPERMODEL_API_KEY=your-api-key -- npx -y @supermodeltools/mcp-server

Or if SUPERMODEL_API_KEY is already set in your shell environment:

claude mcp add supermodel -- npx -y @supermodeltools/mcp-server

Verify installation:

claude mcp list

Cursor

Add to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "supermodel": {
      "command": "npx",
      "args": ["-y", "@supermodeltools/mcp-server"],
      "env": {
        "SUPERMODEL_API_KEY": "your-api-key"
      }
    }
  }
}

Default Working Directory

For benchmarking tools or batch processing, pass a default working directory as a CLI argument:

npx @supermodeltools/mcp-server /path/to/repository

Tools will use this directory automatically if no explicit directory parameter is given.
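The same default can be sketched in an MCP client config by appending the path to the args array (the path here is a placeholder):

```json
{
  "mcpServers": {
    "supermodel": {
      "command": "npx",
      "args": ["-y", "@supermodeltools/mcp-server", "/path/to/repository"]
    }
  }
}
```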

Tools

symbol_context (Default Mode)

Deep dive on a specific function, class, or method. Given a symbol name, instantly returns its definition location, source code, all callers, all callees, domain membership, and related symbols in the same file.

Output includes:

  • Definition location (file, line range) and source code
  • Callers (who calls this symbol)
  • Callees (what this symbol calls)
  • Architectural domain membership
  • Related symbols in the same file
  • File import statistics

Parameters:

| Argument | Type | Required | Description |
|----------|------|----------|-------------|
| symbol | string | No* | Name of the function, class, or method. Supports ClassName.method syntax and partial matching. |
| symbols | string[] | No* | Array of symbol names for batch lookup. More efficient than multiple calls. |
| directory | string | No | Path to repository directory. Omit if server was started with a default workdir. |
| brief | boolean | No | Return compact output (no source code). Recommended for 3+ symbols. |

* Either symbol or symbols must be provided.

Example prompts:

  • "Look up the symbol filter_queryset in this codebase"
  • "What calls QuerySet.filter and what does it call?"
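At the protocol level, a batch lookup can be sketched as a JSON-RPC tools/call request. The argument names come from the table above; the envelope follows the generic MCP shape, not anything specific to this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "symbol_context",
    "arguments": {
      "symbols": ["QuerySet.filter", "filter_queryset"],
      "brief": true
    }
  }
}
```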

GraphRAG Mode (Experimental)

Activate with SUPERMODEL_EXPERIMENT=graphrag. Replaces symbol_context with a graph-oriented tool for call-graph traversal and cross-subsystem analysis.

explore_function

Breadth-first (BFS) traversal of the call graph around a function, class, or method. Shows source code, callers, callees, and cross-subsystem boundaries with ← DIFFERENT SUBSYSTEM markers.

Parameters:

| Argument | Type | Required | Description |
|----------|------|----------|-------------|
| symbol | string | Yes | Function, class, or method name to explore. Supports partial matching and ClassName.method syntax. |
| direction | string | No | downstream (callees), upstream (callers), or both (default). |
| depth | number | No | Hops to follow: 1–3 (default: 2). |
| directory | string | No | Repository path. |

Output: Readable narrative showing upstream/downstream neighbors with domain context at each hop.
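A call to this tool can be sketched the same way as any MCP tools/call request, using the parameters from the table above (the JSON-RPC envelope is the generic MCP shape, not server-specific):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "explore_function",
    "arguments": {
      "symbol": "QuerySet.filter",
      "direction": "downstream",
      "depth": 2
    }
  }
}
```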

Recommended Workflow

Default mode:

  1. Identify symbols from the issue and call symbol_context to explore them (batch via symbols array or parallel calls)
  2. Use Read/Grep to examine source code at identified locations
  3. Start editing by turn 3. Max 3 MCP calls total.

GraphRAG mode:

  1. Identify key symbols from the issue, call explore_function to understand their call-graph context. Issue multiple calls in parallel (read-only, safe).
  2. Use the cross-subsystem markers and source code from the response to start editing. Max 2 MCP calls total.

Pre-computed Graphs

For fastest performance, pre-compute graphs ahead of time using the precache CLI subcommand. This calls the Supermodel API once and saves the result to disk, enabling sub-second tool responses with no API calls at runtime.

Pre-compute a graph

npx @supermodeltools/mcp-server precache /path/to/repo --output-dir ./supermodel-cache

Options:

  • --output-dir <dir> — Directory to save the cache file (default: ./supermodel-cache or SUPERMODEL_CACHE_DIR)
  • --name <name> — Repository name for the cache key (default: auto-detected from git remote + commit hash)

Use cached graphs at runtime

SUPERMODEL_CACHE_DIR=./supermodel-cache npx @supermodeltools/mcp-server

The server loads all cached graphs from SUPERMODEL_CACHE_DIR at startup. If no cache exists for a given repository, the server falls back to an on-demand API call (which takes 5-15 minutes for large repos).

Startup precaching

Use the --precache flag to automatically generate and cache the graph for the default workdir on server startup:

npx @supermodeltools/mcp-server /path/to/repo --precache

This is useful in automated environments (e.g., Docker containers for benchmarking) where you want the graph ready before any tool calls.

Benchmarking

Benchmark this MCP server using mcpbr with the provided mcpbr-config.yaml configuration.

Local Development

Building from Source

git clone https://github.com/supermodeltools/mcp.git
cd mcp
npm install
npm run build

Running Locally

node dist/index.js                    # Start MCP server
node dist/index.js /path/to/repo      # With default workdir
node dist/index.js precache /path/to/repo  # Pre-compute graph

Running Tests

npm test              # Run all tests
npm run test:coverage # Run with coverage
npm run typecheck     # Type checking

Using MCP Inspector

For interactive testing, use the MCP Inspector:

npx @modelcontextprotocol/inspector node dist/index.js

Troubleshooting

Timeout Errors

The first analysis of a repository requires an API call that can take 5-15 minutes. If your MCP client times out:

  1. Pre-compute the graph — Use precache to generate the graph ahead of time (see Pre-computed Graphs)
  2. Increase your MCP client timeout — For Claude Code CLI, set MCP_TOOL_TIMEOUT=900000 in your shell profile
  3. Analyze a subdirectory — Target specific parts of your codebase to reduce analysis time
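For option 2, the shell-profile change can be sketched as a single export that matches the server's default SUPERMODEL_TIMEOUT_MS of 15 minutes:

```shell
# Raise the Claude Code CLI tool timeout to 900000 ms (15 min) so it does not
# give up before the first on-demand analysis finishes.
# Add to ~/.zshrc (macOS) or ~/.bashrc (Linux).
export MCP_TOOL_TIMEOUT=900000
```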

Common Issues

  • 401 Unauthorized: Check SUPERMODEL_API_KEY is set correctly
  • Permission denied: Check read permissions on the directory
  • ENOTFOUND or connection errors: Check your internet connection and firewall settings
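These checks can be run by hand before filing an issue. The helper functions below are illustrations only, not part of the package:

```shell
# Hypothetical preflight checks mirroring the issues above.
check_key() {
  # Empty key -> the server would respond with 401 Unauthorized.
  if [ -z "$1" ]; then echo "missing API key"; else echo "API key set"; fi
}

check_dir() {
  # Unreadable directory -> "Permission denied" at analysis time.
  if [ -r "$1" ]; then echo "readable"; else echo "not readable"; fi
}

check_key "$SUPERMODEL_API_KEY"
check_dir "/path/to/repo"
```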

Debug Logging

Set the DEBUG environment variable for verbose logging:

{
  "mcpServers": {
    "supermodel": {
      "command": "npx",
      "args": ["-y", "@supermodeltools/mcp-server"],
      "env": {
        "SUPERMODEL_API_KEY": "your-api-key",
        "DEBUG": "supermodel:*"
      }
    }
  }
}

Links