
@mcpframework/docs

v0.1.0

MCP documentation server framework — spin up an MCP server from your Fumadocs site or llms.txt

Spin up an MCP documentation server from your Fumadocs site (or any site with llms.txt) in minutes. AI agents in Claude Code, Cursor, or any MCP client get tools to search, browse, and retrieve your documentation — enabling them to write correct integration code on the first try.

Built on top of mcp-framework.

Quick Start

```typescript
import { DocsServer, FumadocsRemoteSource } from "@mcpframework/docs";

const source = new FumadocsRemoteSource({
  baseUrl: "https://docs.myapi.com",
});

const server = new DocsServer({
  source,
  name: "my-api-docs",
  version: "1.0.0",
});

server.start();
```

Or scaffold a project instantly:

```sh
npx create-docs-mcp my-api-docs
```

Sources

FumadocsRemoteSource

Purpose-built for Fumadocs sites. Uses the native Orama search API for high-quality results, with automatic fallback to local text search.

```typescript
import { FumadocsRemoteSource } from "@mcpframework/docs";

const source = new FumadocsRemoteSource({
  baseUrl: "https://docs.myapi.com",    // Required
  searchEndpoint: "/api/search",        // Default: "/api/search"
  llmsTxtPath: "/llms.txt",             // Default: "/llms.txt"
  llmsFullTxtPath: "/llms-full.txt",    // Default: "/llms-full.txt"
  refreshInterval: 300_000,             // Cache TTL in ms (default: 5 min)
  headers: {                            // Optional custom headers
    Authorization: "Bearer ...",
  },
});
```

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| baseUrl | string | required | Base URL of your Fumadocs site |
| searchEndpoint | string | "/api/search" | Fumadocs Orama search endpoint |
| llmsTxtPath | string | "/llms.txt" | Path to llms.txt index |
| llmsFullTxtPath | string | "/llms-full.txt" | Path to full content |
| refreshInterval | number | 300000 | Cache TTL in milliseconds |
| headers | Record&lt;string, string&gt; | undefined | Custom HTTP headers |
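The "Orama search with automatic fallback to local text search" behavior described above can be sketched as a generic helper. This is a simplified illustration of the pattern only — the function name and internals are hypothetical, not the package's actual code:

```typescript
// Try the remote search first; fall back to a local matcher on any failure.
// A simplified sketch of the remote-with-fallback pattern, not library code.
async function searchWithFallback<T>(
  remote: () => Promise<T>,
  local: () => Promise<T>,
): Promise<T> {
  try {
    // Prefer the remote Orama endpoint when it is reachable.
    return await remote();
  } catch {
    // Any network or HTTP failure falls through to the local text search.
    return await local();
  }
}
```

The callers never see the failure mode: both paths resolve to the same result shape, which is why the source can transparently degrade when `/api/search` is unavailable.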

LlmsTxtSource

Works with any documentation site that publishes llms.txt and llms-full.txt files (Fumadocs, Docusaurus with plugin, etc.). Search is performed locally via text matching.

```typescript
import { LlmsTxtSource } from "@mcpframework/docs";

const source = new LlmsTxtSource({
  baseUrl: "https://docs.myapi.com",
  mdxPathPrefix: "/docs/",    // Default: "/"
  refreshInterval: 300_000,
});
```

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| baseUrl | string | required | Base URL of your docs site |
| llmsTxtPath | string | "/llms.txt" | Path to llms.txt index |
| llmsFullTxtPath | string | "/llms-full.txt" | Path to full content |
| mdxPathPrefix | string | "/" | Prefix for individual page .mdx fetching |
| refreshInterval | number | 300000 | Cache TTL in milliseconds |
| headers | Record&lt;string, string&gt; | undefined | Custom HTTP headers |
| cache | Cache | MemoryCache | Custom cache implementation |
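Local text matching over the parsed llms.txt entries can be sketched as a simple occurrence-count ranking. This is illustrative only — the entry shape and scoring here are assumptions, and the package's actual ranking may differ:

```typescript
// Hypothetical shape for a parsed documentation entry.
interface DocEntry {
  slug: string;
  title: string;
  content: string;
}

// Score each page by how often the query terms occur in its title and body,
// then return the highest-scoring pages. A sketch of local text search.
function localSearch(entries: DocEntry[], query: string, limit = 10): DocEntry[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return entries
    .map((entry) => {
      const haystack = `${entry.title} ${entry.content}`.toLowerCase();
      // Count occurrences of every query term.
      const score = terms.reduce(
        (sum, term) => sum + haystack.split(term).length - 1,
        0,
      );
      return { entry, score };
    })
    .filter((scored) => scored.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((scored) => scored.entry);
}
```

Because the whole index is fetched up front and cached, this kind of in-memory scan stays fast for typically-sized documentation sets without any external search service.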

Tools

The server exposes three MCP tools:

search_docs

Search documentation by keyword or phrase. Returns ranked results with excerpts.

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| query | string | yes | Search keywords or phrase |
| section | string | no | Filter to a specific section |
| limit | number | no | Max results (default 10, max 25) |

get_page

Retrieve the full markdown content of a documentation page.

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| slug | string | yes | Page slug or URL path |

list_sections

Browse the documentation tree structure to discover available content.

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| section | string | no | Filter to a section's children |
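Under the MCP protocol, clients invoke these tools with a standard `tools/call` JSON-RPC request. A `search_docs` call looks roughly like this (the argument values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": {
      "query": "authentication",
      "limit": 5
    }
  }
}
```

An agent typically chains these: `list_sections` to discover structure, `search_docs` to narrow down, then `get_page` to pull full markdown for the pages it needs.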

Fumadocs Setup

Your Fumadocs site needs to serve these endpoints:

  1. /llms.txt (required) — Generated by fumadocs-core's source.llms().index() utility
  2. /llms-full.txt (required for search) — Generated by source.llms().full()
  3. /api/search (optional) — Fumadocs' built-in Orama search API for better results

See the Fumadocs LLMs.txt documentation for setup instructions.

MCP Client Configuration

Claude Code

```sh
claude mcp add my-api-docs -- node /path/to/server/dist/index.js
```

Claude Desktop

Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "my-api-docs": {
      "command": "node",
      "args": ["/path/to/server/dist/index.js"],
      "env": {
        "DOCS_BASE_URL": "https://docs.myapi.com"
      }
    }
  }
}
```

Cursor

Add to your MCP settings:

```json
{
  "my-api-docs": {
    "command": "node",
    "args": ["/path/to/server/dist/index.js"],
    "env": {
      "DOCS_BASE_URL": "https://docs.myapi.com"
    }
  }
}
```

Custom Source Adapters

Implement the DocSource interface to create adapters for any documentation backend:

```typescript
import { DocSource, DocPage, DocSearchResult, DocSection } from "@mcpframework/docs";

class MyCustomSource implements DocSource {
  name = "my-source";

  async search(query: string, options?: { section?: string; limit?: number }) {
    // Your search implementation
    return [];
  }

  async getPage(slug: string) {
    // Fetch and return a page, or null
    return null;
  }

  async listSections() {
    // Return your documentation tree
    return [];
  }

  async getIndex() {
    return ""; // llms.txt content
  }

  async getFullContent() {
    return ""; // llms-full.txt content
  }

  async healthCheck() {
    return { ok: true };
  }
}
```

Caching

The built-in MemoryCache provides LRU eviction with TTL expiry. You can pass a custom cache:

```typescript
import { MemoryCache, LlmsTxtSource } from "@mcpframework/docs";

const cache = new MemoryCache({
  maxEntries: 200,    // Default: 100
  ttlMs: 600_000,     // Default: 300_000 (5 min)
});

const source = new LlmsTxtSource({
  baseUrl: "https://docs.myapi.com",
  cache,
});
```

Or implement the Cache interface for Redis, SQLite, etc.
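The mechanics of LRU eviction with TTL expiry can be sketched in a few lines. The `get`/`set` interface shown here is an assumption — the package's actual `Cache` contract may define different methods, so check its type definitions before implementing your own:

```typescript
// Assumed cache contract — the real Cache interface may differ.
interface SimpleCache<V> {
  get(key: string): V | undefined;
  set(key: string, value: V): void;
}

// LRU + TTL sketch: a Map preserves insertion order, so the first key is
// always the least recently used once we re-insert entries on access.
class LruTtlCache<V> implements SimpleCache<V> {
  private store = new Map<string, { value: V; expires: number }>();

  constructor(private maxEntries = 100, private ttlMs = 300_000) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.store.delete(key); // TTL expired: drop the stale entry
      return undefined;
    }
    // Re-insert to mark this entry as most recently used.
    this.store.delete(key);
    this.store.set(key, entry);
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.delete(key);
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
    if (this.store.size > this.maxEntries) {
      // Evict the least recently used entry (first in insertion order).
      const oldest = this.store.keys().next().value as string;
      this.store.delete(oldest);
    }
  }
}
```

A Redis- or SQLite-backed implementation would follow the same shape, delegating storage and expiry to the backend instead of an in-process Map.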

API

```typescript
// Main exports
import {
  DocsServer,
  LlmsTxtSource,
  FumadocsRemoteSource,
  MemoryCache,
  SearchDocsTool,
  GetPageTool,
  ListSectionsTool,
} from "@mcpframework/docs";

// Subpath imports
import { LlmsTxtSource, FumadocsRemoteSource } from "@mcpframework/docs/sources";
import { SearchDocsTool, GetPageTool, ListSectionsTool } from "@mcpframework/docs/tools";
import { MemoryCache } from "@mcpframework/docs/cache";
```

License

MIT