
recursion-mcp

v1.0.2


Recursive Language Model MCP server for unbounded document processing with local RAG


Recursion MCP


An NPX-installable MCP (Model Context Protocol) server for comprehensive document analysis with two powerful approaches:

  • V1: RAG + RLM with local Ollama (retrieval-based Q&A and deep analysis)
  • V2: Navigation-enabled recursive analysis (file-system based, no external APIs)

Quick Start

# Run directly without installing
npx recursion-mcp

# Or install globally
npm install -g recursion-mcp
recursion-mcp

Features

V1: RAG + RLM (Local AI)

  • Document Ingestion: PDF, DOCX, XLSX, TXT, MD support
  • Smart Chunking: Overlapping chunks preserving structure
  • Local Embeddings: Via Ollama (nomic-embed-text) for semantic search
  • SQLite + FTS5: Zero-infrastructure storage
  • Local RAG: Answer generation with Ollama (llama3, mistral, etc.)
  • RLM Engine: Recursive Language Model for unlimited context via Python
  • Hybrid Search: Vector and keyword results combined with reciprocal rank fusion
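
The overlapping-chunking idea above can be sketched in a few lines. This is an illustration only, not the package's actual implementation; the chunk size and overlap values are made-up defaults:

```javascript
// Illustrative sketch of overlapping chunking (not the package's actual code).
// Each chunk repeats the tail of the previous one, so content that spans a
// chunk boundary appears intact in at least one chunk.
function chunkText(text, chunkSize = 500, overlap = 100) {
  const chunks = [];
  const step = chunkSize - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}
```

With `chunkSize = 4` and `overlap = 2`, the string `"abcdefghij"` yields `["abcd", "cdef", "efgh", "ghij"]`: every boundary character is covered twice.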

V2: Navigation Analysis (Agent-Driven)

  • No External APIs: Pure file-system based, works offline
  • Complete Document Analysis: full coverage, with none of the missed content that brittle RAG retrieval can produce
  • Hierarchical Navigation: Read any section, any line range
  • Persistent Analysis: Save and retrieve agent-generated insights
  • Markdown Conversion: PDF/DOCX → structured markdown
  • Recursive Reading: Agent-controlled systematic analysis
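
"Read any section, any line range" can be pictured as a simple slice over the stored markdown. The helper below is a hypothetical sketch of that idea, not the server's actual code:

```javascript
// Illustrative sketch of line-range reading over stored markdown
// (hypothetical helper, not the server's implementation).
// startLine is 1-indexed; returns at most maxLines lines from there.
function readLines(markdown, startLine, maxLines) {
  const lines = markdown.split("\n");
  return lines.slice(startLine - 1, startLine - 1 + maxLines).join("\n");
}
```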

Installation

Via NPX (recommended)

# Default: Run V1 (RAG/RLM)
npx recursion-mcp

# Run V2 (Navigation Analysis)
npx recursion-mcp recursion-mcp-v2

Global Install

npm install -g recursion-mcp

# Run V1
recursion-mcp

# Run V2
recursion-mcp-v2

Automatic IDE Configuration

When you install globally, the package automatically configures MCP for detected IDEs:

| IDE | Auto-Configured |
|-----|-----------------|
| Windsurf | ✅ Yes |
| Claude Desktop | ✅ Yes |
| Cursor | ✅ Yes |
| VSCode | ✅ Yes (with MCP extension) |

Restart your IDE after installation to see the MCP tools.

Manual Setup (if auto-config fails)

# Run setup manually
npm run setup --prefix $(npm root -g)/recursion-mcp

Or manually add to your IDE's MCP settings (see MCP Configuration section below).

Prerequisites

For V1 (RAG/RLM):

  • Node.js 18+
  • Ollama (for embeddings and LLM)
    ollama pull nomic-embed-text
    ollama pull llama3
  • Python 3 (for RLM REPL)

For V2 (Navigation):

  • Node.js 18+ only (no other dependencies!)

MCP Configuration

V1 Configuration (RAG/RLM)

{
  "mcpServers": {
    "recursion": {
      "command": "npx",
      "args": ["recursion-mcp"]
    }
  }
}

V2 Configuration (Navigation)

{
  "mcpServers": {
    "recursion-v2": {
      "command": "npx",
      "args": ["recursion-mcp", "recursion-mcp-v2"]
    }
  }
}

Which Version to Use?

| Use Case | Recommended |
|----------|-------------|
| Quick Q&A on documents | V1 - RAG with Ollama |
| Deep analysis of large docs | V2 - Navigation (complete coverage) |
| No internet/external APIs | V2 - Pure file system |
| Code/math analysis | V1 - RLM with Python REPL |
| Complete book review | V2 - Systematic section analysis |

V1 Tools (RAG/RLM)

ingest_document

Ingest a document into the knowledge base.

{
  "filePath": "/path/to/document.pdf",
  "title": "Optional Title"
}

search_documents

Search across all ingested documents using hybrid search.

{
  "query": "search query",
  "topK": 10,
  "docId": "optional-doc-id"
}

ask_documents

Ask a question and get an answer using RAG with Ollama.

{
  "question": "What is the main topic?",
  "topK": 5,
  "docId": "doc-id"
}

rlm_analyze

Use Recursive Language Model for unlimited context analysis.

{
  "query": "Analyze the contract terms",
  "docId": "doc-id",
  "maxDepth": 1,
  "maxIterations": 20
}

list_documents

List all ingested documents.

delete_document

Delete a document and all its data.

V2 Tools (Navigation)

ingest_document_v2

Convert and store document with navigable structure.

{
  "filePath": "/path/to/document.pdf",
  "title": "Optional Title"
}

get_document_structure

Get hierarchical outline (chapters, sections, subsections).

{
  "docId": "document-id",
  "depth": 2
}

read_section

Read a specific section by ID.

{
  "docId": "document-id",
  "sectionId": "section-id",
  "maxLines": 100
}

search_document

Search for text with context lines.

{
  "docId": "document-id",
  "query": "search term",
  "contextLines": 3
}
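
Conceptually, context-line search returns each match together with a few surrounding lines, much like `grep -C`. The sketch below illustrates the idea; the tool's actual output format may differ:

```javascript
// Illustrative: find lines containing `query` and include `contextLines`
// lines before and after each match (not the tool's actual implementation).
function searchWithContext(text, query, contextLines = 3) {
  const lines = text.split("\n");
  const results = [];
  lines.forEach((line, i) => {
    if (line.includes(query)) {
      const start = Math.max(0, i - contextLines);
      const end = Math.min(lines.length, i + contextLines + 1);
      results.push({
        lineNumber: i + 1, // 1-indexed line of the match
        context: lines.slice(start, end).join("\n"),
      });
    }
  });
  return results;
}
```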

save_analysis / get_analysis

Save and retrieve agent-generated analysis.

{
  "docId": "document-id",
  "sectionId": "full",
  "analysisType": "summary",
  "content": "Analysis text..."
}

V2 Agent Analysis Pattern

// 1. Ingest the document
const docId = await ingest_document_v2({
  filePath: "/path/to/book.pdf"
});

// 2. Get the hierarchical structure
const structure = await get_document_structure({ docId });

// 3. Systematic, section-by-section analysis
for (const chapter of structure.sections) {
  const content = await read_section({ docId, sectionId: chapter.id });
  const analysis = agentAnalyze(content); // placeholder for the agent's own reasoning step
  await save_analysis({ docId, sectionId: chapter.id, analysisType: "summary", content: analysis });
}

// 4. Retrieve a previously saved whole-document synthesis
//    (written earlier with save_analysis using sectionId: "full")
const fullAnalysis = await get_analysis({ docId, sectionId: "full", analysisType: "complete" });

Storage

  • V1: ~/.kw-os/documents.db (SQLite with embeddings)
  • V2: ~/.kw-os/v2/documents/{doc-id}/ (file system)
    • document.md - Full markdown
    • structure.json - Hierarchical outline
    • analysis/ - Saved analyses

Architecture

V1: RAG + RLM

  • RAG: Retrieval Augmented Generation with local Ollama
  • RLM: Recursive Language Model via Python REPL for unlimited context
  • Hybrid Search: Vector similarity + BM25 keyword search
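
Reciprocal rank fusion merges the vector and keyword result lists by scoring each document as the sum of 1/(k + rank) across lists. A minimal sketch (k = 60 is the conventional constant from the RRF literature, not necessarily what this package uses):

```javascript
// Minimal reciprocal rank fusion over ranked lists of doc IDs.
// Illustration only, not the package's actual implementation.
function reciprocalRankFusion(rankings, k = 60) {
  const scores = new Map();
  for (const ranking of rankings) {
    ranking.forEach((docId, index) => {
      const rank = index + 1; // ranks are 1-indexed
      scores.set(docId, (scores.get(docId) || 0) + 1 / (k + rank));
    });
  }
  // Highest fused score first
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([docId]) => docId);
}
```

A document ranked first in both lists outscores one ranked first in only one, which is why RRF is a robust way to combine heterogeneous retrievers without tuning score scales.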

V2: Navigation Analysis

  • File System Storage: Markdown + JSON structure
  • Hierarchical Navigation: Section-level granularity
  • Agent-Driven: AI controls reading, no brittle retrieval
  • Analysis Persistence: Incremental understanding building

Comparison

| Feature | V1 (RAG/RLM) | V2 (Navigation) |
|---------|--------------|-----------------|
| Coverage | Partial chunks | Complete document |
| Dependencies | Ollama, Python | Node.js only |
| Speed | Fast retrieval | Thorough analysis |
| Depth | Surface | Deep, recursive |
| Best For | Q&A | Complete reviews |
| External APIs | Required (Ollama) | None |

Environment Variables

V1 Only

  • OLLAMA_BASE_URL: Ollama server (default: http://localhost:11434)
  • OLLAMA_MODEL: Chat model for RAG (default: llama3)

License

MIT © netflypsb
