
local-context-mcp

v0.1.9

Local semantic code search via MCP - works offline, indexes current directory

local-context-mcp

A fully local-first semantic code search engine exposed via MCP (Model Context Protocol).

Features

  • Local-first: Works entirely offline using USearch as the vector database
  • Zero-config: Automatically targets the current working directory
  • Submodule-aware: When working inside a git submodule, automatically indexes from the meta-repo root
  • MCP-based: Integrates with Claude Code, OpenCode, and other MCP-compatible tools
  • Flexible embeddings: Supports Ollama (default), OpenAI, Gemini, and VoyageAI
  • Fast: In-memory vector search with optional persistence

Quick Start

npx local-context-mcp

This indexes the current directory and starts the MCP server. First run may take a few minutes depending on codebase size.

Command Line Options

local-context-mcp --path /path/to/index    # Directory to index
local-context-mcp --watch                   # Watch mode: auto-reindex on file changes
local-context-mcp --help                    # Show help

Or set environment variable:

LOCAL_CONTEXT_PATH=/path/to/index npx local-context-mcp

Watch Mode

Use --watch to automatically reindex changed files in real-time:

local-context-mcp --watch --path /path/to/project

The MCP server starts immediately and the initial index runs in the background. After that, any file changes trigger an incremental reindex within 2 seconds. This is the recommended mode for active development sessions.

In watch mode:

  • Initial index runs in background if no index exists
  • Incremental updates only re-embed changed files (fast)
  • Debounced at 2 seconds to batch rapid saves from editors
  • Supports file additions, modifications, and deletions
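The batching behavior above can be sketched as a small debouncer. Names such as `makeScheduler` are illustrative, not part of the package's API:

```typescript
// Sketch of watch-mode batching: rapid change events are collected and
// flushed as one incremental reindex after a quiet period.
const DEBOUNCE_MS = 2000; // matches the 2-second window described above

function makeScheduler(
  reindex: (files: string[]) => void,
  debounceMs: number = DEBOUNCE_MS,
) {
  const pending = new Set<string>(); // files changed since the last flush
  let timer: ReturnType<typeof setTimeout> | undefined;

  return function onFileChange(path: string): void {
    pending.add(path);
    if (timer !== undefined) clearTimeout(timer); // restart the quiet period
    timer = setTimeout(() => {
      const files = [...pending];
      pending.clear();
      reindex(files); // only these files are re-embedded
    }, debounceMs);
  };
}
```

A real watcher would also distinguish additions, modifications, and deletions; this sketch shows only the debounce.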

MCP Integration

Claude Code / Claude CLI

claude mcp add local-context -- npx -y local-context-mcp

Or if installed globally:

claude mcp add local-context -- local-context-mcp

OpenCode

Add to your OpenCode settings (~/.opencode/settings.json or project config):

{
  "mcpServers": {
    "local-context": {
      "command": "npx",
      "args": ["-y", "local-context-mcp", "--watch"],
      "env": {
        "LOCAL_CONTEXT_PATH": "${workspaceFolder}"
      }
    }
  }
}

Or using the global binary (if installed):

{
  "mcpServers": {
    "local-context": {
      "command": "local-context-mcp"
    }
  }
}

Cursor

  1. Open Cursor Settings (⌘, or Ctrl+,)
  2. Go to Extensions → MCP
  3. Click Add new MCP server
  4. Configure:
    • Name: local-context
    • Command: npx
    • Arguments: -y local-context-mcp --watch
    • Environment variables (optional): LOCAL_CONTEXT_PATH=/path/to/your/project

Or add to cursor settings JSON:

{
  "mcpServers": {
    "local-context": {
      "command": "npx",
      "args": ["-y", "local-context-mcp", "--watch"],
      "env": {
        "LOCAL_CONTEXT_PATH": "${workspaceFolder}"
      }
    }
  }
}

Windsurf (Codeium)

  1. Open Windsurf Settings
  2. Go to Extensions → MCP
  3. Add new server with:
    • Command: npx
    • Arguments: -y local-context-mcp --watch

Other MCP Clients

For other MCP-compatible tools, add the server with:

  • npx: npx -y local-context-mcp
  • Global: local-context-mcp
  • Docker: docker run local-context-mcp

Set LOCAL_CONTEXT_PATH environment variable to specify which directory to index.

Git Submodules

When running from inside a git submodule, the tool automatically detects the submodule and resolves to the superproject (meta-repo) root for indexing and searching. This means:

  • All code in the meta-repo is indexed, including all submodules
  • Search results span the entire project, not just the current submodule
  • No configuration needed — detection is automatic using git rev-parse --show-superproject-working-tree
  • Falls back gracefully if not in a git repo or git version is too old
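The resolution rule above can be sketched with a single `git` invocation; `resolveIndexRoot` is an illustrative name, and the package's actual implementation may differ:

```typescript
import { execSync } from "node:child_process";

// Resolve the directory to index: the superproject root when inside a
// git submodule, otherwise the directory itself.
function resolveIndexRoot(cwd: string): string {
  try {
    // Prints the superproject working tree when cwd is inside a submodule,
    // and prints nothing otherwise (requires git >= 2.13).
    const out = execSync("git rev-parse --show-superproject-working-tree", {
      cwd,
      stdio: ["ignore", "pipe", "ignore"],
    })
      .toString()
      .trim();
    return out.length > 0 ? out : cwd;
  } catch {
    // Not a git repo, or git is missing/too old: fall back to cwd.
    return cwd;
  }
}
```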

Installation

npm (Recommended)

npm install -g local-context-mcp

Or use directly with npx:

npx local-context-mcp

Build from Source

npm install
npm run build
npm link  # Links globally for CLI use

Tools

| Tool | Description |
|---------|-------------|
| reindex | Index the current codebase for semantic search. Use after adding/changing files. |
| search | Search the indexed codebase using natural language. Returns code implementations first. |
| status | Get current indexing status (file count, chunk count). |

Search Example

> search: "async insert function"

Returns relevant code snippets with file paths and line numbers. Results prioritize code implementations over documentation.
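Over the wire, an MCP client invokes these tools with a standard `tools/call` request. The sketch below assumes the search tool takes a `query` argument; the actual schema is documented in docs/mcp.md:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "async insert function" }
  }
}
```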

How It Works

Files → AST Parser → Chunks → Embeddings → USearch → Retrieval
  1. File Discovery: Scans directory for code files (respects .gitignore and .contextignore)
  2. Chunking: Splits code into semantic chunks using tree-sitter AST parsing
  3. Embedding: Generates vector embeddings for each chunk
  4. Indexing: Stores vectors in USearch for fast similarity search
  5. Retrieval: Finds most relevant chunks using hybrid scoring:
    • 50% semantic similarity (embedding match)
    • 25% keyword matching (query identifiers in code)
    • 15% file type (code files prioritized over docs)
    • 10% chunk type (functions/methods prioritized)
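The weighted sum above can be written out directly, assuming each component score is already normalized to [0, 1] (an assumption; the real implementation's inputs may differ):

```typescript
// Hybrid retrieval score using the weights listed above.
interface ChunkScores {
  semantic: number;  // embedding similarity to the query
  keyword: number;   // fraction of query identifiers found in the code
  fileType: number;  // 1 for code files, lower for docs
  chunkType: number; // 1 for functions/methods, lower otherwise
}

function hybridScore(s: ChunkScores): number {
  return (
    0.5 * s.semantic +
    0.25 * s.keyword +
    0.15 * s.fileType +
    0.1 * s.chunkType
  );
}
```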

Storage

Index files are created in the current working directory:

usearch_index_<collection>.usearch  # Vector index
usearch_meta_<collection>.json      # Document metadata
usearch_coll_<collection>.json      # Collection metadata

Collection name is derived from the directory path hash.
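The exact hash is not documented; as an illustration only, the derivation might look like a truncated SHA-256 of the absolute path (both the algorithm and the truncation length here are assumptions):

```typescript
import { createHash } from "node:crypto";
import { resolve } from "node:path";

// Hypothetical sketch: derive a stable collection name from a directory path.
function collectionName(dir: string): string {
  const abs = resolve(dir); // normalize so "/tmp" and "/tmp/" agree
  return createHash("sha256").update(abs).digest("hex").slice(0, 12);
}
```

The point is only that the name is deterministic per directory, so reindexing the same path reuses the same collection files.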

Supported Languages

TypeScript, JavaScript, Python, Java, C/C++, C#, Go, Rust, PHP, Ruby, Swift, Kotlin, Scala, Objective-C, Markdown, Jupyter

Environment Variables

Embedding Provider Selection

Providers are checked in this order. Set any API key to use that provider:

| Variable | Provider | Default Model |
|----------|----------|---------------|
| OLLAMA_BASE_URL | Ollama | nomic-embed-text |
| OPENAI_API_KEY | OpenAI | text-embedding-3-small |
| GEMINI_API_KEY | Gemini | text-embedding-004 |
| VOYAGE_API_KEY | VoyageAI | voyage-3 |
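The precedence can be sketched as a first-match check over the environment; `pickProvider` is an illustrative name, not the package's API:

```typescript
// First provider whose environment variable is set wins; Ollama is the default.
interface ProviderChoice {
  provider: string;
  model: string;
}

function pickProvider(env: Record<string, string | undefined>): ProviderChoice {
  if (env.OLLAMA_BASE_URL)
    return { provider: "ollama", model: env.OLLAMA_MODEL ?? "nomic-embed-text" };
  if (env.OPENAI_API_KEY)
    return { provider: "openai", model: env.OPENAI_EMBEDDING_MODEL ?? "text-embedding-3-small" };
  if (env.GEMINI_API_KEY)
    return { provider: "gemini", model: env.GEMINI_EMBEDDING_MODEL ?? "text-embedding-004" };
  if (env.VOYAGE_API_KEY)
    return { provider: "voyageai", model: env.VOYAGE_EMBEDDING_MODEL ?? "voyage-3" };
  return { provider: "ollama", model: "nomic-embed-text" }; // nothing set: default
}
```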

Ollama Configuration

OLLAMA_BASE_URL=http://localhost:11434    # Default: http://127.0.0.1:11434
OLLAMA_MODEL=nomic-embed-text              # Default: nomic-embed-text

OpenAI Configuration

OPENAI_API_KEY=sk-...
OPENAI_EMBEDDING_MODEL=text-embedding-3-small  # Default
OPENAI_BASE_URL=https://api.openai.com/v1        # Optional: for proxies

Gemini Configuration

GEMINI_API_KEY=...
GEMINI_EMBEDDING_MODEL=text-embedding-004  # Default

VoyageAI Configuration

VOYAGE_API_KEY=...
VOYAGE_EMBEDDING_MODEL=voyage-3  # Default

Ignore Patterns

Files matching these patterns are excluded from indexing:

  • node_modules/**, dist/**, build/**, out/**
  • .git/**, .svn/**, .hg/**
  • *.log, *.min.js, *.min.css, .env
  • *.map, *.bundle.js, *.chunk.js

Additionally respects .gitignore and .contextignore in the project root.
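Assuming `.contextignore` uses gitignore-style patterns (the parallel above suggests it, but the syntax is not spelled out here), a file excluding generated fixtures and vendored code might look like:

```
# .contextignore — gitignore-style patterns excluded from indexing
# (file contents below are an example, not defaults)
fixtures/generated/
vendor/
*.snap
```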

Architecture

See docs/architecture.md for system design details.

MCP Protocol

See docs/mcp.md for tool schemas and protocol details.

Building from Source

npm install
npm run build

Run directly:

npx tsx src/index.ts

Security Note

This package connects to localhost only (127.0.0.1:11434) when using the default Ollama embedding provider. No external network connections are made unless you configure a remote embedding provider (OpenAI, Gemini, or VoyageAI).

The npm audit warning about "supply chain risk" for http://127.0.0.1:11434 is a false positive: that URL is the local Ollama server endpoint, not an external dependency.

License

MIT License. See LICENSE for details.


Based on claude-context by Zilliz.