@code-indexer/mcp

Model Context Protocol (MCP) integration for CodeIndexer - A powerful MCP server that enables AI assistants and agents to index and search codebases using semantic search.

📖 New to CodeIndexer? Check out the main project README for an overview and setup instructions.

The Model Context Protocol (MCP) is an open protocol that standardizes how AI applications can securely connect to and interact with data sources and tools. This package provides an MCP server that exposes CodeIndexer's semantic search capabilities to any MCP-compatible client.

Features

  • 🔌 MCP Protocol Compliance: Full compatibility with MCP-enabled AI assistants and agents
  • 🔍 Semantic Code Search: Natural language queries to find relevant code snippets
  • 📁 Codebase Indexing: Index entire codebases for fast semantic search
  • 🔄 Auto-Sync: Automatically detects and synchronizes file changes to keep index up-to-date
  • 🧠 AI-Powered: Uses OpenAI embeddings and Milvus vector database
  • ⚡ Real-time: Interactive indexing and searching with progress feedback
  • 🛠️ Tool-based: Exposes three tools via the MCP protocol

Available Tools

1. index_codebase

Index a codebase directory for semantic search.

Parameters:

  • path (required): Path to the codebase directory to index
  • force (optional): Force re-indexing even if already indexed (default: false)
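
As a sketch, an MCP client would call this tool with arguments shaped like the following (the path is an example placeholder):

{
  "name": "index_codebase",
  "arguments": {
    "path": "/home/user/projects/my-app",
    "force": false
  }
}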

2. search_code

Search the indexed codebase using natural language queries.

Parameters:

  • query (required): Natural language query to search for in the codebase
  • limit (optional): Maximum number of results to return (default: 10, max: 50)
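
For example, a client request might carry arguments like these (the query text is illustrative):

{
  "name": "search_code",
  "arguments": {
    "query": "where are JWT tokens validated?",
    "limit": 5
  }
}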

3. clear_index

Clear the search index.

Parameters:

  • confirm (required): Confirmation flag to prevent accidental clearing

Installation

npm install @code-indexer/mcp

Or run directly with npx:

npx @code-indexer/mcp@latest

Quick Start

Prerequisites

Before using the MCP server, make sure you have:

  • An API key for your chosen embedding provider (OpenAI, VoyageAI, or Gemini), or a local Ollama setup
  • A Milvus vector database (local or cloud)

💡 Setup Help: See the main project setup guide for detailed installation instructions.

Prepare Environment Variables

Embedding Provider Configuration

CodeIndexer MCP supports multiple embedding providers. Choose the one that best fits your needs:

# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI

1. OpenAI Configuration (Default)

OpenAI provides high-quality embeddings with excellent performance for code understanding.

# Required: Your OpenAI API key
OPENAI_API_KEY=sk-your-openai-api-key

# Optional: Specify embedding model (default: text-embedding-3-small)
EMBEDDING_MODEL=text-embedding-3-small

# Optional: Custom API base URL (for Azure OpenAI or other compatible services)
OPENAI_BASE_URL=https://api.openai.com/v1

Available Models:

  • text-embedding-3-small (1536 dimensions, faster, lower cost)
  • text-embedding-3-large (3072 dimensions, higher quality)
  • text-embedding-ada-002 (1536 dimensions, legacy model)

Getting API Key:

  1. Visit OpenAI Platform
  2. Sign in or create an account
  3. Generate a new API key
  4. Set up billing if needed

2. VoyageAI Configuration

VoyageAI offers specialized code embeddings optimized for programming languages.

# Required: Your VoyageAI API key
VOYAGEAI_API_KEY=pa-your-voyageai-api-key

# Optional: Specify embedding model (default: voyage-code-3)
EMBEDDING_MODEL=voyage-code-3

Available Models:

  • voyage-code-3 (1024 dimensions, optimized for code)
  • voyage-3 (1024 dimensions, general purpose)
  • voyage-3-lite (512 dimensions, faster inference)

Getting API Key:

  1. Visit VoyageAI Console
  2. Sign up for an account
  3. Navigate to API Keys section
  4. Create a new API key

3. Gemini Configuration

Google's Gemini provides competitive embeddings with good multilingual support.

# Required: Your Gemini API key
GEMINI_API_KEY=your-gemini-api-key

# Optional: Specify embedding model (default: gemini-embedding-001)
EMBEDDING_MODEL=gemini-embedding-001

Available Models:

  • gemini-embedding-001 (3072 dimensions, latest model)

Getting API Key:

  1. Visit Google AI Studio
  2. Sign in with your Google account
  3. Go to "Get API key" section
  4. Create a new API key

4. Ollama Configuration (Local/Self-hosted)

Ollama allows you to run embeddings locally without sending data to external services.

# Required: Specify which Ollama model to use
EMBEDDING_MODEL=nomic-embed-text

# Optional: Specify Ollama host (default: http://127.0.0.1:11434)
OLLAMA_HOST=http://127.0.0.1:11434

Available Models:

  • nomic-embed-text (768 dimensions, recommended for code)
  • mxbai-embed-large (1024 dimensions, higher quality)
  • all-minilm (384 dimensions, lightweight)

Setup Instructions:

  1. Install Ollama from ollama.ai
  2. Pull the embedding model:
    ollama pull nomic-embed-text
  3. Ensure Ollama is running:
    ollama serve
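
To sanity-check the local setup before wiring up the MCP server, you can query Ollama's embeddings endpoint directly (a quick smoke test; the prompt text is arbitrary):

curl http://127.0.0.1:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "hello world"
}'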

Milvus Configuration

Zilliz Cloud is a fully managed Milvus vector database service, and you can use it for free.

  • MILVUS_ADDRESS is the Public Endpoint of your Zilliz Cloud instance.
  • MILVUS_TOKEN is the token of your Zilliz Cloud instance.

# Required: Milvus address
MILVUS_ADDRESS=https://xxx-xxxxxxxxxxxx.serverless.gcp-us-west1.cloud.zilliz.com

# Required for Zilliz Cloud: Milvus token
MILVUS_TOKEN=xxxxxxx

Optional: you can also self-host Milvus. See the Milvus documentation for installation details.
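
For a self-hosted instance, the address is typically just host and port, matching the client configurations later in this README (19530 is Milvus's default port):

# Example: self-hosted Milvus standalone on the default port
MILVUS_ADDRESS=localhost:19530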

Embedding Batch Size

You can set the embedding batch size to tune the MCP server's indexing performance to your embedding provider's throughput. The default value is 100.

EMBEDDING_BATCH_SIZE=512

Usage with MCP Clients

Cursor

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server

The recommended approach is to paste the following configuration into your Cursor ~/.cursor/mcp.json file. You can also install the server in a specific project by creating .cursor/mcp.json in your project folder. See the Cursor MCP docs for more info.

OpenAI Configuration (Default):

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "OpenAI",
        "OPENAI_API_KEY": "your-openai-api-key",
        "OPENAI_BASE_URL": "https://your-custom-endpoint.com/v1",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

VoyageAI Configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "VoyageAI",
        "VOYAGEAI_API_KEY": "your-voyageai-api-key",
        "EMBEDDING_MODEL": "voyage-code-3",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

Gemini Configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Gemini",
        "GEMINI_API_KEY": "your-gemini-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

Ollama Configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Ollama",
        "EMBEDDING_MODEL": "nomic-embed-text",
        "OLLAMA_HOST": "http://127.0.0.1:11434",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

Claude Desktop

Add to your Claude Desktop configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

Claude Code

Use the command line interface to add the CodeIndexer MCP server:

# Add the CodeIndexer MCP server
claude mcp add code-indexer -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_ADDRESS=localhost:19530 -- npx @code-indexer/mcp@latest

See the Claude Code MCP documentation for more details about MCP server management.

Windsurf

Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

VS Code

The CodeIndexer MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["-y", "@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}

Cherry Studio

Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:

  1. Navigate to Settings → MCP Servers → Add Server.
  2. Fill in the server details:
    • Name: code-indexer
    • Type: STDIO
    • Command: npx
    • Arguments: ["@code-indexer/mcp@latest"]
    • Environment Variables:
      • OPENAI_API_KEY: your-openai-api-key
      • MILVUS_ADDRESS: localhost:19530
  3. Save the configuration to activate the server.

Cline

Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:

  1. Open Cline and click on the MCP Servers icon in the top navigation bar.

  2. Select the Installed tab, then click Advanced MCP Settings.

  3. In the cline_mcp_settings.json file, add the following configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}
  4. Save the file.

Augment Code

To configure the CodeIndexer MCP server in Augment Code, use either the graphical interface or manual configuration.

A. Using the Augment Code UI

  1. Click the hamburger menu.

  2. Select Settings.

  3. Navigate to the Tools section.

  4. Click the + Add MCP button.

  5. Enter the following command:

    npx @code-indexer/mcp@latest
  6. Name the MCP: Code Indexer.

  7. Click the Add button.


B. Manual Configuration

  1. Press Cmd/Ctrl+Shift+P or open the hamburger menu in the Augment panel
  2. Select Edit Settings
  3. Under Advanced, click Edit in settings.json
  4. Add the server configuration to the mcpServers array in the augment.advanced object:

"augment.advanced": { 
  "mcpServers": [ 
    { 
      "name": "code-indexer", 
      "command": "npx", 
      "args": ["-y", "@code-indexer/mcp@latest"] 
    } 
  ] 
}

Gemini CLI

Gemini CLI requires manual configuration through a JSON file:

  1. Create or edit the ~/.gemini/settings.json file.

  2. Add the following configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}
  3. Save the file and restart Gemini CLI to apply the changes.

Roo Code

Roo Code uses a JSON configuration file for MCP servers:

  1. Open Roo Code and navigate to Settings → MCP Servers → Edit Global Config.

  2. In the mcp_settings.json file, add the following configuration:

{
  "mcpServers": {
    "code-indexer": {
      "command": "npx",
      "args": ["@code-indexer/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "localhost:19530"
      }
    }
  }
}
  3. Save the file to activate the server.

Other MCP Clients

The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:

npx @code-indexer/mcp@latest
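
Under the hood, communication is JSON-RPC 2.0 over stdin/stdout. After the standard MCP initialize handshake, a client invokes a tool with a tools/call request along these lines (the id and query are illustrative):

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_code",
    "arguments": { "query": "database connection setup", "limit": 3 }
  }
}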

Contributing

This package is part of the CodeIndexer monorepo. Please see the main project repository for contribution guidelines.

Related Projects

License

MIT - See LICENSE for details