@zilliz/claude-context-mcp
v0.1.13
Model Context Protocol integration for Claude Context
Model Context Protocol (MCP) integration for Claude Context - A powerful MCP server that enables AI assistants and agents to index and search codebases using semantic search.
📖 New to Claude Context? Check out the main project README for an overview and setup instructions.
🚀 Use Claude Context as MCP in Claude Code and others
Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Quick Start
Prerequisites
Before using the MCP server, make sure you have:
- API key for your chosen embedding provider (OpenAI, VoyageAI, Gemini, or Ollama setup)
- Milvus vector database (local or cloud)
💡 Setup Help: See the main project setup guide for detailed installation instructions.
Prepare Environment Variables
Embedding Provider Configuration
Claude Context MCP supports multiple embedding providers. Choose the one that best fits your needs:
📋 Quick Reference: For a complete list of environment variables and their descriptions, see the Environment Variables Guide.
# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI

OpenAI provides high-quality embeddings with excellent performance for code understanding.
# Required: Your OpenAI API key
OPENAI_API_KEY=sk-your-openai-api-key
# Optional: Specify embedding model (default: text-embedding-3-small)
EMBEDDING_MODEL=text-embedding-3-small
# Optional: Custom API base URL (for Azure OpenAI or other compatible services)
OPENAI_BASE_URL=https://api.openai.com/v1

Available Models:
See getSupportedModels in openai-embedding.ts for the full list of supported models.
Getting API Key:
- Visit OpenAI Platform
- Sign in or create an account
- Generate a new API key
- Set up billing if needed
VoyageAI offers specialized code embeddings optimized for programming languages.
# Required: Your VoyageAI API key
VOYAGEAI_API_KEY=pa-your-voyageai-api-key
# Optional: Specify embedding model (default: voyage-code-3)
EMBEDDING_MODEL=voyage-code-3

Available Models:
See getSupportedModels in voyageai-embedding.ts for the full list of supported models.
Getting API Key:
- Visit VoyageAI Console
- Sign up for an account
- Navigate to API Keys section
- Create a new API key
Google's Gemini provides competitive embeddings with good multilingual support.
# Required: Your Gemini API key
GEMINI_API_KEY=your-gemini-api-key
# Optional: Specify embedding model (default: gemini-embedding-001; supports gemini-embedding-2)
EMBEDDING_MODEL=gemini-embedding-001
# Optional: Custom API base URL (for custom endpoints)
GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta

Available Models:
See getSupportedModels in gemini-embedding.ts for the full list of supported models.
Getting API Key:
- Visit Google AI Studio
- Sign in with your Google account
- Go to "Get API key" section
- Create a new API key
Ollama allows you to run embeddings locally without sending data to external services.
# Required: Specify which Ollama model to use
EMBEDDING_MODEL=nomic-embed-text
# Optional: Specify Ollama host (default: http://127.0.0.1:11434)
OLLAMA_HOST=http://127.0.0.1:11434
# Optional: Override embedding dimension to skip runtime dimension detection
EMBEDDING_DIMENSION=768

Setup Instructions:
Install Ollama from ollama.com
Pull the embedding model:
ollama pull nomic-embed-text

Ensure Ollama is running:
ollama serve
Get a free vector database on Zilliz Cloud
Claude Context needs a vector database. You can sign up on Zilliz Cloud to get an API key.

Copy your Personal Key to replace your-zilliz-cloud-api-key in the configuration examples.
MILVUS_TOKEN=your-zilliz-cloud-api-key
# Optional: increase timeout for Milvus collection-limit pre-check on slow clusters (default: 15000)
MILVUS_COLLECTION_LIMIT_CHECK_TIMEOUT_MS=30000

Embedding Batch Size
You can set the embedding batch size to optimize the performance of the MCP server, depending on your embedding model throughput. The default value is 100.
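As a rough illustration of what the batch size controls (the actual batching lives in the core package), the number of embedding API requests per indexing pass is the chunk count divided by the batch size, rounded up:

```shell
# Illustration only: with EMBEDDING_BATCH_SIZE=100, a codebase that
# produces 512 chunks is embedded in ceil(512 / 100) = 6 API requests.
chunks=512
batch=100
echo $(( (chunks + batch - 1) / batch ))
# → 6
```

Larger batches mean fewer round trips, at the cost of bigger individual requests; tune it to your provider's rate and payload limits.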
EMBEDDING_BATCH_SIZE=512

Custom File Processing (Optional)
You can configure custom file extensions and ignore patterns globally via environment variables:
# Additional file extensions to include beyond defaults
CUSTOM_EXTENSIONS=.vue,.svelte,.astro,.twig
# Additional ignore patterns to exclude files/directories
CUSTOM_IGNORE_PATTERNS=temp/**,*.backup,private/**,uploads/**

These settings work in combination with tool parameters: patterns from both sources are merged together.
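A sketch of the merge behavior (the real merge happens inside the server; this just shows that the two sources form a union, with the pattern values as illustrative placeholders):

```shell
# Env-provided and tool-provided ignore patterns are merged into one set.
ENV_PATTERNS="temp/**,*.backup"     # from CUSTOM_IGNORE_PATTERNS
TOOL_PATTERNS="dist/**,*.backup"    # from the ignorePatterns tool parameter
printf '%s,%s' "$ENV_PATTERNS" "$TOOL_PATTERNS" | tr ',' '\n' | LC_ALL=C sort -u
```

Duplicates such as `*.backup` above appear once in the merged set; neither source overrides the other.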
Custom Collection Name (Optional)
Use this when you want a human-readable prefix on collection names in Milvus/Zilliz instead of the bare hash:
# Creates code_chunks_my_project_<pathHash> or hybrid_code_chunks_my_project_<pathHash>
CODE_CHUNKS_COLLECTION_NAME_OVERRIDE=my_project

The per-codebase <pathHash> suffix is preserved even when the override is set, so the same MCP server can still index multiple repos without collapsing them onto one collection. The override value is sanitized to letters, numbers, and underscores, and truncated to keep the full name within Milvus's 255-character limit. If you unset the variable later, Claude Context switches back to the plain code_chunks_<pathHash> naming.
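The sanitization rule can be approximated in shell (an illustration only; the actual logic, including the length truncation, lives in the core package):

```shell
# Keep letters, digits, and underscores; every other character becomes '_'.
sanitize() { printf '%s' "$1" | tr -c 'a-zA-Z0-9_' '_'; }
echo "hybrid_code_chunks_$(sanitize 'my-project 2.0')_<pathHash>"
# → hybrid_code_chunks_my_project_2_0_<pathHash>
```

So an override like `my-project 2.0` yields the same collection prefix as setting `my_project_2_0` directly.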
Trigger File Watcher (Optional)
In addition to the periodic background sync, the MCP server watches a sentinel file at ~/.context/.sync-trigger and starts an immediate re-index whenever the file is modified. This lets external tools (Claude Code PostToolUse hooks, editor save hooks, CI scripts, etc.) request a sync on demand instead of waiting for the next polling tick.
# Default: watcher enabled. Set to false to disable filesystem watching entirely
# (useful on read-only filesystems or sandboxed environments).
CLAUDE_CONTEXT_TRIGGER_WATCHER=true

Example: Claude Code hook that re-indexes after every Edit/Write:
"hooks": {
"PostToolUse": [
{ "matcher": "Edit|Write", "hooks": [
{ "type": "command", "command": "touch ~/.context/.sync-trigger" }
]}
]
}

Notes:
- The trigger fires a debounced re-index (2 s window) so rapid touches collapse to a single sync.
- Triggered syncs go through the same global cross-process lock as background sync, so when multiple MCP processes share $HOME, only one process performs the work per trigger.
- The trigger file's contents are ignored — only the modification event matters.
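Any external tool can request a sync the same way the Claude Code hook above does: by touching the sentinel file. A self-contained sketch (HOME is redirected to a temp directory here so the example doesn't touch your real ~/.context; drop that line in real use):

```shell
# Create and touch the sentinel file the watcher looks for; modifying it
# triggers a debounced re-index in the running MCP server.
HOME="$(mktemp -d)"            # sandboxed for the example only
mkdir -p "$HOME/.context"
touch "$HOME/.context/.sync-trigger"
ls "$HOME/.context"
# → .sync-trigger
```

The same one-liner (`touch ~/.context/.sync-trigger`) works from git hooks, CI scripts, or editor save hooks.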
Background Sync Configuration (Optional)
By default, the MCP server runs startup + periodic background sync for compatibility with existing installations. The global cross-process sync lock ensures only one local MCP process performs a sync cycle at a time.
You can tune or disable periodic polling with environment variables:
# Default: true. Set to false to disable startup + periodic polling.
CLAUDE_CONTEXT_BACKGROUND_SYNC=false
# Optional: control how often sync runs (default: 300000 = 5 minutes)
CLAUDE_CONTEXT_SYNC_INTERVAL_MS=60000

For multi-instance local stdio setups, set CLAUDE_CONTEXT_BACKGROUND_SYNC=false and keep the trigger watcher enabled. That avoids idle polling while still allowing external tools to request immediate re-indexing by touching ~/.context/.sync-trigger.
Usage with MCP Clients
Claude Code: Use the command-line interface to add the Claude Context MCP server:
# Add the Claude Context MCP server
claude mcp add claude-context -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_ADDRESS=your-zilliz-cloud-public-endpoint -e MILVUS_TOKEN=your-zilliz-cloud-api-key -- npx @zilliz/claude-context-mcp@latest
See the Claude Code MCP documentation for more details about MCP server management.
Codex CLI uses TOML configuration files:
Create or edit the ~/.codex/config.toml file and add the following configuration:
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.claude-context]
command = "npx"
args = ["@zilliz/claude-context-mcp@latest"]
env = { "OPENAI_API_KEY" = "your-openai-api-key", "MILVUS_TOKEN" = "your-zilliz-cloud-api-key" }
# Optional: override the default 10s startup timeout
startup_timeout_ms = 20000

Save the file and restart Codex CLI to apply the changes.
Gemini CLI requires manual configuration through a JSON file:
Create or edit the ~/.gemini/settings.json file and add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Save the file and restart Gemini CLI to apply the changes.
Qwen Code: Create or edit the ~/.qwen/settings.json file and add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server.
Pasting the following configuration into your Cursor ~/.cursor/mcp.json file is the recommended approach. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.
OpenAI Configuration (Default):
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "OpenAI",
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

VoyageAI Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "VoyageAI",
"VOYAGEAI_API_KEY": "your-voyageai-api-key",
"EMBEDDING_MODEL": "voyage-code-3",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Gemini Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Gemini",
"GEMINI_API_KEY": "your-gemini-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Ollama Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Ollama",
"EMBEDDING_MODEL": "nomic-embed-text",
"OLLAMA_HOST": "http://127.0.0.1:11434",
"EMBEDDING_DIMENSION": "768",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Go to: Settings -> MCP -> Add MCP Server.
Add the following configuration to your Void MCP settings:
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Add to your Claude Desktop configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

The Claude Context MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:
- Navigate to Settings → MCP Servers → Add Server.
- Fill in the server details:
  - Name: claude-context
  - Type: STDIO
  - Command: npx
  - Arguments: ["-y", "@zilliz/claude-context-mcp@latest"]
  - Environment Variables:
    - OPENAI_API_KEY: your-openai-api-key
    - MILVUS_TOKEN: your-zilliz-cloud-api-key
- Save the configuration to activate the server.
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
Open Cline and click on the MCP Servers icon in the top navigation bar.
Select the Installed tab, then click Advanced MCP Settings.
In the cline_mcp_settings.json file, add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Save the file.
To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
A. Using the Augment Code UI
Click the hamburger menu.
Select Settings.
Navigate to the Tools section.
Click the + Add MCP button.
Enter the following command:
npx @zilliz/claude-context-mcp@latest

Name the MCP: Claude Context.
Click the Add button.
B. Manual Configuration
- Press Cmd/Ctrl+Shift+P or go to the hamburger menu in the Augment panel
- Select Edit Settings
- Under Advanced, click Edit in settings.json
- Add the server configuration to the mcpServers array in the augment.advanced object
"augment.advanced": {
"mcpServers": [
{
"name": "claude-context",
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"]
}
]
}

Roo Code utilizes a JSON configuration file for MCP servers:
Open Roo Code and navigate to Settings → MCP Servers → Edit Global Config.
In the mcp_settings.json file, add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}

Save the file to activate the server.
Zencoder offers support for MCP tools and servers in both its JetBrains and VS Code plugin versions.
- Go to the Zencoder menu (...)
- From the dropdown menu, select Tools
- Click on Add Custom MCP
- Add the name (e.g. Claude Context) and the server configuration from below, and make sure to hit the Install button
{
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
- Save the server by hitting the Install button.
For LangChain/LangGraph integration examples, see this example.
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
npx @zilliz/claude-context-mcp@latest

Features
- 🔌 MCP Protocol Compliance: Full compatibility with MCP-enabled AI assistants and agents
- 🔍 Hybrid Code Search: Natural language queries using advanced hybrid search (BM25 + dense vector) to find relevant code snippets
- 📁 Codebase Indexing: Index entire codebases for fast hybrid search across millions of lines of code
- 🔄 Incremental Indexing: Efficiently re-index only changed files using Merkle trees for auto-sync
- 🧩 Intelligent Code Chunking: AST-based code analysis for syntax-aware chunking with automatic fallback
- 🗄️ Scalable: Integrates with Zilliz Cloud for scalable vector search, no matter how large your codebase is
- 🛠️ Customizable: Configure file extensions, ignore patterns, and embedding models
- ⚡ Real-time: Interactive indexing and searching with progress feedback
Available Tools
1. index_codebase
Index a codebase directory for hybrid search (BM25 + dense vector).
Parameters:
- path (required): Absolute path to the codebase directory to index
- force (optional): Force re-indexing even if already indexed (default: false)
- splitter (optional): Code splitter to use - 'ast' for syntax-aware splitting with automatic fallback, 'langchain' for character-based splitting (default: "ast")
- customExtensions (optional): Additional file extensions to include beyond defaults (e.g., ['.vue', '.svelte', '.astro']). Extensions should include the dot prefix; if omitted, it is added automatically (default: [])
- ignorePatterns (optional): Additional ignore patterns to exclude specific files/directories beyond defaults (e.g., ['static/', '*.tmp', 'private/']) (default: [])
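When invoked through an MCP client, the parameters above travel in a JSON-RPC 2.0 tools/call message over the server's stdio transport. A sketch (the path and argument values are illustrative placeholders):

```shell
# Print the JSON-RPC message an MCP client would send to invoke index_codebase.
cat <<'EOF'
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"index_codebase","arguments":{"path":"/abs/path/to/repo","splitter":"ast","customExtensions":[".vue"]}}}
EOF
```

MCP clients construct this message for you; it is shown here only to make the parameter shape concrete.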
2. search_code
Search the indexed codebase using natural language queries with hybrid search (BM25 + dense vector).
Parameters:
- path (required): Absolute path to the codebase directory to search in
- query (required): Natural language query to search for in the codebase
- limit (optional): Maximum number of results to return (default: 10, max: 50)
- extensionFilter (optional): List of file extensions to filter results (e.g., ['.ts', '.py']) (default: [])
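The corresponding tools/call message for a search looks like this (a sketch with placeholder values; your MCP client builds it automatically):

```shell
# Print the JSON-RPC message an MCP client would send to invoke search_code.
cat <<'EOF'
{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"search_code","arguments":{"path":"/abs/path/to/repo","query":"where is the retry logic?","limit":5}}}
EOF
```

Note that path must match the absolute path used when indexing, since codebases are keyed by absolute path.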
3. clear_index
Clear the search index for a specific codebase.
Parameters:
- path (required): Absolute path to the codebase directory to clear the index for
4. get_indexing_status
Get the current indexing status of a codebase. Shows progress percentage for actively indexing codebases and completion status for indexed codebases.
Parameters:
- path (required): Absolute path to the codebase directory to check status for
What the status output means:
- Progress is phase-based, not a direct file-count ratio. The MCP server reports coarse milestones for collection preparation, file scanning, and file processing / embedding work.
- Because indexing runs in the background and progress is persisted periodically, percentages can jump quickly on large repositories or appear unchanged for a while during long embedding batches.
- File and chunk statistics are written when an indexing run finishes successfully. During active indexing, get_indexing_status intentionally reports progress rather than live file/chunk totals.
- Codebases are keyed by their absolute path. Indexing /repo, a symlinked path to the same repo, and a second clone will create separate tracked entries.
- If a completed entry shows 0 files, 0 chunks, that usually means the local snapshot metadata is stale rather than the vector database being queried live. Re-indexing, or clearing and re-indexing that exact absolute path, refreshes the stored stats.
For a deeper explanation, see the asynchronous indexing workflow guide and the troubleshooting FAQ.
Contributing
This package is part of the Claude Context monorepo. Please see:
- Main Contributing Guide - General contribution guidelines
- MCP Package Contributing - Specific development guide for this package
Related Projects
- @zilliz/claude-context-core - Core indexing engine used by this MCP server
- VSCode Extension - Alternative VSCode integration
- Model Context Protocol - Official MCP documentation
License
MIT - See LICENSE for details
