treeseek
v0.3.4
treeseek
MCP server for semantic code search. Indexes a codebase using AST-aware chunking, generates embeddings via Ollama, and stores them in a local LanceDB vector database — 100% offline, no APIs or external services required.
Documentation
https://treeseek.sorianox2013.workers.dev/
Prerequisites
- Node.js 20 or 22 (do NOT use Node 24; it is incompatible with the project's core dependencies)
- Ollama running locally (https://ollama.com): ollama serve
- Embedding model pulled: ollama pull nomic-embed-text
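Before indexing, it can help to confirm both prerequisites are actually in place. A minimal sketch, assuming the default host (`/api/tags` is Ollama's model-listing endpoint):

```shell
# Check that Ollama is reachable and the embedding model is pulled.
# HOST falls back to the documented default when OLLAMA_HOST is unset.
HOST="${OLLAMA_HOST:-http://localhost:11434}"
if curl -fsS "$HOST/api/tags" 2>/dev/null | grep -q "nomic-embed-text"; then
  STATUS="ready"
else
  STATUS="not ready (is 'ollama serve' running? was the model pulled?)"
fi
echo "treeseek prerequisites: $STATUS"
```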
Environment Variables
| Variable | Default | Description |
|---|---|---|
| OLLAMA_HOST | http://localhost:11434 | Ollama API endpoint |
| EMBED_MODEL | nomic-embed-text | Ollama embedding model name |
| LANCEDB_PATH | ~/.codebase-indexer | Base directory for LanceDB storage |
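Each variable in the table falls back to its default when unset, so a quick way to see the effective configuration is standard shell-style substitution (a sketch; variable names are taken from the table above):

```shell
# Effective treeseek configuration: each variable resolves to its
# documented default when not set in the environment.
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"
EMBED_MODEL="${EMBED_MODEL:-nomic-embed-text}"
LANCEDB_PATH="${LANCEDB_PATH:-$HOME/.codebase-indexer}"
printf 'host=%s model=%s db=%s\n' "$OLLAMA_HOST" "$EMBED_MODEL" "$LANCEDB_PATH"
```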
MCP Tools
index_codebase
Indexes a codebase directory.
{ "path": "/absolute/path/to/project" }
Returns: { indexed_files, total_chunks, status, duration_ms }
search_code
Semantic search across an indexed codebase.
{ "query": "database connection handler", "path": "/absolute/path/to/project", "top_k": 10 }
Returns: array of { file, start_line, end_line, content, language, score }
get_indexing_status
Check if a codebase has been indexed.
{ "path": "/absolute/path/to/project" }
Returns: { indexed, collection_name, total_chunks }
clear_index
Remove the index for a codebase.
{ "path": "/absolute/path/to/project" }
Returns: { success, message }
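Under the hood, MCP clients invoke these tools as JSON-RPC tools/call requests over the server's stdio transport. A sketch of the payload a client would send for search_code (a real client performs the MCP initialize handshake first; the path is a placeholder):

```shell
# Build the JSON-RPC request an MCP client would write to the treeseek
# process's stdin to invoke search_code. This only shows the tool-call
# shape; it does not start the server.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_code","arguments":{"query":"database connection handler","path":"/absolute/path/to/project","top_k":10}}}'
printf '%s\n' "$REQUEST"
```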
Setup
Claude Code
claude mcp add treeseek \
-e OLLAMA_HOST=http://localhost:11434 \
-e EMBED_MODEL=nomic-embed-text \
-- npx treeseek

Or add manually to ~/.claude.json under mcpServers:
{
"mcpServers": {
"treeseek": {
"type": "stdio",
"command": "npx",
"args": ["treeseek"],
"env": {
"OLLAMA_HOST": "http://localhost:11434",
"EMBED_MODEL": "nomic-embed-text"
}
}
}
}

OpenCode
Via CLI (interactive):
opencode mcp add

Fill in the fields when prompted:
- type: local
- name: treeseek
- command: npx treeseek
- environment: OLLAMA_HOST=http://localhost:11434,EMBED_MODEL=nomic-embed-text
Or add manually to ~/.config/opencode/opencode.json (global) or to opencode.json at the project root:
{
"mcp": {
"treeseek": {
"type": "local",
"command": ["npx", "treeseek"],
"enabled": true,
"environment": {
"OLLAMA_HOST": "http://localhost:11434",
"EMBED_MODEL": "nomic-embed-text"
}
}
}
}

Note: OpenCode uses environment (not env), and command must be an array. Variables placed in an env field are silently ignored.
To list the configured MCP servers:
opencode mcp list

Cursor
Add to your .cursor/mcp.json:
{
"mcpServers": {
"treeseek": {
"command": "npx",
"args": ["treeseek"],
"env": {
"OLLAMA_HOST": "http://localhost:11434",
"EMBED_MODEL": "nomic-embed-text"
}
}
}
}

VS Code (Copilot)
Add to your .vscode/mcp.json:
{
"servers": {
"treeseek": {
"type": "stdio",
"command": "npx",
"args": ["treeseek"],
"env": {
"OLLAMA_HOST": "http://localhost:11434",
"EMBED_MODEL": "nomic-embed-text"
}
}
}
}

Windsurf
Add to your MCP settings:
{
"mcpServers": {
"treeseek": {
"command": "npx",
"args": ["treeseek"],
"env": {
"OLLAMA_HOST": "http://localhost:11434",
"EMBED_MODEL": "nomic-embed-text"
}
}
}
}

Supported Languages
TypeScript, JavaScript, Python, Java, Go, Rust, C, C++, C#, Ruby, PHP, Swift, Kotlin, Scala, and more (via tree-sitter AST chunking).
