context-creator
High-performance CLI for building quality context windows that make AI assistants actually understand your codebase.
The Problem
AI coding assistants are only as good as the context you provide. Most tools simply concatenate files, leading to:
- Irrelevant files cluttering the context window
- Missing dependencies that are crucial for understanding
- Token limits wasted on unimportant code
- No understanding of how your code actually connects
The Solution
context-creator uses tree-sitter to build a dependency graph of your codebase, selecting only the files relevant to your task. It's like repomix, but faster and smarter.
Without context-creator
# Generic context that includes everything
cat src/**/*.ts > context.txt # 500K tokens of mostly noise
With context-creator
# Intelligent context that follows your code's actual dependencies
context-creator --prompt "How does the authentication work?"
# Returns: auth files + their actual dependencies + related tests = 50K relevant tokens
Key Advantages
- Dependency-aware: Uses tree-sitter AST parsing to understand imports, not just file names
- Fast: Rust-powered parallel processing handles massive codebases in seconds
- Smart selection: Includes only files connected to your query through the dependency graph (see the sketch after this list)
- Multi-language: Semantic analysis for Python, TypeScript, JavaScript, and Rust
- MCP integration: Works as a server for AI assistants to query your codebase programmatically
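For example, the query-driven selection can be scoped with the same include patterns shown under Usage below. This is only a sketch: it assumes --prompt and --include can be combined, and the path is illustrative.
# Sketch: scope the dependency-aware selection to the payments code (illustrative path)
context-creator --prompt "How are refunds processed?" --include "src/payments/**"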
Installation
npm install -g context-creator-mcp@latest
For platform-specific MCP client setup, see the Installation Guide.
Usage
CLI
# Analyze current directory
context-creator
# Build focused context for specific task
context-creator --prompt "Find security vulnerabilities in the auth system"
# Trace dependencies of specific files
context-creator --trace-imports --include "**/auth.py"
# Compare changes with dependency context
context-creator diff HEAD~1 HEAD
# Enrich code with OpenTelemetry runtime metrics
context-creator telemetry -t traces.json
MCP Server
Add to your MCP client configuration:
{
"mcpServers": {
"context-creator": {
"command": "npx",
"args": ["-y", "context-creator-mcp@latest"]
}
}
}
Then in your AI assistant:
"Explain how the payment system works" # AI will use analyze_local to build relevant context
"Find all SQL injection vulnerabilities" # Searches with full dependency understandingFeatures
- Tree-sitter AST parsing for true code understanding
- Import tracing and dependency resolution
- Parallel processing with Rayon
- Token budget management
- Git history integration
- MCP server with programmatic access
- OpenTelemetry integration for runtime metrics
MCP Tools
- analyze_local - Analyze local codebases with dependency awareness
- analyze_remote - Analyze Git repositories
- search - Text pattern search
- semantic_search - AST-based code search
- file_metadata - File information
- diff - File comparison
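These tools are exposed over MCP's standard JSON-RPC interface, so any MCP client can enumerate them with a tools/list request once the usual initialize handshake is done. A minimal sketch of that request (handshake omitted, not a full session):
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
Each tool is then invoked via tools/call with its name and arguments; see the MCP Server Guide for the exact argument schemas.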
Configuration
.contextignore
node_modules/
target/
*.log
.env
.contextkeep
src/core/**
src/api/**
.context-creator.toml
[defaults]
max_tokens = 200000
[[priorities]]
pattern = "src/core/**"
weight = 100
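A slightly fuller .context-creator.toml sketch, using only the keys shown above and assuming that multiple [[priorities]] entries stack as the TOML array-of-tables syntax suggests (the patterns and weights are illustrative):
[defaults]
# cap the total context size
max_tokens = 150000

[[priorities]]
# core code gets the highest weight
pattern = "src/core/**"
weight = 100

[[priorities]]
# tests are included only when budget remains
pattern = "tests/**"
weight = 20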
Documentation
- Installation Guide - Detailed setup instructions
- Usage Examples - CLI commands and workflows
- Configuration - Advanced configuration
- MCP Server Guide - MCP integration details
- Architecture - Technical implementation
Requirements
- Node.js >= v18.0.0 (for npm package)
- or Rust >= 1.70.0 (for building from source)
Building from Source
git clone https://github.com/matiasvillaverde/context-creator
cd context-creator
cargo build --release
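Cargo writes the release build to target/release; assuming the binary shares the crate's name, you can verify it with:
# Sketch: run the freshly built binary (binary name assumed from the crate name)
./target/release/context-creator --help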
Contributing
See CONTRIBUTING.md for guidelines.
License
MIT
