# ktme - Knowledge Transfer Me

npm package: `ktme-cli`, v0.1.1
A Rust-based CLI tool and MCP server for automated documentation generation from code changes. Integrates with AI coding assistants like Claude Code, Cursor, Windsurf, and other MCP-compatible clients.
## Table of Contents
- Overview
- Core Concepts
- Features
- Installation
- AI Agent Integration
- Quick Start
- Configuration
- MCP Tools
- Knowledge Search (RAG)
- Usage Examples
- Architecture
- Contributing
- License
## Overview
ktme (Knowledge Transfer Me) is a CLI tool and MCP server that bridges code changes with documentation. It uses the Model Context Protocol (MCP) to communicate with AI coding assistants, enabling them to:
- Read and analyze code changes from Git
- Generate documentation based on diffs
- Update existing docs intelligently
- Publish to Markdown files or Confluence
```mermaid
graph LR
    subgraph AI Assistants
        Claude[Claude Code]
        Cursor[Cursor]
        Windsurf[Windsurf]
    end
    subgraph ktme MCP Server
        Server[MCP Server]
        Tools[Tools]
    end
    Git[Git Repository] --> Tools
    Claude --> Server
    Cursor --> Server
    Windsurf --> Server
    Server --> Tools
    Tools --> Docs[Documentation]
```

### How It Works
- You configure ktme as an MCP server in your AI assistant (Claude Code, Cursor, etc.)
- AI assistant connects to ktme via the Model Context Protocol
- AI uses ktme tools to read code changes, query mappings, and generate docs
- Documentation is created in your preferred format (Markdown, Confluence)
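Under the hood, MCP uses JSON-RPC 2.0 as its wire format. As a rough illustration (the payload shape follows the MCP specification in broad strokes and is not copied from ktme's code), the request an assistant writes to the server's stdin to invoke a tool looks like this:

```rust
// Sketch only: builds the kind of JSON-RPC 2.0 request an MCP client
// sends over stdio to call a server tool. The exact fields ktme expects
// may differ; this illustrates the protocol, not ktme's implementation.
fn tool_call_request(id: u64, tool: &str, args_json: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{},"method":"tools/call","params":{{"name":"{}","arguments":{}}}}}"#,
        id, tool, args_json
    )
}

fn main() {
    let req = tool_call_request(1, "read_changes", r#"{"path":"/tmp/feature.json"}"#);
    // An MCP client would write this (plus transport framing) to the server's stdin.
    println!("{req}");
    assert!(req.contains(r#""method":"tools/call""#));
}
```

The AI assistant never shells out to ktme directly; it exchanges messages like this with the long-running `ktme mcp start` process.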
## Core Concepts

### Multi-Source Change Extraction
ktme can extract code changes from multiple sources:
```shell
# From commits
ktme extract --commit abc123

# From staged changes
ktme extract --staged

# From pull requests
ktme extract --pr 456 --provider github

# From commit ranges
ktme extract --commit v1.0.0..v1.1.0
```

### Service-to-Documentation Mapping
Map your services to their documentation locations:
```toml
# ~/.config/ktme/mappings.toml
[[services]]
name = "api-gateway"
path = "projects/api-gateway"
docs = [
    { type = "markdown", location = "projects/api-gateway/README.md" },
    { type = "confluence", location = "https://company.atlassian.net/wiki/display/API/APIGateway" }
]
```

### MCP Integration
ktme exposes tools through the Model Context Protocol, enabling AI assistants to:
- Read extracted code changes
- Query service-document mappings
- Generate documentation based on changes
- Update existing documentation intelligently
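The service-to-documentation mapping above could be modeled with types like these (a std-only sketch; ktme's actual storage layer is SQLite-backed and its real type names may differ):

```rust
// Illustrative in-memory model of service-to-documentation mappings.
// Type and field names here are hypothetical, not ktme's actual API.
#[derive(Debug, Clone, PartialEq)]
enum DocTarget {
    Markdown(String),   // file path
    Confluence(String), // page URL
}

#[derive(Debug, Clone)]
struct ServiceMapping {
    name: String,
    path: String,
    docs: Vec<DocTarget>,
}

// Look up the documentation targets registered for a service.
fn find_docs<'a>(mappings: &'a [ServiceMapping], service: &str) -> Option<&'a Vec<DocTarget>> {
    mappings.iter().find(|m| m.name == service).map(|m| &m.docs)
}

fn main() {
    let mappings = vec![ServiceMapping {
        name: "api-gateway".into(),
        path: "projects/api-gateway".into(),
        docs: vec![DocTarget::Markdown("projects/api-gateway/README.md".into())],
    }];
    let docs = find_docs(&mappings, "api-gateway").unwrap();
    assert_eq!(docs.len(), 1);
}
```

The MCP tool `get_service_mapping` performs essentially this lookup on behalf of the AI assistant.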
## Features
| Feature | Status | Description |
|---------|--------|-------------|
| CLI Interface | ✅ IMPLEMENTED | Complete command structure with extract, generate, mapping, MCP, and config commands |
| Git Diff Extraction | ✅ IMPLEMENTED | Extract from commits and staged changes (PR extraction not yet implemented) |
| AI-Powered Generation | ✅ IMPLEMENTED | Generate documentation using OpenAI or Claude AI models |
| Service Mapping Storage | ✅ IMPLEMENTED | SQLite database for storing service-to-documentation mappings |
| MCP Server with Tools | ✅ IMPLEMENTED | Functional MCP server with tools for reading changes, mappings, and generating docs |
| Configuration Management | ✅ IMPLEMENTED | TOML-based configuration with environment variable support |
| Database Layer | ✅ IMPLEMENTED | Complete SQLite models and repositories for all features |
| JSON Output Format | ✅ IMPLEMENTED | Export diffs and data in JSON format |
| Basic Error Handling | ✅ IMPLEMENTED | Comprehensive error types and propagation |
| Git Platform Support | 🔄 PARTIAL | Local Git ✅; GitHub/GitLab PR support framework exists |
| Template System | 🔄 FRAMEWORK | Template structure in place, basic rendering implemented |
| Documentation Providers | 🔄 BASIC | Markdown file writer ✅; Confluence provider structure exists |
| PR Extraction | ❌ NOT IMPLEMENTED | GitHub/GitLab API integration for PR diffs |
| Knowledge Search/RAG | ❌ NOT IMPLEMENTED | Vector embeddings and semantic search features |
| Confluence Sync | ❌ NOT IMPLEMENTED | No Confluence API integration implemented |
| Feature Mapping System | ❌ NOT IMPLEMENTED | Feature-to-document mapping not implemented |
### Current Implementation Status

#### ✅ Fully Implemented
- CLI Interface: Complete command-line interface with all major commands (extract, generate, mapping, MCP, config)
- Git Operations: Functional diff extraction from commits and staged changes using git2-rs
- AI Integration: Working OpenAI and Claude API clients with automatic provider detection
- MCP Server: Fully functional MCP server with STDIO transport and working tools
- SQLite Storage: Complete database layer with models and repositories
- Service Mapping: Add, list, get, remove service-to-document mappings
- Configuration: TOML-based configuration management with environment variables
- Error Handling: Comprehensive error types and proper propagation
#### 🔄 Partially Implemented
- Documentation Generation: Basic template-based generation from Git diffs
- Markdown Writer: Simple file-based documentation output
- PR Extraction: Framework exists but GitHub/GitLab API integration not implemented
- Confluence Provider: Structure in place but API calls not implemented
- Template System: Basic template structure exists
- HTTP Transport: SSE mode structure implemented but not tested
#### ❌ Not Yet Implemented
- Knowledge Search/RAG: No vector embeddings, semantic search, or RAG functionality
- Confluence Sync: No incremental sync or API integration
- Feature Mapping: No feature-to-document mapping system
- Bitbucket Integration: Provider structure not implemented
- Advanced Templates: No sophisticated template rendering
- Web UI: No web interface for managing mappings
#### ⚠️ Known Issues
- 103 compilation warnings (mostly unused code - expected for incomplete features)
- PR extraction returns error - GitHub/GitLab API not implemented
- Some MCP tools have basic implementations (generate_documentation creates simple markdown)
- Confluence provider structure exists but API calls not implemented
## Installation

### Option 1: Install from NPM (Recommended)

```shell
npm install -g ktme-cli

# Verify installation
ktme --version
```

### Option 2: Install from Source
Prerequisites:
- Rust 1.70 or higher
- Git 2.30 or higher
- Access to an AI model API (Claude, GPT, etc.)
Build from source:
```shell
git clone https://github.com/FreePeak/ktme.git
cd ktme
cargo build --release
```

Using Cargo:

```shell
cargo install --path .
```

### Development Workflow
For active development, use one of these methods to update your ktme command after code changes:
#### Method 1: Makefile (Recommended)

```shell
# Quick development cycle
make dev

# Or step by step
make build-release
make install-dev
```

#### Method 2: Rebuild and Replace

```shell
# After code changes, rebuild
cargo build --release

# Copy to system location (one-time setup)
sudo cp ./target/release/ktme /usr/local/bin/ktme
```

#### Method 3: Cargo Install (Easiest)

```shell
# After code changes, reinstall globally
cargo install --path . --force
```

#### Method 4: Development Alias
Add to your `~/.zshrc` or `~/.bashrc`:

```shell
alias ktme="cd /path/to/ktme && ./target/release/ktme"
```

#### Method 5: PATH Update

Add to your shell config:

```shell
export PATH="/path/to/ktme/target/release:$PATH"
```

### Using Makefile
The project includes a Makefile for common development tasks:
```shell
# Show all available targets
make help

# Build and install in development mode
make dev

# Build in release mode
make build-release

# Run tests and formatting
make ci

# MCP server management
make run-mcp     # Start server
make stop-mcp    # Stop server
make status-mcp  # Check status
```

### Verify Installation

```shell
ktme --version
```

Note: after installing `ktme-cli` from NPM, the available command is `ktme` (without `-cli`).
## AI Agent Integration
ktme uses the Rust MCP SDK to implement the Model Context Protocol, enabling seamless communication with AI coding assistants.
### Claude Code

Add to your Claude Code MCP settings (`~/.claude/mcp_settings.json`):

```json
{
  "mcpServers": {
    "ktme": {
      "command": "ktme",
      "args": ["mcp", "start"],
      "env": {
        "KTME_MCP_API_KEY": "your-api-key"
      }
    }
  }
}
```

### Cursor
Add to Cursor's MCP configuration (`.cursor/mcp.json`):

```json
{
  "servers": {
    "ktme": {
      "command": "ktme",
      "args": ["mcp", "start", "--stdio"]
    }
  }
}
```

### Windsurf
Add to Windsurf's MCP settings:

```json
{
  "mcp": {
    "servers": {
      "ktme": {
        "command": "ktme",
        "args": ["mcp", "start", "--stdio"]
      }
    }
  }
}
```

### SSE Mode (HTTP)

For HTTP-based connections, start ktme in SSE mode:

```shell
ktme mcp start --sse --host 127.0.0.1 --port 8080
```

Then configure your AI assistant to connect to `http://127.0.0.1:8080`.
## Quick Start

1. Install ktme:

```shell
npm install -g ktme-cli
```

2. Initialize the configuration:

```shell
ktme config init
```

3. Set up the API key:

```shell
export KTME_MCP_API_KEY="your-api-key"
```

4. Map your service:

```shell
ktme mapping add user-api --file ~/projects/user-api/README.md
```

5. Generate documentation:

```shell
cd ~/projects/user-api
ktme generate --commit HEAD --service user-api
```

Important: the NPM package is named `ktme-cli`, but after installation you use the `ktme` command.
## Configuration

### Configuration File

Location: `~/.config/ktme/config.toml`

```toml
[general]
default_directory = "~/projects"
log_level = "info"

[git]
default_branch = "main"
include_merge_commits = false

[mcp]
model = "claude-3-5-sonnet-20241022"
max_tokens = 4096
temperature = 0.7

[documentation]
default_format = "markdown"
template_directory = "~/.config/ktme/templates"

[confluence]
base_url = "https://your-company.atlassian.net"
auth_type = "token"
space_key = "DOCS"
```

### Environment Variables
| Variable | Description |
|----------|-------------|
| KTME_MCP_API_KEY | AI model API key |
| KTME_MCP_MODEL | Model identifier |
| CONFLUENCE_API_TOKEN | Confluence authentication token |
| CONFLUENCE_USERNAME | Confluence username |
| KTME_LOG_LEVEL | Logging level (debug, info, warn, error) |
| OPENAI_API_KEY | OpenAI API key for embeddings (optional) |
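A common convention (and a reasonable assumption for ktme, though not confirmed against its code) is that these environment variables take precedence over the corresponding config-file values. A std-only sketch of that lookup order:

```rust
// Sketch of a typical setting-resolution order: env var > config file > default.
// In real code, env_value would come from std::env::var("KTME_LOG_LEVEL").ok().
// This is an assumed convention, not ktme's verified behavior.
fn resolve_setting(env_value: Option<&str>, config_value: Option<&str>, default: &str) -> String {
    env_value.or(config_value).unwrap_or(default).to_string()
}

fn main() {
    // Env var set: it wins over the config-file value.
    assert_eq!(resolve_setting(Some("debug"), Some("info"), "warn"), "debug");
    // Only the config file sets it.
    assert_eq!(resolve_setting(None, Some("info"), "warn"), "info");
    // Neither is set: fall back to the built-in default.
    assert_eq!(resolve_setting(None, None, "warn"), "warn");
}
```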
### Knowledge Search Configuration

```toml
[knowledge]
# Cache location (default: ~/.config/ktme/knowledge.db)
cache_path = "~/.config/ktme/knowledge.db"

# Embedding provider: "openai" or "local"
embedding_provider = "local"

# Confluence spaces to sync
sync_spaces = ["DEV", "API", "DOCS"]

# Auto-sync interval in hours (0 = manual only)
auto_sync_interval = 0

[knowledge.chunking]
# Chunk size for RAG (tokens)
chunk_size = 512
chunk_overlap = 50
```

## MCP Tools
ktme exposes the following tools through MCP. Note that the knowledge-search and feature-mapping tools are planned but not yet implemented (see the status warnings below):
### Documentation Tools
| Tool | Description |
|------|-------------|
| read_changes | Read extracted diff from a file or Git |
| generate_documentation | Create new documentation from changes |
| update_documentation | Modify existing documentation |
### Mapping Tools
| Tool | Description |
|------|-------------|
| get_service_mapping | Retrieve document URL for a service |
| list_services | List all mapped services |
| add_mapping | Add a new service-to-doc mapping |
### Git Tools
| Tool | Description |
|------|-------------|
| extract_commit | Extract changes from a specific commit |
| extract_pr | Extract changes from a pull request |
| list_commits | List commits in a range |
### Knowledge Search Tools (planned)
| Tool | Description |
|------|-------------|
| search_knowledge | Search documentation using natural language queries |
| get_document | Retrieve full document content by ID or URL |
| list_documents | List documents filtered by team, tags, or source |
| sync_documents | Trigger sync from Confluence or other sources |
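The planned `search_knowledge` tool is intended to combine keyword and semantic search. The semantic half typically ranks document chunks by cosine similarity between embedding vectors; a std-only sketch of that scoring step:

```rust
// Cosine similarity between two embedding vectors. A semantic search
// would score each cached chunk's embedding against the query embedding
// and return the top-ranked chunks. Sketch only; ktme's planned
// implementation details are not specified.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len());
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 { 0.0 } else { dot / (norm_a * norm_b) }
}

fn main() {
    // Parallel vectors score 1.0; orthogonal vectors score 0.0.
    assert!((cosine_similarity(&[1.0, 0.0], &[2.0, 0.0]) - 1.0).abs() < 1e-6);
    assert!(cosine_similarity(&[1.0, 0.0], &[0.0, 1.0]).abs() < 1e-6);
}
```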
### Feature Mapping Tools (planned)
| Tool | Description |
|------|-------------|
| get_feature | Get a feature/screen with all related documentation |
| map_feature_document | Link a feature to a documentation page |
| list_features | List all features filtered by team |
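A key idea behind the planned feature tools is alias resolution: a search for "dashboard" should find the canonical feature `user_dashboard`. A std-only sketch (all type and field names hypothetical):

```rust
// Hypothetical feature record for the planned feature-mapping system.
// Aliases let informal queries resolve to a canonical feature name.
struct Feature {
    name: &'static str,
    team: &'static str,
    aliases: Vec<&'static str>,
    doc_urls: Vec<&'static str>,
}

// Case-insensitive lookup by canonical name or alias.
fn resolve_feature<'a>(features: &'a [Feature], query: &str) -> Option<&'a Feature> {
    let q = query.to_lowercase();
    features
        .iter()
        .find(|f| f.name == q || f.aliases.iter().any(|a| a.to_lowercase() == q))
}

fn main() {
    let features = vec![Feature {
        name: "user_dashboard",
        team: "mobile",
        aliases: vec!["dashboard", "main screen", "overview"],
        doc_urls: vec!["https://confluence.company.com/display/MOBILE/User+Dashboard"],
    }];
    // "dashboard" is an alias, so it resolves to the canonical feature.
    assert_eq!(resolve_feature(&features, "dashboard").unwrap().name, "user_dashboard");
    assert!(resolve_feature(&features, "checkout").is_none());
}
```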
## Knowledge Search (RAG) - NOT IMPLEMENTED
⚠️ WARNING: This section describes planned features that are not yet implemented. The knowledge search, RAG capabilities, and related tools do not exist in the current codebase.
ktme is designed to include a powerful knowledge search system that will enable teams to search documentation across Confluence and local files using natural language queries through MCP-connected AI assistants.
### Planned Architecture

```mermaid
graph TB
    subgraph "Users (via Cursor/Claude)"
        Backend[Backend Team]
        Mobile[Mobile Team]
    end
    subgraph "MCP Server"
        SearchTool[search_knowledge]
        GetDocTool[get_document]
        SyncTool[sync_documents]
    end
    subgraph "Local Cache"
        SQLite[(SQLite)]
        FTS[FTS5 Index]
        Embeddings[Embeddings]
    end
    subgraph "Source of Truth"
        Confluence[(Confluence)]
    end
    Backend -.-> SearchTool
    Mobile -.-> SearchTool
    SearchTool -.-> FTS
    SearchTool -.-> Embeddings
    GetDocTool -.-> SQLite
    Confluence -.->|"Incremental Sync"| SQLite
    SQLite -.-> FTS
    SQLite -.-> Embeddings
```

### How It Will Work
- Confluence is the source of truth - All documentation lives in Confluence
- Local SQLite cache - Documents will be synced to a local cache for fast searching
- Hybrid search - Will combine keyword matching (FTS5) with semantic search (embeddings)
- Incremental sync - Will only fetch documents modified since last sync
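Before embedding, documents would be split into overlapping chunks (cf. the `chunk_size` and `chunk_overlap` settings in the knowledge configuration). A std-only sketch of that step, assuming tokens are already computed:

```rust
// Sketch of the planned RAG chunking: split a token sequence into
// fixed-size chunks with overlap, so context at chunk boundaries is
// not lost. Mirrors the chunk_size / chunk_overlap config semantics.
fn chunk_tokens(tokens: &[u32], chunk_size: usize, overlap: usize) -> Vec<Vec<u32>> {
    assert!(overlap < chunk_size);
    let step = chunk_size - overlap;
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < tokens.len() {
        let end = (start + chunk_size).min(tokens.len());
        chunks.push(tokens[start..end].to_vec());
        if end == tokens.len() {
            break;
        }
        start += step;
    }
    chunks
}

fn main() {
    let tokens: Vec<u32> = (0..10).collect();
    // chunk_size = 4, overlap = 1, so windows advance by 3 tokens.
    let chunks = chunk_tokens(&tokens, 4, 1);
    assert_eq!(chunks.len(), 3);
    assert_eq!(chunks[1], vec![3, 4, 5, 6]);
}
```

Each chunk would then be embedded and stored in the `document_chunks` / `embeddings` tables of the planned cache.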
### Planned Knowledge Search Tools
| Tool | Status | Description |
|------|--------|-------------|
| search_knowledge | ❌ Not Implemented | Search documentation using natural language |
| get_document | ❌ Not Implemented | Retrieve full document content by ID |
| list_documents | ❌ Not Implemented | List documents by team, tags, or source |
| sync_documents | ❌ Not Implemented | Sync documents from Confluence |
### Syncing Documents

```shell
# Initial full sync from Confluence
ktme sync --space DEV --full
ktme sync --space API --full

# Incremental sync (only changed documents)
ktme sync --space DEV

# Sync all configured spaces
ktme sync --all
```

### Searching Knowledge
```shell
# CLI search
ktme search "user dashboard overview"
ktme search "payment integration" --team mobile
ktme search "authentication flow" --tag "feature:auth"

# Via MCP (AI assistant uses these tools)
# search_knowledge("user dashboard overview")
# search_knowledge("how does the API gateway work", team="backend")
```

### Tagging Documents
Documents can be tagged for better organization:
```shell
# Add tags to indexed documents
ktme tag DOC_ID --team mobile --tag "screen:user_dashboard"
ktme tag DOC_ID --tag "feature:main_screen"

# Search by tags
ktme search --tag "screen:*" --team mobile
```

### Cache Location

The knowledge cache is stored locally per user:

```
~/.config/ktme/
  config.toml    # Configuration
  ktme.db        # Service mappings
  knowledge.db   # Knowledge search cache
    documents        # Cached document content
    documents_fts    # Full-text search index
    document_chunks  # RAG chunks for context
    embeddings       # Vector embeddings
    sync_state       # Last sync timestamps
```

The cache is ephemeral and can be regenerated by re-syncing from Confluence.
## Feature Mapping - NOT IMPLEMENTED
⚠️ WARNING: This section describes planned features that are not yet implemented. The feature mapping system does not exist in the current codebase.
Features are planned to represent logical units like mobile screens, components, or business flows. They will be mapped to documentation and services.
Planned commands:

```shell
# Add a feature (mobile screen)
ktme feature add user_dashboard \
  --team mobile \
  --display-name "User Dashboard Screen" \
  --aliases "dashboard,main screen,overview"

# Map feature to documentation
ktme feature map user_dashboard \
  --doc-url "https://confluence.company.com/display/MOBILE/User+Dashboard"

# Link feature to backend service
ktme feature link user_dashboard --service api-gateway

# Get feature with all related docs
ktme feature get user_dashboard
```

Planned feature-service-document relationships:

```
Feature (Mobile)     Service (Backend)     Document (Confluence)
----------------     -----------------     ---------------------
user_dashboard <---> api-gateway      ---> "User Dashboard Design Doc"
               <---> user-service     ---> "User API Reference"
                                      ---> "Authentication Flow"
```

### Example: Mobile Team Searching Documentation
```
Mobile Dev in Cursor: "Find docs about the user dashboard screen"

AI Assistant calls: search_knowledge("user dashboard screen")

System searches:
1. Features FTS: matches "user_dashboard" via alias "dashboard"
2. Documents FTS: matches documents containing "user dashboard"
3. Returns merged results with feature->document mappings

Results returned:
1. "User Dashboard - Main Interface"
   URL: https://confluence.company.com/display/MOBILE/User+Dashboard
   Team: mobile
   Feature: user_dashboard
   Related Services: [api-gateway, user-service]
   Summary: "The user dashboard displays account information and recent activity..."

2. "API Gateway Documentation"
   URL: https://confluence.company.com/display/BACKEND/API+Gateway
   Team: backend
   Service: api-gateway
   Related Features: [user_dashboard, authentication]
   Summary: "Central gateway for routing API requests to backend services..."
```

## Usage Examples
⚠️ NOTE: The examples below show the intended usage, but several depend on features that are not yet implemented (PR extraction, Confluence publishing, knowledge search). See "What Actually Works Now" for the commands that run today.
### Document a Feature Branch (Planned)

```shell
# Extract changes from feature branch
ktme extract --commit main..feature/new-auth --output /tmp/feature.json

# Generate documentation
ktme generate --input /tmp/feature.json --service user-service --type api-doc
```

### Update Changelog from PR (Planned)

```shell
# Extract PR changes
ktme extract --pr 456 --provider github

# Update changelog section
ktme update --pr 456 --service api-gateway --section "Changelog"
```

### Publish to Confluence (Planned)

```shell
# Configure Confluence
export CONFLUENCE_API_TOKEN="your-token"
ktme config set confluence.base_url "https://company.atlassian.net"

# Map and generate
ktme mapping add user-service --url "https://company.atlassian.net/wiki/spaces/DOCS/pages/12345"
ktme generate --commit abc123 --service user-service --format confluence
```

### Document Staged Changes (Planned)

```shell
# Stage your changes
git add src/main.rs

# Generate documentation before committing
ktme generate --staged --service my-service

# Commit both code and docs
git add .
git commit -m "feat: Add new feature with documentation"
```

### What Actually Works Now
```shell
# Build the project
cargo build --release

# Extract changes from staging
./target/release/ktme extract --staged --output changes.json

# Extract changes from a commit
./target/release/ktme extract --commit HEAD --output last-commit.json

# Start MCP server with STDIO transport
./target/release/ktme mcp start --stdio

# Check version
./target/release/ktme --version
```

The binary compiles and runs with working Git operations.

## Architecture
```mermaid
graph TB
    subgraph CLI Layer
        CLI[CLI Interface]
        Config[Config Manager]
    end
    subgraph Core Layer
        Git[Git Reader]
        Extractor[Diff Extractor]
        Generator[Doc Generator]
        Storage[Mapping Storage]
    end
    subgraph MCP Layer
        Server[MCP Server]
        Tools[MCP Tools]
    end
    subgraph Output Layer
        MD[Markdown Writer]
        Confluence[Confluence Writer]
    end
    CLI --> Git
    CLI --> Config
    Config --> Storage
    Git --> Extractor
    Extractor --> Server
    Server --> Tools
    Tools --> Generator
    Storage --> Generator
    Generator --> MD
    Generator --> Confluence
```

### Component Overview
| Component | Purpose |
|-----------|---------|
| CLI Interface | Command parsing and user interaction |
| Git Reader | Extract changes from Git repositories |
| Diff Extractor | Parse and structure Git diffs |
| MCP Server | Model Context Protocol communication |
| Doc Generator | Transform AI output to documentation |
| Mapping Storage | Service-to-document mappings |
| Writers | Output to Markdown or Confluence |
### Supported Platforms

| Platform | PR Extraction | Commit Extraction |
|----------|---------------|-------------------|
| GitHub | Planned (not yet implemented) | Yes |
| GitLab | Planned (not yet implemented) | Yes |
| Bitbucket | Planned (not yet implemented) | Yes |
| Local Git | N/A | Yes |
## Troubleshooting

### Common Issues

MCP server connection failed:

```shell
ktme mcp status
ktme mcp start
```

Service mapping not found:

```shell
ktme mapping list
ktme mapping add my-service --file ~/path/to/docs/README.md
```

Enable debug logging:

```shell
export KTME_LOG_LEVEL="debug"
ktme --verbose generate --commit abc123 --service my-service
```

## Implementation Analysis & Roadmap
### Codebase Architecture

ktme follows a well-structured architecture with clear separation of concerns:

- `src/main.rs`: Entry point with CLI command definitions
- `src/cli/`: Command-line interface and command handlers
- `src/mcp/`: MCP server implementation with tools
- `src/git/`: Git operations and diff extraction
- `src/ai/`: AI provider integrations (OpenAI, Claude)
- `src/storage/`: SQLite database models and repositories
- `src/config/`: Configuration management
- `src/doc/`: Documentation providers and writers
- `src/error.rs`: Comprehensive error handling
### Current Build Status

- ✅ Project compiles successfully with `cargo build --release`
- ⚠️ 103 warnings (mostly unused code from incomplete features)
- ✅ All dependencies properly configured
- ✅ Binary runs and responds to commands
- ✅ Git operations work correctly
- ✅ MCP server starts in STDIO mode
### Implementation Priorities

#### Phase 1: Complete Core Features (1-2 weeks)

**PR Extraction Implementation**

- Add GitHub API client for PR diffs
- Implement GitLab API integration
- Test with real pull requests

**Enhance Documentation Generation**

- Improve AI prompt templates
- Add better diff-to-documentation logic
- Support multiple documentation formats

**Complete MCP Tools**

- Enhance generate_documentation with AI integration
- Add error handling for all tools
- Test with Claude Code/Cursor integration
#### Phase 2: Provider Integration (2-3 weeks)

**Confluence Provider**

- Implement REST API client
- Add authentication (token/OAuth)
- Create and update pages

**Template System**

- Implement variable substitution
- Add built-in templates
- Support custom templates

**Configuration Commands**

- Complete config command implementations
- Add interactive setup
- Validate configurations
#### Phase 3: Advanced Features (4-6 weeks)

**Knowledge Search & RAG**

- SQLite document caching
- FTS5 full-text search
- Vector embeddings for semantic search

**Feature Mapping System**

- Feature-to-service mapping
- Feature-to-document linking
- Team-based organization

**Confluence Sync**

- Incremental sync from Confluence
- Bi-directional updates
- Conflict resolution
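The incremental-sync item above usually comes down to comparing remote versions against what the local cache last recorded. A std-only sketch (names are hypothetical; the real sync would talk to the Confluence REST API):

```rust
use std::collections::HashMap;

// Hypothetical remote document summary, as a sync pass might list it.
struct RemoteDoc {
    id: u64,
    version: u32,
}

// Return the IDs of documents that are new or newer than the cached copy.
// cached_versions maps document ID -> last synced version number.
fn needs_sync(remote: &[RemoteDoc], cached_versions: &HashMap<u64, u32>) -> Vec<u64> {
    remote
        .iter()
        .filter(|d| cached_versions.get(&d.id).map_or(true, |&v| d.version > v))
        .map(|d| d.id)
        .collect()
}

fn main() {
    let remote = vec![RemoteDoc { id: 1, version: 3 }, RemoteDoc { id: 2, version: 1 }];
    let cached = HashMap::from([(1u64, 2u32), (2, 1)]);
    // Doc 1 changed remotely (v3 > cached v2); doc 2 is up to date.
    assert_eq!(needs_sync(&remote, &cached), vec![1]);
}
```

Only the returned IDs would be fetched in full, which is what keeps incremental sync cheap compared with a full re-sync.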
### Testing Strategy
The project currently lacks unit tests. Recommended testing approach:
- Unit Tests: Test individual components (Git reader, AI client, storage)
- Integration Tests: Test CLI commands end-to-end
- MCP Tests: Test MCP server with actual AI assistants
- E2E Tests: Full workflow from Git diff to documentation
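As a concrete starting point for the unit-test layer, pure helpers are the easiest to cover. For example, a commit-range parser (the helper below is hypothetical, not ktme's actual API) can be tested without any Git repository:

```rust
// Hypothetical pure helper: split a range spec like "v1.0.0..v1.1.0"
// into (from, to). Pure functions like this need no fixtures, which
// makes them the natural first target for unit tests.
fn parse_commit_range(spec: &str) -> Option<(&str, &str)> {
    let (from, to) = spec.split_once("..")?;
    if from.is_empty() || to.is_empty() {
        return None;
    }
    Some((from, to))
}

fn main() {
    assert_eq!(parse_commit_range("v1.0.0..v1.1.0"), Some(("v1.0.0", "v1.1.0")));
    // A bare commit is not a range.
    assert_eq!(parse_commit_range("abc123"), None);
}
```

In the real crate this would live alongside a `#[cfg(test)] mod tests` block and run under `cargo test`.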
### Development Guidelines
- Keep It Simple: Avoid over-engineering, use straightforward solutions
- Incremental Development: Build and test one feature at a time
- Error Handling: Provide clear error messages for debugging
- Documentation: Add inline docs for complex logic
- Performance: Profile before optimizing, focus on actual bottlenecks
## Contributing

Contributions are welcome!

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
### Development

```shell
# Build
cargo build

# Build in release mode (recommended for testing)
cargo build --release

# Run tests
cargo test

# Format code
cargo fmt

# Lint
cargo clippy

# Quick development cycle
cargo build --release && ./target/release/ktme --version
```

## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- Model Context Protocol for the MCP specification
- Rust MCP SDK for MCP server implementation
- git2-rs for Git integration
- clap for CLI parsing
Built with Rust. Powered by AI. Documentation made simple.
