claudecode-rlm
Unlimited context for Claude Code - Knowledge graph-based context storage with automatic memory.
Author: Michael Thornton ([email protected])
Repository: https://github.com/tekcin/claudecode-rlm
Features
- TRUE Unlimited Context - Proxy server auto-stores ALL conversations, auto-injects relevant context
- MCP Server - Manual memory tools for Claude Code
- Knowledge Graph - Hierarchical storage (Document → Section → Chunk → Entity)
- 74x Faster Reads - LRU cache with inverted index
- Entity Extraction - Automatic extraction of code elements, files, concepts
- Recency Weighting - Fresh context ranked higher (sketched below)
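A minimal illustration of the recency weighting idea, assuming a simple exponential decay with a one-day half-life; the package's actual formula is not documented here, so treat this as a sketch only:

```typescript
// Illustrative only: one common way to combine a match score with recency.
// The real weighting used by claudecode-rlm may differ.
function recencyWeightedScore(
  matchScore: number,                 // e.g. an inverted-index hit score
  createdAt: number,                  // chunk timestamp in milliseconds
  halfLifeMs = 24 * 60 * 60 * 1000,   // assumed half-life of one day
): number {
  const age = Date.now() - createdAt;
  return matchScore * Math.pow(0.5, age / halfLifeMs); // fresher chunks keep more of their score
}
```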
Installation
npm install -g claudecode-rlm
🚀 Proxy Mode (Recommended) - TRUE Automatic Memory
The proxy server intercepts ALL API calls for complete automatic context:
1. Start the proxy
claudecode-rlm-proxy
2. Configure Claude Code
export ANTHROPIC_BASE_URL=http://localhost:3456
Or add to your shell profile (~/.bashrc, ~/.zshrc):
export ANTHROPIC_BASE_URL=http://localhost:3456
What happens automatically:
- ✅ Every user message → Stored in knowledge graph
- ✅ Every assistant response → Stored in knowledge graph
- ✅ Before each request → Relevant past context injected
- ✅ Streaming supported → Works with Claude Code's streaming
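Conceptually, the proxy sits between Claude Code and the Anthropic API and wraps every request with the store/inject steps above. The sketch below is illustrative only, assuming Node 18+; `storeTurn` and `searchGraph` are hypothetical stand-ins for the package's internal storage calls, and streaming and error handling are omitted.

```typescript
import http from "node:http";

// Hypothetical stand-ins for the package's real knowledge-graph calls:
const storeTurn = (message: unknown): void => {
  // persist the turn to the knowledge graph
};
const searchGraph = (query: string): string => {
  // look up relevant past chunks and return them as text
  return "";
};

const REAL_URL = process.env.ANTHROPIC_REAL_URL ?? "https://api.anthropic.com";
const PORT = Number(process.env.CLAUDECODE_RLM_PROXY_PORT ?? 3456);

http.createServer(async (req, res) => {
  // 1. Read the intercepted request body (e.g. a /v1/messages call).
  const chunks: Buffer[] = [];
  for await (const chunk of req) chunks.push(chunk as Buffer);
  const body = JSON.parse(Buffer.concat(chunks).toString("utf8"));

  // 2. Store the newest message, then inject relevant past context.
  const lastMessage = body.messages?.at(-1);
  if (lastMessage) storeTurn(lastMessage);
  const context = searchGraph(JSON.stringify(lastMessage?.content ?? ""));
  if (context) body.system = `${context}\n\n${body.system ?? ""}`;

  // 3. Forward to the real Anthropic API and relay the (non-streaming) response.
  const upstream = await fetch(`${REAL_URL}${req.url}`, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": String(req.headers["x-api-key"] ?? ""),
      "anthropic-version": String(req.headers["anthropic-version"] ?? "2023-06-01"),
    },
    body: JSON.stringify(body),
  });
  res.writeHead(upstream.status, { "content-type": "application/json" });
  res.end(await upstream.text());
}).listen(PORT);
```

The real proxy also handles Claude Code's streaming responses and stores the assistant's reply once it arrives.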
Proxy Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| CLAUDECODE_RLM_PROXY_PORT | Proxy port | 3456 |
| CLAUDECODE_RLM_WORKDIR | Storage directory | Current directory |
| CLAUDECODE_RLM_SESSION | Session ID | auto |
| CLAUDECODE_RLM_MAX_INJECT | Max tokens to inject | 2000 |
| ANTHROPIC_REAL_URL | Real Anthropic API URL | https://api.anthropic.com |
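For reference, a sketch of how these variables might be resolved in code, with the table's defaults; this is an assumption about the implementation, not taken from the source:

```typescript
// Assumed config resolution; defaults mirror the table above.
const proxyConfig = {
  port: Number(process.env.CLAUDECODE_RLM_PROXY_PORT ?? 3456),
  workdir: process.env.CLAUDECODE_RLM_WORKDIR ?? process.cwd(),            // storage directory
  session: process.env.CLAUDECODE_RLM_SESSION ?? "auto",                   // session ID
  maxInjectTokens: Number(process.env.CLAUDECODE_RLM_MAX_INJECT ?? 2000),
  realUrl: process.env.ANTHROPIC_REAL_URL ?? "https://api.anthropic.com",
};
```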
MCP Mode - Manual Memory Tools
For manual control via MCP tools:
Add to Claude Code config (~/.claude/claude_desktop_config.json):
{
  "mcpServers": {
    "claudecode-rlm": {
      "command": "claudecode-rlm"
    }
  }
}
Available Tools
| Tool | Description |
|------|-------------|
| memory_auto | Store AND retrieve context in one call (recommended) |
| memory_store | Store content in knowledge graph |
| memory_search | Search with recency weighting |
| graph_query | Query graph (search/entity/expand/path) |
| list_entities | List tracked entities |
| graph_stats | Get storage statistics |
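Outside Claude Code, the same tools can be exercised directly over stdio with a recent @modelcontextprotocol/sdk client. This is only a smoke-test sketch; the argument shape passed to memory_auto below is a guess, not the server's documented schema, so check the input schema returned by listTools() first.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the installed `claudecode-rlm` binary over stdio and connect to it.
const client = new Client({ name: "rlm-smoke-test", version: "0.0.1" }, { capabilities: {} });
await client.connect(new StdioClientTransport({ command: "claudecode-rlm" }));

// The listing should include the tools from the table above.
console.log(await client.listTools());

// Hypothetical arguments; the server's actual memory_auto schema may differ.
const result = await client.callTool({
  name: "memory_auto",
  arguments: { content: "We decided to cache nodes in memory", query: "caching decision" },
});
console.log(result);
```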
Architecture
Knowledge Graph Structure
Document (conversation turn)
├── Section (topic/header group)
│   ├── Chunk (~300 chars, searchable)
│   │   ├── Entity (function name)
│   │   ├── Entity (file path)
│   │   └── Entity (concept)
│   └── Chunk
│       └── [FOLLOWS] → next chunk
└── Section
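In TypeScript terms, the hierarchy might be modelled roughly as below. This is an illustrative sketch only: FOLLOWS comes from the diagram, while CONTAINS and MENTIONS are assumed names for the parent/child and entity edges.

```typescript
// Hypothetical shape of the graph; the package's real node/edge schema may differ.
type NodeType = "document" | "section" | "chunk" | "entity";

interface GraphNode {
  id: string;
  type: NodeType;
  content: string;    // e.g. ~300-char chunk text, or an entity name / file path
  createdAt: number;  // timestamp used for recency weighting
}

interface GraphEdge {
  from: string;       // parent or preceding node id
  to: string;         // child or following node id
  relation: "CONTAINS" | "FOLLOWS" | "MENTIONS"; // CONTAINS/MENTIONS are assumed names
}
```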
Storage Location
.claude/claudecode-rlm/graph/{sessionID}/
├── nodes/*.json - Content nodes
├── edges/*.json - Relationships
└── indexes/*.json - Search indexes
Performance
| Operation | Basic | Optimized | Speedup |
|-----------|-------|-----------|---------|
| Node reads | 21.3µs | 0.3µs | 74x |
| Get by type | 1.1ms | 21µs | 54x |
| Bulk insert | 6.07s | 1.21s | 5x |
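The gap between the Basic and Optimized columns comes from the LRU cache and inverted index mentioned under Features: cached nodes skip the per-file JSON read, and the index maps terms directly to chunk ids instead of scanning every node. A minimal sketch of that caching layer, illustrative rather than the package's actual code:

```typescript
// Tiny LRU cache keyed by node id; a Map keeps insertion order, so the first key is the LRU entry.
class LruCache<V> {
  private map = new Map<string, V>();
  constructor(private capacity = 1000) {}

  get(key: string): V | undefined {
    const value = this.map.get(key);
    if (value !== undefined) {
      // Re-insert to mark the entry as most recently used.
      this.map.delete(key);
      this.map.set(key, value);
    }
    return value;
  }

  set(key: string, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      const oldest = this.map.keys().next().value as string; // evict least recently used
      this.map.delete(oldest);
    }
  }
}

// Inverted index: search term -> ids of the chunks that contain it.
const chunkIndex = new Map<string, Set<string>>();
```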
Library Usage
import { GraphStorage, GraphIngester, EnhancedSearch } from 'claudecode-rlm'
GraphStorage.init('/path/to/project')
// Store
GraphIngester.ingestContextBlock({
  id: 'block-1',
  sessionID: 'my-session',
  content: 'Your content...',
  summary: '',
  tokens: 1000,
  createdAt: Date.now()
})
// Search
const results = EnhancedSearch.search('my-session', 'query')
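A slightly longer, self-contained sketch using only the calls shown above: store two conversation turns, then search them. The tokens value is a rough caller-supplied estimate, which is an assumption about how that field is meant to be filled.

```typescript
import { GraphStorage, GraphIngester, EnhancedSearch } from 'claudecode-rlm'

GraphStorage.init(process.cwd())

const turns = [
  'User: how do we parse the config file?',
  'Assistant: parsing happens in src/config.ts via loadConfig()',
]

turns.forEach((content, i) => {
  GraphIngester.ingestContextBlock({
    id: `turn-${i}`,
    sessionID: 'my-session',
    content,
    summary: '',
    tokens: 50,              // rough per-turn estimate (assumed to be caller-supplied)
    createdAt: Date.now()
  })
})

console.log(EnhancedSearch.search('my-session', 'config parsing'))
```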
Development
npm install
npm run build
npm start # MCP server
npm run proxy # Proxy server
License
MIT
Author
Michael Thornton
- Email: [email protected]
- GitHub: @tekcin
