local-memory-mcp
Local Memory MCP Server - AI-powered persistent memory system for Claude Code, Claude Desktop, Gemini, Codex, OpenCode and other MCP-compatible tools
Local Memory (localmemory.co)
Version 1.4.4 - AI-powered persistent memory system for developers working with Claude Code, Claude Desktop, Gemini, Codex, Cursor, and other coding agents.
Transform your AI interactions with intelligent memory that grows smarter over time. Build persistent knowledge bases, track learning progression, and maintain context across conversations. Local Memory provides MCP, REST API, and command-line interfaces that let you and your coding agents store, retrieve, discover, and analyze memories, giving your agent the right context for its work and tasks.
Key Features
- One-Command Install: npm install -g local-memory-mcp
- Smart Memory: AI-powered categorization and relationship discovery
- Lightning Fast: 10-57ms search responses with semantic similarity
- MCP Integration: Works seamlessly with Claude Desktop and other AI tools
- REST API: Easily integrate into your applications and workflows
- Command-Line Interface: Enable script automation and code execution via shell scripts and commands
- Persistent Context: Never lose conversation context again
- Enterprise Ready: Commercial licensing with security hardening
- Multi-Provider AI: Mix and match AI backends - Ollama, OpenAI, Anthropic, claude CLI, or any OpenAI-compatible server
Quick Start
Installation
npm install -g local-memory-mcp
License Activation
# Get your license key from https://localmemory.co
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX
# Verify activation
local-memory license status
Start the Service
# Start the daemon
local-memory start
# Verify it's running
local-memory status
CLI Usage
Basic Memory Operations
# Store memories with automatic AI categorization
local-memory remember "Go channels are like pipes between goroutines"
local-memory remember "Redis is excellent for caching" --importance 8 --tags caching,database
# Search memories (keyword and AI-powered semantic search)
local-memory search "concurrency patterns"
local-memory search "neural networks" --use_ai --limit 5
# Create relationships between concepts
local-memory relate "channels" to "goroutines" --type enables
# Delete memories
local-memory forget <memory-id>
Advanced Search
# Tag-based search
local-memory search --tags python,programming
# Date range search
local-memory search --start_date 2025-01-01 --end_date 2025-12-31
# Domain filtering
local-memory search "machine learning" --domain ai-research
# Session filtering (cross-conversation access)
local-memory search "recent work" --session_filter_mode all
MCP Integration
Claude Desktop Setup
Add to your Claude Desktop configuration file:
{
  "mcpServers": {
    "local-memory": {
      "command": "/path/to/local-memory",
      "args": [
        "--mcp"
      ],
      "transport": "stdio"
    }
  }
}
Available MCP Tools
When configured, Claude will have access to these memory tools:
Core Tools:
- observe - Record observations with knowledge level support (L0-L3)
- search - Find memories with semantic search
- bootstrap - Initialize sessions with knowledge context
Knowledge Evolution:
- reflect - Process observations into learnings
- evolve - Validate, promote, or decay knowledge
- question - Track epistemic gaps and contradictions
- resolve - Resolve contradictions and answer questions
Reasoning Tools:
- predict - Generate predictions from patterns
- explain - Trace causal paths between states
- counterfactual - Explore "what if" scenarios
- validate - Check knowledge graph integrity
Relationship Tools:
- relate - Create relationships between memories
- Plus memory management tools (update, delete, get)
REST API
When the daemon is running, the REST API is available at http://localhost:3002:
# Search memories
curl "http://localhost:3002/api/memories/search?query=python&limit=5"
# Store new memory
curl -X POST "http://localhost:3002/api/memories" \
-H "Content-Type: application/json" \
-d '{"content": "Python dict comprehensions are powerful", "importance": 7}'
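For scripting against these endpoints, the same requests can be assembled with a small helper. This is an illustrative sketch, not an official client: only the `query`, `limit`, `content`, and `importance` fields appear in the examples above, and the `tags` field here is an assumption carried over from the CLI's `--tags` flag.

```python
import json
from urllib.parse import urlencode

BASE = "http://localhost:3002/api"  # default daemon address from the docs


def search_url(query, limit=5):
    """Build the GET URL for /api/memories/search."""
    return f"{BASE}/memories/search?{urlencode({'query': query, 'limit': limit})}"


def store_payload(content, importance=None, tags=None):
    """Build the JSON body for POST /api/memories.

    `tags` is hypothetical here -- mirrored from the CLI, not confirmed
    as a REST field.
    """
    body = {"content": content}
    if importance is not None:
        body["importance"] = importance
    if tags:
        body["tags"] = tags
    return json.dumps(body)
```

These builders can then be fed to any HTTP client (curl, requests, fetch) in your automation.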
# Health check
curl "http://localhost:3002/api/v1/health"
Service Management
# Start daemon
local-memory start
# Check daemon status
local-memory status
# Stop daemon
local-memory stop
# View running processes
local-memory ps
# Kill all processes (if needed)
local-memory kill_all
# System diagnostics
local-memory doctor
What's New in v1.4.4
Multi-Provider AI Backend - Split architecture for mixing AI providers independently for embeddings and chat. Fallback chains and circuit breakers for resilience.
| Provider | Embeddings | Chat | Type |
|----------|------------|------|------|
| Ollama | Yes | Yes | Local (default) |
| OpenAI Compatible | Yes | Yes | Local/Remote |
| OpenAI | Yes | Yes | Cloud |
| Anthropic | No | Yes | Cloud |
| claude CLI | No | Yes | Local subprocess |
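The fallback-chain and circuit-breaker idea can be pictured with a minimal sketch. This is not Local Memory's actual implementation; the provider names, threshold, and call signatures are hypothetical, and it only shows the general pattern of skipping a provider whose circuit has opened after repeated failures:

```python
class CircuitBreaker:
    """Open the circuit after `threshold` consecutive failures (hypothetical threshold)."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = 0

    @property
    def open(self):
        return self.failures >= self.threshold

    def record(self, ok):
        # Any success resets the count; failures accumulate toward the threshold.
        self.failures = 0 if ok else self.failures + 1


def chat_with_fallback(providers, prompt):
    """Try each (name, call, breaker) in order, skipping open circuits."""
    for name, call, breaker in providers:
        if breaker.open:
            continue  # provider is temporarily sidelined
        try:
            result = call(prompt)
            breaker.record(True)
            return name, result
        except Exception:
            breaker.record(False)  # fall through to the next provider
    raise RuntimeError("all chat providers failed or are circuit-open")
```

In this pattern, a `chat_provider` of "anthropic" with a `chat_fallback` of "ollama" would simply be a two-element chain.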
Agent Attribution - Auto-detect agent type (claude-desktop, claude-code, api) with hostname tracking for multi-device attribution
Default Domain with MCP Prompts - Configurable session.default_domain with an intelligent domain cascade from CLAUDE.md, AGENTS.md, and GEMINI.md; new MCP prompts/list and prompts/get protocol support
Security Hardening - 6 fixes, including API key redaction, SSRF protection, command injection prevention, and secure file permissions
Bug Fixes - ResponseFormat validation for "intelligent" format; enhanced text search SQL error fix
Quick Provider Setup
# Recommended: Local embeddings + Claude reasoning
ai_provider:
embedding_provider: "ollama"
chat_provider: "anthropic"
chat_fallback: "ollama"
anthropic:
enabled: true
api_key: "sk-ant-xxxxx"
model: "claude-sonnet-4-20250514"
See the full configuration guide for all provider options.
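For a fully local setup without any cloud keys, the same split can point both roles at Ollama. This is a sketch using only the keys shown above; consult the configuration guide for the authoritative schema and defaults:

```yaml
# Fully local: embeddings and chat both served by Ollama
ai_provider:
  embedding_provider: "ollama"
  chat_provider: "ollama"
```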
System Requirements
- RAM: 512MB minimum
- Storage: 100MB for binaries, variable for database
- Platforms: macOS (Intel/ARM64), Linux x64, Windows x64
Licensing
Commercial License Required
Local Memory requires a commercial license for all use cases. Get your license at localmemory.co.
Activation
# Activate license
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX
# Check status
local-memory license status
Support
- Documentation: localmemory.co/docs
- Architecture: localmemory.co/architecture
- Prompts: localmemory.co/prompts
- Support: Local Memory Discord
- Releases Repo: Local Memory Releases
Transform your AI workflow with persistent memory. Install now and never lose context again.
npm install -g local-memory-mcp