
local-memory-mcp

v1.4.4

Published

Local Memory MCP Server - AI-powered persistent memory system for Claude Code, Claude Desktop, Gemini, Codex, OpenCode and other MCP-compatible tools

Downloads

726


Local Memory (localmemory.co)

Version 1.4.4 - AI-powered persistent memory system for developers working with Claude Code, Claude Desktop, Gemini, Codex, Cursor, and other coding agents.

Transform your AI interactions with intelligent memory that grows smarter over time. Build persistent knowledge bases, track learning progression, and maintain context across conversations. Local Memory provides MCP, REST API, and command-line interfaces that let you and your coding agents store, retrieve, discover, and analyze memories, giving your agent the right context for its work and tasks.

Key Features

  • One-Command Install: npm install -g local-memory-mcp
  • Smart Memory: AI-powered categorization and relationship discovery
  • Lightning Fast: 10-57ms search responses with semantic similarity
  • MCP Integration: Works seamlessly with Claude Desktop and other AI tools
  • REST API: Easily integrate into your applications and workflows
  • Command-Line Interface: Automate memory operations from shell scripts and commands
  • Persistent Context: Never lose conversation context again
  • Enterprise Ready: Commercial licensing with security hardening
  • Multi-Provider AI: Mix and match AI backends - Ollama, OpenAI, Anthropic, claude CLI, or any OpenAI-compatible server

Quick Start

Installation

npm install -g local-memory-mcp

License Activation

# Get your license key from https://localmemory.co
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX

# Verify activation
local-memory license status

Start the Service

# Start the daemon
local-memory start

# Verify it's running
local-memory status

CLI Usage

Basic Memory Operations

# Store memories with automatic AI categorization
local-memory remember "Go channels are like pipes between goroutines"
local-memory remember "Redis is excellent for caching" --importance 8 --tags caching,database

# Search memories (keyword and AI-powered semantic search)
local-memory search "concurrency patterns"
local-memory search "neural networks" --use_ai --limit 5

# Create relationships between concepts
local-memory relate "channels" to "goroutines" --type enables

# Delete memories
local-memory forget <memory-id>

Advanced Search

# Tag-based search
local-memory search --tags python,programming

# Date range search
local-memory search --start_date 2025-01-01 --end_date 2025-12-31

# Domain filtering
local-memory search "machine learning" --domain ai-research

# Session filtering (cross-conversation access)
local-memory search "recent work" --session_filter_mode all

MCP Integration

Claude Desktop Setup

Add to your Claude Desktop configuration file:

{
  "mcpServers": {
    "local-memory": {
      "command": "/path/to/local-memory",
      "args": [
        "--mcp"
      ],
      "transport": "stdio"
    }
  }
}
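If you manage the config file programmatically, here is a minimal Python sketch of merging the entry above into an existing Claude Desktop config. The helper name is illustrative, and the binary path is the same placeholder as in the snippet above:

```python
import json

# Server entry from the snippet above; replace the command path with
# the actual location of the local-memory binary on your machine.
LOCAL_MEMORY_ENTRY = {
    "command": "/path/to/local-memory",
    "args": ["--mcp"],
    "transport": "stdio",
}

def add_local_memory(config: dict) -> dict:
    """Return a copy of a Claude Desktop config with local-memory registered,
    preserving any other MCP servers already present."""
    merged = dict(config)
    servers = dict(merged.get("mcpServers", {}))
    servers["local-memory"] = LOCAL_MEMORY_ENTRY
    merged["mcpServers"] = servers
    return merged

# Example: produce the config shown above from an empty starting config
print(json.dumps(add_local_memory({}), indent=2))
```

Writing the result back to the config file and restarting Claude Desktop applies the change.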

Available MCP Tools

When configured, Claude will have access to these memory tools:

Core Tools:

  • observe - Record observations with knowledge level support (L0-L3)
  • search - Find memories with semantic search
  • bootstrap - Initialize sessions with knowledge context

Knowledge Evolution:

  • reflect - Process observations into learnings
  • evolve - Validate, promote, or decay knowledge
  • question - Track epistemic gaps and contradictions
  • resolve - Resolve contradictions and answer questions

Reasoning Tools:

  • predict - Generate predictions from patterns
  • explain - Trace causal paths between states
  • counterfactual - Explore "what if" scenarios
  • validate - Check knowledge graph integrity

Relationship Tools:

  • relate - Create relationships between memories
  • Plus memory management tools (update, delete, get)
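Under the hood, MCP tools are invoked over JSON-RPC 2.0 on the stdio transport configured above. As a sketch, a `tools/call` request for the `search` tool might look like the following; the argument names `query` and `limit` are assumptions based on the CLI flags shown earlier, not confirmed parameter names:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "concurrency patterns", "limit": 5 }
  }
}
```

Claude Desktop issues these requests for you once the server is configured, so this is only relevant when debugging or building your own MCP client.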

REST API

When the daemon is running, the REST API is available at http://localhost:3002:

# Search memories
curl "http://localhost:3002/api/memories/search?query=python&limit=5"

# Store new memory
curl -X POST "http://localhost:3002/api/memories" \
  -H "Content-Type: application/json" \
  -d '{"content": "Python dict comprehensions are powerful", "importance": 7}'

# Health check
curl "http://localhost:3002/api/v1/health"
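For use from application code, here is a minimal Python sketch wrapping the search endpoint. The URL shape mirrors the curl example above; since the response schema is not documented here, the actual network call is left commented:

```python
from urllib.parse import urlencode

BASE_URL = "http://localhost:3002"  # default daemon address from the examples above

def search_url(query: str, limit: int = 5) -> str:
    """Build the search endpoint URL shown in the curl examples."""
    params = urlencode({"query": query, "limit": limit})
    return f"{BASE_URL}/api/memories/search?{params}"

# To actually query the daemon (requires `local-memory start` first):
# import json, urllib.request
# with urllib.request.urlopen(search_url("python")) as resp:
#     memories = json.load(resp)
```

`urlencode` handles quoting, so queries with spaces (e.g. "concurrency patterns") are safe to pass through.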

Service Management

# Start daemon
local-memory start

# Check daemon status
local-memory status

# Stop daemon
local-memory stop

# View running processes
local-memory ps

# Kill all processes (if needed)
local-memory kill_all

# System diagnostics
local-memory doctor

What's New in v1.4.4

  • Multi-Provider AI Backend - Split architecture that lets you mix AI providers independently for embeddings and chat, with fallback chains and circuit breakers for resilience.

    | Provider          | Embeddings | Chat | Type             |
    |-------------------|------------|------|------------------|
    | Ollama            | Yes        | Yes  | Local (default)  |
    | OpenAI Compatible | Yes        | Yes  | Local/Remote     |
    | OpenAI            | Yes        | Yes  | Cloud            |
    | Anthropic         | No         | Yes  | Cloud            |
    | claude CLI        | No         | Yes  | Local subprocess |

  • Agent Attribution - Auto-detect agent type (claude-desktop, claude-code, api) with hostname tracking for multi-device attribution

  • Default Domain with MCP Prompts - Configurable session.default_domain with intelligent domain cascade from CLAUDE.md, AGENTS.md, and GEMINI.md; new MCP prompts/list and prompts/get protocol support

  • Security Hardening - 6 fixes including API key redaction, SSRF protection, command injection prevention, and secure file permissions

  • Bug Fixes - ResponseFormat validation for the "intelligent" format; fix for an SQL error in enhanced text search

Quick Provider Setup

# Recommended: Local embeddings + Claude reasoning
ai_provider:
  embedding_provider: "ollama"
  chat_provider: "anthropic"
  chat_fallback: "ollama"

  anthropic:
    enabled: true
    api_key: "sk-ant-xxxxx"
    model: "claude-sonnet-4-20250514"

See the full configuration guide for all provider options.

System Requirements

  • RAM: 512MB minimum
  • Storage: 100MB for binaries, variable for database
  • Platforms: macOS (Intel/ARM64), Linux x64, Windows x64

Licensing

Commercial License Required

Local Memory requires a commercial license for all use cases. Get your license at localmemory.co.

Activation

# Activate license
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX

# Check status
local-memory license status

Transform your AI workflow with persistent memory. Install now and never lose context again.

npm install -g local-memory-mcp