# @plures/superlocalmemory-mcp (v0.2.0)
MCP (Model Context Protocol) server for superlocalmemory — a local-first, persistent vector memory for AI coding assistants.
It exposes a small set of MCP tools and resources so editors like VS Code (Copilot MCP), Cursor, Continue, and Claude Desktop can store and recall long-term memory during coding sessions.
- Storage: local SQLite file (better-sqlite3)
- Retrieval: semantic vector search
- Embeddings: Transformers.js (default, zero-config) or OpenAI (optional)
## ✨ Zero-Config Usage

No API keys required! Just run:

```shell
npx @plures/superlocalmemory-mcp
```

The server uses Transformers.js to run embeddings locally in-process with the bge-small-en-v1.5 model (384 dimensions).
> **Note:** First run will download the model (~100MB) to `~/.cache/superlocalmemory/transformers`. Subsequent runs are instant and fully offline.
## Install

```shell
# Run directly (recommended):
npx @plures/superlocalmemory-mcp

# Or install globally:
npm install -g @plures/superlocalmemory-mcp
```

## Configuration
All configuration is optional:
### Environment Variables

- `SUPERLOCALMEMORY_DB_PATH` (optional) — SQLite DB path (default: `~/.superlocalmemory/mcp.db`)
- `SUPERLOCALMEMORY_DEBUG` (optional) — set to `true` for debug logs to stderr
- `SUPERLOCALMEMORY_CACHE_DIR` (optional) — Transformers.js model cache directory (default: `~/.cache/superlocalmemory/transformers`)
- `OPENAI_API_KEY` (optional) — use OpenAI embeddings instead of local Transformers.js
- `OPENAI_EMBEDDING_MODEL` (optional) — OpenAI model to use (default: `text-embedding-3-small`)
### Using OpenAI (Optional)

If you prefer OpenAI embeddings over local Transformers.js:

- Set `OPENAI_API_KEY` in your environment
- The server will use OpenAI with 1536-dim embeddings
## Migration / Breaking Changes
Important: Embedding dimensions are not interchangeable. Transformers.js uses 384‑dim embeddings, while OpenAI uses 1536‑dim embeddings by default.
If you already have an existing database:
- Databases created with OpenAI embeddings must continue using `OPENAI_API_KEY`. Do not switch that database to Transformers.js; the stored vectors will be incompatible.
- The server will detect dimension mismatches and provide a clear error message.
- There is no automatic migration between 384‑dim and 1536‑dim embeddings.
- To change providers (OpenAI ⇄ Transformers.js), you must either:
  - Point `SUPERLOCALMEMORY_DB_PATH` to a new database, or
  - Delete/recreate the existing DB and re-index all memories.
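The dimension check mentioned above can be pictured as a guard run against the stored vectors at startup. A minimal sketch, assuming a simple provider-to-dimension table (function and constant names here are hypothetical, not the package's actual API):

```typescript
// Embedding dimensions per provider: 384 for local Transformers.js
// (bge-small-en-v1.5), 1536 for OpenAI's text-embedding-3-small.
const PROVIDER_DIMS: Record<string, number> = {
  "transformers.js": 384,
  openai: 1536,
};

// Throws a clear error when the database was built with a different
// embedding dimension than the active provider produces.
function assertDimensionsMatch(storedDim: number, provider: string): void {
  const expected = PROVIDER_DIMS[provider];
  if (expected === undefined) {
    throw new Error(`Unknown embedding provider: ${provider}`);
  }
  if (storedDim !== expected) {
    throw new Error(
      `Embedding dimension mismatch: database has ${storedDim}-dim vectors ` +
        `but provider "${provider}" produces ${expected}-dim vectors. ` +
        `Point SUPERLOCALMEMORY_DB_PATH at a new database or re-index.`,
    );
  }
}
```

The guard runs before any search, so a misconfigured provider fails fast with an actionable message instead of returning meaningless similarity scores.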
## Editor setup

### Zero-Config Examples

These examples require no environment variables and work out of the box:

#### VS Code (Copilot) — `mcp.json`

```json
{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}
```

#### Cursor — `settings.json`
```json
{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}
```

#### Continue.dev — `config.json`
```json
{
  "mcpServers": [
    {
      "name": "superlocalmemory",
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  ]
}
```

#### Claude Desktop — `claude_desktop_config.json`
```json
{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}
```

### With OpenAI (Optional)
If you prefer OpenAI embeddings:

```json
{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-key"
      }
    }
  }
}
```

## Tools
### memory_store

Store a memory.

**Input**

- `content` (string, required)
- `tags` (string[], optional)
- `category` (string, optional)
- `source` (string, optional)
### memory_search

Semantic search.

**Input**

- `query` (string, required)
- `limit` (number, optional, default 5)
- `minScore` (number, optional, default 0.3)
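As a concrete example, an MCP client invokes this tool through the protocol's standard `tools/call` request; the argument values below are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_search",
    "arguments": {
      "query": "how do we configure the database path?",
      "limit": 5,
      "minScore": 0.3
    }
  }
}
```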
### memory_forget

Delete by UUID id or by semantic query.

**Input**

- `id` (string, optional)
- `query` (string, optional)
- `threshold` (number, optional, default 0.8)
### memory_profile

Return the stored user profile summary (if any).
### memory_index

Index a directory by storing file contents as memories.

**Input**

- `directory` (string, required)
- `maxFiles` (number, optional, default 500)
- `maxBytesPerFile` (number, optional, default 200000)
- `category` (string, optional, default `project-context`)
- `tags` (string[], optional)
### memory_stats

Return database stats.
## Resources

- `memory://profile` — JSON user profile (if available)
- `memory://recent` — markdown list of the 20 most recent memory contents
- `memory://stats` — JSON stats
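Clients fetch these through the MCP `resources/read` request, e.g. (URI taken from the list above; the request shape follows the MCP specification):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": { "uri": "memory://recent" }
}
```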
## How it works

This server provides local-first persistent memory with:

- Local SQLite database stores memory rows with embeddings
- Embeddings:
  - Default: Transformers.js (`bge-small-en-v1.5`, 384-dim) — runs in-process, zero-config
  - Optional: OpenAI API (`text-embedding-3-small`, 1536-dim) — requires API key
- Vector search: in-process cosine similarity against stored embeddings
- Indexing: `memory_index` walks a directory, reads text files, and stores them for later retrieval
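Retrieval ranks every stored row by cosine similarity against the query embedding, then applies the `limit` and `minScore` parameters. A minimal sketch of that scoring step, assuming embeddings are plain number arrays (function names here are illustrative, not the package's API):

```typescript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error(`Dimension mismatch: ${a.length} vs ${b.length}`);
  }
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA === 0 || normB === 0) return 0;
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored memories against a query embedding, keeping only
// scores >= minScore and returning at most `limit` results.
function rankMemories(
  query: number[],
  rows: { id: string; embedding: number[] }[],
  limit = 5,
  minScore = 0.3,
): { id: string; score: number }[] {
  return rows
    .map((r) => ({ id: r.id, score: cosineSimilarity(query, r.embedding) }))
    .filter((r) => r.score >= minScore)
    .sort((x, y) => y.score - x.score)
    .slice(0, limit);
}
```

Because the whole table is scanned in-process, no external vector database or network round trip is involved.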
## Privacy

All memory data is stored locally on your machine at `SUPERLOCALMEMORY_DB_PATH` (default: `~/.superlocalmemory/mcp.db`).
### Network Usage

- Transformers.js (default): one-time model download (~100MB) on first run, then 100% offline
- OpenAI (optional): API calls for each embedding when `OPENAI_API_KEY` is set

No memory content is ever sent to external services, except for embedding generation when using OpenAI.
## Development

```shell
npm install
npm run dev
```

Build:

```shell
npm run build
```

## License

AGPL-3.0
