
@plures/superlocalmemory-mcp

v0.2.0

MCP server for superlocalmemory — persistent vector memory for AI coding assistants

@plures/superlocalmemory-mcp

MCP (Model Context Protocol) server for superlocalmemory — a local-first, persistent vector memory for AI coding assistants.

It exposes a small set of MCP tools and resources so MCP clients such as VS Code (Copilot), Cursor, Continue, and Claude Desktop can store and recall long-term memory during coding sessions.

  • Storage: local SQLite file (better-sqlite3)
  • Retrieval: semantic vector search
  • Embeddings: Transformers.js (default, zero-config) or OpenAI (optional)

✨ Zero-Config Usage

No API keys required! Just run:

npx @plures/superlocalmemory-mcp

The server uses Transformers.js to run embeddings locally in-process with the bge-small-en-v1.5 model (384 dimensions).

Note: First run will download the model (~100MB) to ~/.cache/superlocalmemory/transformers. Subsequent runs are instant and fully offline.

Install

# Run directly (recommended):
npx @plures/superlocalmemory-mcp

# Or install globally:
npm install -g @plures/superlocalmemory-mcp

Configuration

All configuration is optional:

Environment Variables

  • SUPERLOCALMEMORY_DB_PATH (optional) — SQLite DB path (default: ~/.superlocalmemory/mcp.db)
  • SUPERLOCALMEMORY_DEBUG (optional) — set to true for debug logs to stderr
  • SUPERLOCALMEMORY_CACHE_DIR (optional) — Transformers.js model cache directory (default: ~/.cache/superlocalmemory/transformers)
  • OPENAI_API_KEY (optional) — use OpenAI embeddings instead of local Transformers.js
  • OPENAI_EMBEDDING_MODEL (optional) — OpenAI model to use (default: text-embedding-3-small)

Using OpenAI (Optional)

If you prefer OpenAI embeddings over local Transformers.js:

  1. Set OPENAI_API_KEY in your environment
  2. The server will use OpenAI with 1536-dim embeddings

Migration / Breaking Changes

Important: Embedding dimensions are not interchangeable. Transformers.js uses 384‑dim embeddings, while OpenAI uses 1536‑dim embeddings by default.

If you already have an existing database:

  • Databases created with OpenAI embeddings must continue using OPENAI_API_KEY.
    • Do not switch that database to Transformers.js; the stored vectors will be incompatible.
    • The server will detect dimension mismatches and provide a clear error message.
  • There is no automatic migration between 384‑dim and 1536‑dim embeddings.
  • To change providers (OpenAI ⇄ Transformers.js), you must either:
    • Point SUPERLOCALMEMORY_DB_PATH to a new database, or
    • Delete/recreate the existing DB and re-index all memories.
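
The incompatibility is simply vector length: a 384-dim query vector cannot be scored against 1536-dim stored vectors. The kind of guard involved can be sketched like this (`checkDimensions` and `PROVIDER_DIMS` are illustrative names, not the server's actual code):

```javascript
// Illustrative dimension guard; not the package's internal implementation.
const PROVIDER_DIMS = { transformersjs: 384, openai: 1536 };

function checkDimensions(storedDim, provider) {
  const expected = PROVIDER_DIMS[provider];
  if (storedDim !== expected) {
    throw new Error(
      `Embedding dimension mismatch: database holds ${storedDim}-dim vectors, ` +
        `but provider "${provider}" produces ${expected}-dim vectors. ` +
        `Point SUPERLOCALMEMORY_DB_PATH at a new database or re-index.`
    );
  }
}
```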

Editor setup

Zero-Config Examples

These examples require no environment variables and work out of the box:

VS Code (Copilot) — .vscode/mcp.json

{
  "servers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}

Cursor — .cursor/mcp.json

{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}

Continue.dev — config.json

{
  "mcpServers": [
    {
      "name": "superlocalmemory",
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  ]
}

Claude Desktop — claude_desktop_config.json

{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"]
    }
  }
}

With OpenAI (Optional)

If you prefer OpenAI embeddings:

{
  "mcpServers": {
    "superlocalmemory": {
      "command": "npx",
      "args": ["@plures/superlocalmemory-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-key"
      }
    }
  }
}

Tools

memory_store

Store a memory.

Input

  • content (string, required)
  • tags (string[], optional)
  • category (string, optional)
  • source (string, optional)
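
Over the wire, an MCP client invokes this as a standard JSON-RPC tools/call request; a sketch with illustrative argument values:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_store",
    "arguments": {
      "content": "Project uses pnpm workspaces; run tests with pnpm test",
      "tags": ["build", "tooling"],
      "category": "project-context"
    }
  }
}
```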

memory_search

Semantic search.

Input

  • query (string, required)
  • limit (number, optional, default 5)
  • minScore (number, optional, default 0.3)

memory_forget

Delete a memory by UUID or by semantic query.

Input

  • id (string, optional)
  • query (string, optional)
  • threshold (number, optional, default 0.8)

memory_profile

Return the stored user profile summary (if any).

memory_index

Index a directory by storing file contents as memories.

Input

  • directory (string, required)
  • maxFiles (number, optional, default 500)
  • maxBytesPerFile (number, optional, default 200000)
  • category (string, optional, default project-context)
  • tags (string[], optional)

memory_stats

Return database stats.

Resources

  • memory://profile — JSON user profile (if available)
  • memory://recent — markdown list of the 20 most recent memory contents
  • memory://stats — JSON stats
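
Clients fetch these with the standard MCP resources/read request, for example (sketch):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": { "uri": "memory://recent" }
}
```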

How it works

This server provides local-first persistent memory with:

  • Local SQLite database stores memory rows with embeddings
  • Embeddings:
    • Default: Transformers.js (bge-small-en-v1.5, 384-dim) — runs in-process, zero-config
    • Optional: OpenAI API (text-embedding-3-small, 1536-dim) — requires API key
  • Vector search: In-process cosine similarity against stored embeddings
  • Indexing: memory_index walks a directory, reads text files, and stores them for later retrieval
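
The cosine-similarity ranking, with the limit and minScore parameters memory_search exposes, can be sketched as follows (illustrative code, not the package's internals):

```javascript
// Cosine similarity between two equal-length vectors: dot(a, b) / (|a| * |b|).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Score every stored row against the query embedding, drop rows below
// minScore, and return the top `limit` matches by descending score.
function search(queryVec, rows, { limit = 5, minScore = 0.3 } = {}) {
  return rows
    .map((r) => ({ ...r, score: cosineSimilarity(queryVec, r.embedding) }))
    .filter((r) => r.score >= minScore)
    .sort((x, y) => y.score - x.score)
    .slice(0, limit);
}
```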

Privacy

All memory data is stored locally on your machine at SUPERLOCALMEMORY_DB_PATH (default: ~/.superlocalmemory/mcp.db).

Network Usage

  • Transformers.js (default): One-time model download (~100MB) on first run, then 100% offline
  • OpenAI (optional): API calls for each embedding when OPENAI_API_KEY is set

No memory content is ever sent to external services except for embedding generation when using OpenAI.

Development

npm install
npm run dev

Build:

npm run build

License

AGPL-3.0