# Omni Memory MCP

Universal memory MCP server for multi-agent workflows. 100% local with SQLite + FTS5.
## npm Package

- Package: `@sharkdyt/omni-memory-mcp`
- npm: https://www.npmjs.com/package/@sharkdyt/omni-memory-mcp
- Current latest: `1.0.7`
## Release Notes

### v1.0.7

- Ships `memory_context_pack` for read-only, token-budgeted prompt assembly.
- Returns compact excerpts plus structured metadata without mutating `access_count`.
- Keeps the current local SQLite + FTS5 architecture: no schema migration, vector retrieval, or ranking rewrite.
## Project Memory

This project does not keep local `memory-bank/` files. Operational context is stored through Omni Memory itself.
## Why this exists
- Keep AI memory local (no cloud dependency)
- Reuse the same memory across tools/agents
- Search fast with SQLite FTS5
## Features
- Local-first storage (SQLite)
- Full-text search with FTS5
- MCP-native tools
- Progressive Disclosure: Searches return metadata and summaries instead of full text to prevent LLM context overflow.
- Context Packs: `memory_context_pack` assembles compact, token-budgeted excerpts for prompt construction.
- Active Forgetting Tracking: read actions (`memory_get`) increment `access_count` and update `accessed_at`.
- CRUD operations (`memory_add`, `memory_upsert`, `memory_get`, `memory_update`, `memory_delete`, `memory_list`, `memory_search`)
- Context optimization tools (`memory_prune`)
- Diagnostic tools (`memory_stats`)
- Organization by `area`, `project`, and `tags`
- Shared long-term memory across multiple projects and multiple coding agents/clients
- Canonical MCP config + client adapters (OpenCode, Codex, Cursor)
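The Progressive Disclosure and Active Forgetting behaviors can be sketched with a toy in-memory model (illustrative only; the real server persists memories in SQLite with FTS5, and these helper names are hypothetical):

```javascript
// Toy model of Progressive Disclosure + access tracking (hypothetical helpers;
// the real server stores memories in SQLite and searches via FTS5).
const store = new Map();

function memoryAdd(id, memory) {
  store.set(id, { ...memory, access_count: 0, accessed_at: null });
}

// Search returns metadata only -- never the full content.
function memorySearch(query) {
  return [...store.entries()]
    .filter(([, m]) => m.content.includes(query))
    .map(([id, m]) => ({ id, name: m.name, tags: m.tags })); // no `content` field
}

// Reading the full content counts as an access (Active Forgetting Tracking).
function memoryGet(id) {
  const m = store.get(id);
  m.access_count += 1;
  m.accessed_at = new Date().toISOString();
  return m;
}

memoryAdd("abc123", { name: "TS prefs", content: "strict mode", tags: ["typescript"] });
const hits = memorySearch("strict");
console.log("content" in hits[0]); // false: search exposes metadata only
memoryGet("abc123");
console.log(store.get("abc123").access_count); // 1
```

This two-step shape (search for metadata, then fetch one ID) is what keeps large memory stores from flooding the LLM context window.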
## Cross-Client MCP Standard

This project keeps one canonical MCP config and generates client-specific files.

Canonical source: `config/mcp/servers.json`

Generate adapters:

```bash
npm run mcp:generate
```

Validate adapters:

```bash
npm run mcp:validate
```

Generated files:

- `config/mcp/generated/opencode.windows.json`
- `config/mcp/generated/opencode.posix.json`
- `config/mcp/generated/opencode.windows.array.npx.json`
- `config/mcp/generated/opencode.windows.string-args.npx.json`
- `config/mcp/generated/opencode.windows.array.fallback-dist.json`
- `config/mcp/generated/opencode.posix.array.npx.json`
- `config/mcp/generated/opencode.posix.string-args.npx.json`
- `config/mcp/generated/opencode.posix.array.fallback-dist.json`
- `config/mcp/generated/codex.windows.json`
- `config/mcp/generated/codex.posix.json`
- `config/mcp/generated/cursor.windows.json`
- `config/mcp/generated/cursor.posix.json`

Compatibility matrix: `docs/mcp-compatibility-matrix.md`
## Quick Start (Most Newbie Friendly)

If your MCP client supports `command` + `args` + `env`, use:

```json
{
  "mcpServers": {
    "omni-memory": {
      "command": "npx",
      "args": ["-y", "@sharkdyt/omni-memory-mcp"],
      "env": {
        "OMNI_MEMORY_DIR": "~/.omni-memory"
      }
    }
  }
}
```

This is the easiest setup because:

- No manual clone path
- No manual build path
- Works after npm registry publish
## OpenCode compatibility profiles

OpenCode support can vary by client version and environment. This repository generates three OpenCode profiles for each platform:

- `array.npx`: `command` as an array (`["npx", "-y", "@sharkdyt/omni-memory-mcp"]`)
- `string-args.npx`: `command` as a string plus an `args` array
- `array.fallback-dist`: shell command that tries `npx` and falls back to local `dist/index.js`

Recommended default for OpenCode: `config/mcp/generated/opencode.<platform>.json` (this points to `array.fallback-dist`).

If you prefer explicit profile selection, copy one of:

- `config/mcp/generated/opencode.<platform>.array.npx.json`
- `config/mcp/generated/opencode.<platform>.string-args.npx.json`
- `config/mcp/generated/opencode.<platform>.array.fallback-dist.json`
## Path Guide (Relative vs Absolute)

If you prefer running from local source (`dist/index.js`), use an absolute path.

- Relative path example (can break): `./dist/index.js`
- Absolute path example (recommended): the full path to the file on disk

Why relative paths break: many MCP clients resolve paths from their own process working directory, not from your config file's directory.
### Absolute path examples

Linux:

```json
{
  "command": "node",
  "args": ["/home/your-user/.local/mcp/omni-memory-mcp/dist/index.js"]
}
```

macOS:

```json
{
  "command": "node",
  "args": ["/Users/your-user/.local/mcp/omni-memory-mcp/dist/index.js"]
}
```

Windows:

```json
{
  "command": "node",
  "args": ["C:\\Users\\your-user\\.local\\mcp\\omni-memory-mcp\\dist\\index.js"]
}
```

Note: `~` is convenient, but not every client expands it consistently on Windows. Absolute paths are safer.
## Install from Source (Local Dev)

```bash
git clone https://github.com/allanschramm/omni-memory-mcp.git
cd omni-memory-mcp
npm install
npm run build
```

Then configure your MCP client with `node` + the absolute path to `dist/index.js`.
## OpenCode / Codex / Cursor

Instead of writing each config by hand, generate client-specific adapters:

```bash
npm run mcp:generate
```

Then copy the generated file for your client/platform from `config/mcp/generated/`.

### OpenCode troubleshooting priority

- Use `opencode.<platform>.json` first (default fallback profile).
- If your OpenCode build prefers native `npx` only, try `opencode.<platform>.array.npx.json`.
- If your OpenCode build requires a `command` string + `args`, use `opencode.<platform>.string-args.npx.json`.
## Tools

### memory_add

```json
{
  "name": "User typescript preferences",
  "content": "User prefers TypeScript with strict mode",
  "area": "preferences",
  "project": "my-project",
  "tags": ["typescript", "coding-style"],
  "metadata": {
    "source": "conversation setup"
  }
}
```

Use `memory_add` for clearly new, one-off memories.
### memory_upsert

```json
{
  "name": "User typescript preferences",
  "content": "User prefers TypeScript with strict mode",
  "project": "my-project",
  "tags": ["typescript", "coding-style"],
  "allow_create": true
}
```

Note: `memory_upsert` is intentionally conservative. It uses normalized name + project matching, updates only when there is one clear candidate, and refuses to write when the match is ambiguous.

Prefer `memory_upsert` over `memory_add` for durable facts, preferences, and evolving project memory.
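A rough sketch of that conservative decision logic (hypothetical helper; the server's actual normalization rules may differ):

```javascript
// Sketch of conservative upsert matching: normalized name + project,
// write only when exactly one candidate matches. Illustrative only.
const normalize = (s) => s.trim().toLowerCase().replace(/\s+/g, " ");

function upsertDecision(memories, { name, project, allow_create }) {
  const candidates = memories.filter(
    (m) => normalize(m.name) === normalize(name) && m.project === project
  );
  if (candidates.length === 1) return { action: "update", id: candidates[0].id };
  if (candidates.length === 0 && allow_create) return { action: "create" };
  if (candidates.length === 0) return { action: "skip", reason: "no match and allow_create is false" };
  return { action: "refuse", reason: "ambiguous: multiple candidates" };
}

const memories = [
  { id: "a1", name: "User typescript preferences", project: "my-project" },
  { id: "a2", name: "user  TYPESCRIPT preferences", project: "my-project" },
];
// Two memories normalize to the same name -> refuse rather than guess:
const decision = upsertDecision(memories, {
  name: "User TypeScript Preferences",
  project: "my-project",
  allow_create: true,
});
console.log(decision.action); // "refuse"
```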
### memory_get

Note: Fetching a memory via `memory_get` registers an access (increments `access_count` and updates `accessed_at`), indicating the memory is actively used.

```json
{
  "id": "abc123"
}
```

### memory_update
```json
{
  "id": "abc123",
  "name": "Updated typescript preferences",
  "content": "Updated content",
  "project": null,
  "metadata": null,
  "tags": ["new-tag"]
}
```

Use `null` for `project` or `metadata` to clear those values.
### memory_delete

```json
{
  "id": "abc123"
}
```

### memory_list
Note: Enforces Progressive Disclosure. It returns only IDs, names, and metadata. You must call `memory_get` with the specific ID to read the full content.

```json
{
  "area": "snippets",
  "project": "my-project",
  "tag": "important",
  "limit": 20
}
```

### memory_search
Note: Enforces Progressive Disclosure. It returns only IDs, names, and metadata. You must call `memory_get` with the specific ID to read the full content.

```json
{
  "query": "typescript configuration",
  "project": "my-project",
  "limit": 10,
  "enableAdvancedSyntax": false,
  "search_mode": "balanced"
}
```

Note: `enableAdvancedSyntax` allows FTS5 boolean logic (e.g. `"typescript" AND "react" NOT "vue"`) but requires a strictly valid FTS5 query or it will throw an error.

Note: `search_mode` tunes ranking only. `balanced` is the default, `exact` boosts exact title matches harder, and `broad` is more permissive for content-heavy results.

Search results include a compact `Match:` explanation showing which indexed fields contributed to the result.
### memory_context_pack

Note: Builds a compact prompt-ready bundle from matching memories without incrementing `access_count`. It keeps Progressive Disclosure intact by returning short excerpts instead of full documents.

```json
{
  "query": "typescript configuration",
  "project": "my-project",
  "tag": "important",
  "max_tokens": 1200,
  "max_memories": 5,
  "search_mode": "balanced"
}
```

Returns:

- compact text for immediate agent use
- structured metadata with `count`, `estimated_tokens`, `truncated`, and per-memory excerpts
- excerpts ordered by the existing `memory_search` ranking
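A rough sketch of the token-budgeted packing contract (the ~4 characters/token estimate and the excerpt length are assumptions for illustration, not the server's actual heuristics):

```javascript
// Sketch: assemble short excerpts from ranked memories until a token budget is hit.
// Assumes ~4 characters per token, a common rough heuristic.
const estimateTokens = (text) => Math.ceil(text.length / 4);

function buildContextPack(rankedMemories, { max_tokens, max_memories }) {
  const excerpts = [];
  let used = 0;
  let truncated = false;
  for (const m of rankedMemories.slice(0, max_memories)) {
    const excerpt = m.content.slice(0, 400); // short excerpt, not the full document
    const cost = estimateTokens(excerpt);
    if (used + cost > max_tokens) { truncated = true; break; }
    excerpts.push({ id: m.id, name: m.name, excerpt });
    used += cost;
  }
  return { count: excerpts.length, estimated_tokens: used, truncated, excerpts };
}

const pack = buildContextPack(
  [
    { id: "a", name: "A", content: "x".repeat(800) },
    { id: "b", name: "B", content: "y".repeat(800) },
  ],
  { max_tokens: 150, max_memories: 5 }
);
console.log(pack.count, pack.estimated_tokens, pack.truncated); // 1 100 true
```

The `truncated` flag tells the agent there was more relevant material than the budget allowed, so it can follow up with `memory_get` on specific IDs.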
### memory_stats

```json
{}
```

Returns total memories, size on disk, breakdown by project and area, plus local upsert metrics such as `memory_upsert_created` and `memory_upsert_updated`.
### memory_prune

Note: Cleans up memories whose decay score, calculated dynamically from `created_at`, `accessed_at`, and `access_count`, has fallen below a given threshold.

```json
{
  "threshold_score": 0,
  "dry_run": true
}
```

Always use `dry_run: true` first to see how many and which memories would be pruned before running the destructive cleanup.
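For intuition, a decay score might look like the sketch below. The weights and formula here are invented for illustration; the server's actual scoring is internal and may differ entirely.

```javascript
// Illustrative decay score: frequent access raises the score, while age and
// idleness lower it. The weights are made up for this sketch.
function decayScore(memory, now = Date.now()) {
  const DAY = 24 * 60 * 60 * 1000;
  const ageDays = (now - Date.parse(memory.created_at)) / DAY;
  const idleDays = (now - Date.parse(memory.accessed_at ?? memory.created_at)) / DAY;
  return memory.access_count * 2 - ageDays * 0.1 - idleDays * 0.5;
}

// A dry run would list every memory with decayScore(m) < threshold_score
// without deleting anything.
const stale = { created_at: "2020-01-01T00:00:00Z", accessed_at: "2020-01-02T00:00:00Z", access_count: 0 };
console.log(decayScore(stale) < 0); // true: candidate for pruning at threshold_score 0
```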
## Memory Areas
| Area | Description |
| --- | --- |
| general | General notes |
| snippets | Code snippets and patterns |
| solutions | Problem-solution pairs |
| preferences | User/team preferences |
## Data Storage

Default directory:

```
~/.omni-memory/
|- omni-memory.db
|- omni-memory.db-wal
`- omni-memory.db-shm
```

Environment variables:
| Variable | Default | Description |
| --- | --- | --- |
| OMNI_MEMORY_DIR | ~/.omni-memory | Data storage directory |
| OMNI_MEMORY_DB | {OMNI_MEMORY_DIR}/omni-memory.db | SQLite DB file path |
## Development

```bash
npm install
npm run check
npm run build
npm test
```

Extra commands:

```bash
npm run dev    # watch mode
npm run start  # run server from dist/
```

## Documentation and Memory Hygiene
For every meaningful project change, keep both sources of truth updated:
- Repository docs (`README.md`, `docs/*`, compatibility/config docs) must reflect current behavior.
- Omni Memory must receive a concise project memory entry with:
  - what changed,
  - why it changed,
  - constraints/assumptions,
  - next steps (if any).

Minimum release/update gate:

- `npm run mcp:generate`
- `npm run mcp:validate`
- `npm run check`
- Update docs for any behavior/config change
- Add/update important project memory in Omni Memory

Post-change runtime verification for this repository:

- `npm run build`
- Sync repo `dist/` into `C:\Users\allan\.local\mcp\omni-memory-mcp\dist`
- Smoke test the deployed `omni-memory-mcp`

Operational checklist: `docs/release-checklist.md`
## License
Apache 2.0. See LICENSE.
