@mem0/openclaw-mem0 · v0.1.0 · 791 downloads · Mem0 memory backend for OpenClaw (platform or self-hosted open-source)

@mem0/openclaw-mem0

Long-term memory for OpenClaw agents, powered by Mem0.

Your agent forgets everything between sessions. This plugin fixes that. It watches conversations, extracts what matters, and brings it back when relevant — automatically.

How it works

Auto-Recall — Before the agent responds, the plugin searches Mem0 for memories that match the current message and injects them into context.

Auto-Capture — After the agent responds, the plugin sends the exchange to Mem0. Mem0 decides what's worth keeping — new facts get stored, stale ones updated, duplicates merged.

Both run silently. No prompting, no configuration, no manual calls.

Short-term vs long-term memory

Memories are organized into two scopes:

  • Session (short-term) — Auto-capture stores memories scoped to the current session via Mem0's run_id / runId parameter. These are contextual to the ongoing conversation and automatically recalled alongside long-term memories.

  • User (long-term) — The agent can explicitly store long-term memories using the memory_store tool (with longTerm: true, the default). These persist across all sessions for the user.

During auto-recall, the plugin searches both scopes and presents them separately — long-term memories first, then session memories — so the agent has full context.

The agent tools (memory_search, memory_list) accept a scope parameter ("session", "long-term", or "all") to control which memories are queried. The memory_store tool accepts a longTerm boolean (default: true) to choose where to store.

All new parameters are optional and backward-compatible — existing configurations work without changes.
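
For illustration, here is roughly what the tool arguments might look like. Only scope and longTerm are documented parameters; the query and content field names are assumptions made for this sketch.

// memory_search limited to long-term memories (illustrative arguments)
{ "query": "what languages does the user know", "scope": "long-term" }

// memory_store keeping a fact session-only (longTerm defaults to true)
{ "content": "User is migrating the project to TypeScript", "longTerm": false }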

Setup

openclaw plugins install @mem0/openclaw-mem0

Platform (Mem0 Cloud)

Get an API key from app.mem0.ai, then add to your openclaw.json:

// plugins.entries
"openclaw-mem0": {
  "enabled": true,
  "config": {
    "apiKey": "${MEM0_API_KEY}",
    "userId": "your-user-id"
  }
}

Open-Source (Self-hosted)

No Mem0 key needed. Requires OPENAI_API_KEY for default embeddings/LLM.

"openclaw-mem0": {
  "enabled": true,
  "config": {
    "mode": "open-source",
    "userId": "your-user-id"
  }
}

Sensible defaults out of the box. To customize the embedder, vector store, or LLM:

"config": {
  "mode": "open-source",
  "userId": "your-user-id",
  "oss": {
    "embedder": { "provider": "openai", "config": { "model": "text-embedding-3-small" } },
    "vectorStore": { "provider": "qdrant", "config": { "host": "localhost", "port": 6333 } },
    "llm": { "provider": "openai", "config": { "model": "gpt-4o" } }
  }
}

All oss fields are optional. See Mem0 OSS docs for providers.

Agent tools

The agent gets five tools it can call during conversations:

| Tool | Description |
|------|-------------|
| memory_search | Search memories by natural language |
| memory_list | List all stored memories for a user |
| memory_store | Explicitly save a fact |
| memory_get | Retrieve a memory by ID |
| memory_forget | Delete by ID or by query |

CLI

# Search all memories (long-term + session)
openclaw mem0 search "what languages does the user know"

# Search only long-term memories
openclaw mem0 search "what languages does the user know" --scope long-term

# Search only session/short-term memories
openclaw mem0 search "what languages does the user know" --scope session

# Stats
openclaw mem0 stats

Options

General

| Key | Type | Default | Description |
|-----|------|---------|-------------|
| mode | "platform" \| "open-source" | "platform" | Which backend to use |
| userId | string | "default" | Scope memories per user |
| autoRecall | boolean | true | Inject memories before each turn |
| autoCapture | boolean | true | Store facts after each turn |
| topK | number | 5 | Max memories per recall |
| searchThreshold | number | 0.3 | Min similarity (0–1) |
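
As an example, a config that recalls more memories per turn, raises the similarity threshold, and turns off automatic capture (the values are illustrative, not recommendations):

"config": {
  "apiKey": "${MEM0_API_KEY}",
  "userId": "your-user-id",
  "topK": 10,
  "searchThreshold": 0.5,
  "autoCapture": false
}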

Platform mode

| Key | Type | Default | Description |
|-----|------|---------|-------------|
| apiKey | string | — | Required. Mem0 API key (supports ${MEM0_API_KEY}) |
| orgId | string | — | Organization ID |
| projectId | string | — | Project ID |
| enableGraph | boolean | false | Entity graph for relationships |
| customInstructions | string | (built-in) | Extraction rules — what to store, how to format |
| customCategories | object | (12 defaults) | Category name → description map for tagging |
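
A sketch of a platform config using a few of these options; the instruction text and category names below are made up, not the built-in defaults:

"config": {
  "apiKey": "${MEM0_API_KEY}",
  "userId": "your-user-id",
  "enableGraph": true,
  "customInstructions": "Only store durable facts about the user, not one-off requests.",
  "customCategories": {
    "preferences": "Tools, languages, and workflows the user prefers",
    "projects": "Facts about the user's ongoing projects"
  }
}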

Open-source mode

Works with zero extra config. The oss block lets you swap out any component:

| Key | Type | Default | Description |
|-----|------|---------|-------------|
| customPrompt | string | (built-in) | Extraction prompt for memory processing |
| oss.embedder.provider | string | "openai" | Embedding provider ("openai", "ollama", etc.) |
| oss.embedder.config | object | — | Provider config: apiKey, model, baseURL |
| oss.vectorStore.provider | string | "memory" | Vector store ("memory", "qdrant", "chroma", etc.) |
| oss.vectorStore.config | object | — | Provider config: host, port, collectionName, dimension |
| oss.llm.provider | string | "openai" | LLM provider ("openai", "anthropic", "ollama", etc.) |
| oss.llm.config | object | — | Provider config: apiKey, model, baseURL, temperature |
| oss.historyDbPath | string | — | SQLite path for memory edit history |

Everything inside oss is optional — defaults use OpenAI embeddings (text-embedding-3-small), in-memory vector store, and OpenAI LLM. Override only what you need.
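
For instance, a fully self-hosted setup might pair Ollama with Qdrant. The model names, port, and collection name below are assumptions; check the Mem0 OSS docs for supported values:

"config": {
  "mode": "open-source",
  "userId": "your-user-id",
  "oss": {
    "embedder": { "provider": "ollama", "config": { "model": "nomic-embed-text", "baseURL": "http://localhost:11434" } },
    "vectorStore": { "provider": "qdrant", "config": { "host": "localhost", "port": 6333, "collectionName": "openclaw-memories" } },
    "llm": { "provider": "ollama", "config": { "model": "llama3.1", "baseURL": "http://localhost:11434" } },
    "historyDbPath": "./mem0-history.db"
  }
}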

License

Apache 2.0