@getplumb/plumb-openclaw
Persistent memory for OpenClaw — automatic ingest and context injection, no setup required.
Install
```
openclaw plugins install @getplumb/plumb-openclaw
```

Then restart the gateway to activate:

```
openclaw gateway restart
```

That's it. Plumb starts learning from your conversations immediately.
What it does
- Auto-ingest — every conversation turn is stored to a local SQLite DB after the response
- Context injection — relevant memory facts are injected into the system prompt before each response
- Shadow mode — observe what would be injected without actually injecting it (good for testing)
- Local only — all data stays on your machine; nothing is sent to external servers
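Because everything lives in a plain SQLite file (by default `~/.plumb/memory.db`), you can inspect it directly. The table layout is not documented here, so the sketch below lists tables and schema rather than assuming any particular names:

```shell
# Inspect the local memory store. The path is the default dbPath;
# adjust it if you changed the setting. Table names are not
# documented, so list them rather than query a guessed name.
sqlite3 ~/.plumb/memory.db ".tables"
sqlite3 ~/.plumb/memory.db ".schema"
```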
Configuration
Configuration lives under `plugins.entries.plumb.config` in your `openclaw.json`. All fields are optional — defaults work out of the box.
| Field | Default | Description |
|---|---|---|
| dbPath | ~/.plumb/memory.db | Path to the SQLite database file |
| userId | default | User ID for scoping memory |
| shadowMode | false | If true, retrieves context but does not inject it |
| llmProvider | (inherits from OpenClaw) | LLM provider for fact extraction (openai, anthropic, ollama, openai-compatible) |
| llmModel | (inherits from OpenClaw) | Model for fact extraction |
| llmApiKey | (inherits from OpenClaw) | API key for fact extraction |
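For reference, a fully spelled-out entry might look like the following. The field names come from the table above; the values (and the surrounding `openclaw.json` shape) are illustrative, not required:

```json
{
  "plugins": {
    "entries": {
      "plumb": {
        "config": {
          "dbPath": "~/.plumb/memory.db",
          "userId": "clay",
          "shadowMode": true,
          "llmProvider": "ollama",
          "llmModel": "llama3.1"
        }
      }
    }
  }
}
```

Setting shadowMode to true as shown means Plumb still retrieves relevant facts on every turn but does not inject them, which is a low-risk way to evaluate what it would do before going live.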
To configure via CLI:
```
openclaw config set plugins.entries.plumb.config.userId "clay"
openclaw gateway restart
```

Uninstall

```
openclaw plugins uninstall plumb
openclaw gateway restart
```
License
MIT
