@ekai/contexto v0.1.1
# @ekai/contexto
OpenClaw plugin that provides local-first memory — ingest conversation turns and recall relevant context automatically.
Uses `@ekai/memory` for semantic extraction, embedding, and SQLite storage.
## Install

```sh
openclaw plugins install @ekai/contexto
```

Or from source:

```sh
openclaw plugins install ./integrations/openclaw
```

## Configure
In your OpenClaw config:
```json
{
  "plugins": {
    "allow": ["@ekai/contexto"],
    "entries": {
      "@ekai/contexto": {
        "enabled": true,
        "config": {
          "dbPath": "~/.openclaw/ekai/memory.db",
          "provider": "openai",
          "apiKey": "sk-..."
        }
      }
    }
  }
}
```

| Setting | Default | Description |
|---------|---------|-------------|
| `dbPath` | `~/.openclaw/ekai/memory.db` | Path to the SQLite memory file |
| `provider` | (auto-detected) | LLM provider for extraction/embedding (`openai`, `gemini`, `openrouter`) |
| `apiKey` | (auto-detected) | API key for the selected provider |
| `bootstrapDelayMs` | `1000` | Milliseconds to wait between sessions during bootstrap backfill |
### Provider auto-detection
When `provider` and `apiKey` are not explicitly configured, the plugin auto-detects from environment variables, in this order:

- Both `provider` + `apiKey` in config — used as-is
- Only `provider` in config — API key resolved from the provider's env var (e.g. `OPENAI_API_KEY`)
- Only `apiKey` in config — ignored with a warning (ambiguous without a provider)
- `MEMORY_EMBED_PROVIDER` or `MEMORY_EXTRACT_PROVIDER` set — defers to `@ekai/memory` core
- Auto-detect from env — checks `OPENAI_API_KEY` → `GOOGLE_API_KEY` → `OPENROUTER_API_KEY` (first match wins)
- Nothing found — passes no provider and lets core raise the error
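The resolution order above can be sketched roughly as follows. This is an illustrative helper, not the plugin's actual internals; names like `resolveProvider` are made up for the example:

```typescript
// Illustrative sketch of the auto-detection order described above.
type Provider = "openai" | "gemini" | "openrouter";

const ENV_KEYS: Record<Provider, string> = {
  openai: "OPENAI_API_KEY",
  gemini: "GOOGLE_API_KEY",
  openrouter: "OPENROUTER_API_KEY",
};

function resolveProvider(
  config: { provider?: Provider; apiKey?: string },
  env: Record<string, string | undefined>
): { provider?: Provider; apiKey?: string } {
  // 1. Both in config — used as-is.
  if (config.provider && config.apiKey) {
    return { provider: config.provider, apiKey: config.apiKey };
  }
  // 2. Only provider — key resolved from that provider's env var.
  if (config.provider) {
    return { provider: config.provider, apiKey: env[ENV_KEYS[config.provider]] };
  }
  // 3. Only apiKey — ambiguous; ignored (caller logs a warning).
  // 4. MEMORY_* overrides set — defer to @ekai/memory core.
  if (env.MEMORY_EMBED_PROVIDER || env.MEMORY_EXTRACT_PROVIDER) {
    return {};
  }
  // 5. First matching env var wins, in fixed order.
  for (const provider of ["openai", "gemini", "openrouter"] as Provider[]) {
    const apiKey = env[ENV_KEYS[provider]];
    if (apiKey) return { provider, apiKey };
  }
  // 6. Nothing found — pass no provider; core raises the error.
  return {};
}
```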
## Verify

```sh
openclaw plugins list   # should show @ekai/contexto
openclaw hooks list     # should show plugin:@ekai/contexto:* hooks
```

## Bootstrap
If the plugin is installed on an existing OpenClaw instance with historical conversations, use the `/memory-bootstrap` slash command to backfill all session transcripts into memory:

```
/memory-bootstrap
```

Bootstrap scans `{stateDir}/agents/*/sessions/*.jsonl`, parses each session, and ingests the messages. Progress is tracked per session, so it can resume if interrupted; running the command again after completion returns immediately. Configure `bootstrapDelayMs` to control pacing.
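As a rough sketch, the scan-and-parse step might look like the following. This is a minimal illustration assuming Node's `fs` API and the directory layout quoted above; the plugin's real parsing and ingest pipeline live in `@ekai/memory`:

```typescript
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Walk {stateDir}/agents/*/sessions/*.jsonl (layout from the docs above).
function listSessionFiles(stateDir: string): string[] {
  const files: string[] = [];
  for (const agentId of readdirSync(join(stateDir, "agents"))) {
    const sessionsDir = join(stateDir, "agents", agentId, "sessions");
    for (const file of readdirSync(sessionsDir)) {
      if (file.endsWith(".jsonl")) files.push(join(sessionsDir, file));
    }
  }
  return files;
}

// JSONL: one JSON document per non-empty line.
function parseJsonl(text: string): unknown[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
}

// A session file is read once, then parsed line by line.
function parseSession(path: string): unknown[] {
  return parseJsonl(readFileSync(path, "utf8"));
}
```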
## How It Works

Two hooks and a slash command:

- `agent_end` — Ingests new conversation turns into memory (ongoing). Normalizes messages (user + assistant only), redacts secrets, and extracts semantic memories via `@ekai/memory`. Only processes the delta since the last ingestion.
- `before_prompt_build` — Recalls relevant memories for the current query and prepends them as context (capped at 2000 chars).
- `/memory-bootstrap` — One-time backfill of all existing session transcripts. Scans the OpenClaw state directory for historical JSONL session files and ingests them into memory. Runs in the background with a configurable delay between sessions. Idempotent — safe to re-run.
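The recall step of `before_prompt_build` amounts to joining the retrieved memories into a context block and truncating it at the cap. A minimal sketch, assuming a simple bullet-per-memory format (the exact formatting is an assumption, only the 2000-char cap is from the docs):

```typescript
// Cap from the docs above; formatting of the block is illustrative.
const CONTEXT_CAP = 2000;

function buildRecallContext(memories: string[]): string {
  const block = memories.map((m) => `- ${m}`).join("\n");
  // Truncate rather than drop whole memories once the cap is reached.
  return block.length <= CONTEXT_CAP ? block : block.slice(0, CONTEXT_CAP);
}
```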
Delta tracking is persisted to `{dbPath}.progress.json` using composite keys (`agentId:sessionId`), so only new messages are ingested, even across restarts. Ongoing ingestion and bootstrap share the same progress file.
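Conceptually, the delta tracking reduces to a map from composite key to a high-water mark of ingested messages. A sketch under that assumption (the on-disk JSON shape is hypothetical):

```typescript
// Hypothetical in-memory shape of the progress file:
// "agentId:sessionId" -> number of messages already ingested.
type Progress = Record<string, number>;

function compositeKey(agentId: string, sessionId: string): string {
  return `${agentId}:${sessionId}`;
}

// Return only the messages past the recorded high-water mark.
function deltaSince(
  progress: Progress,
  agentId: string,
  sessionId: string,
  messages: string[]
): string[] {
  const done = progress[compositeKey(agentId, sessionId)] ?? 0;
  return messages.slice(done);
}

// After a successful ingest, advance the mark to the new total.
function markIngested(
  progress: Progress,
  agentId: string,
  sessionId: string,
  total: number
): void {
  progress[compositeKey(agentId, sessionId)] = total;
}
```

Because the mark is keyed by both agent and session, two sessions with the same id under different agents never collide, and a restart simply reloads the file and resumes from the recorded counts.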
## Development

```sh
# Type-check (no build needed -- OpenClaw loads .ts via jiti)
npm run type-check --workspace=integrations/openclaw

# Run tests
npm test --workspace=integrations/openclaw

# Local dev install (symlink)
openclaw plugins install -l ./integrations/openclaw
```

## License
MIT
