@syuesw/openclaw-memory · v1.0.0-alpha.1

OpenClaw Memory Extension - Local self-hosted memory layer for personalized AI interactions
# OpenClaw Memory Extension

🧠 AI Memory Layer - Give your AI assistant long-term memory with personalized recall.

*This project was generated with assistance from OpenClaw.*

## Features
- ✅ Remember user preferences across sessions
- ✅ Recall context from previous conversations
- ✅ Provide personalized responses
- ✅ Reduce token usage by storing context externally
- ✅ Flexible Memory Isolation - Control how memories are grouped (by user, chat, channel, etc.)
- ✅ Support multiple embedding providers (Ollama, Aliyun, OpenAI, etc.)
- ✅ Modular architecture - replace any component
## Quick Start

### One-Command Install (Recommended) ⭐

```bash
curl -fsSL https://raw.githubusercontent.com/syuesw/openclaw-memory/main/install.sh | bash
```

With options:

```bash
# Install with Chinese language
curl -fsSL https://raw.githubusercontent.com/syuesw/openclaw-memory/main/install.sh | bash -s -- --lang=zh

# Install with English language
curl -fsSL https://raw.githubusercontent.com/syuesw/openclaw-memory/main/install.sh | bash -s -- --lang=en

# Skip interactive configuration
curl -fsSL https://raw.githubusercontent.com/syuesw/openclaw-memory/main/install.sh | bash -s -- --skip-config

# Show help
curl -fsSL https://raw.githubusercontent.com/syuesw/openclaw-memory/main/install.sh | bash -s -- --help
```

What this does:

- Downloads and runs the installer script
- Installs the plugin via `openclaw plugins install`
- Configures your language preference
- Enables the plugin
### Alternative: Install from npm

```bash
# 1. Install plugin from npm
openclaw plugins install @syuesw/openclaw-memory

# 2. Configure language
openclaw config set plugins.entries.memory.config.language "zh"  # or "en" or "auto"

# 3. Enable plugin
openclaw plugins enable memory

# 4. Restart Gateway
openclaw gateway restart
```

### Verify

```bash
curl http://localhost:5000/health
# Or in chat: /memory_status
```

## Deploy Memory Services (Optional)

The memory services run separately from the plugin; deploy them to enable memory functionality.
### Step 1: Clone Repository

```bash
git clone https://github.com/syuesw/openclaw-memory.git ~/.openclaw/extensions/openclaw-memory
cd ~/.openclaw/extensions/openclaw-memory
```

### Step 2: Configure Environment

```bash
# Copy the example configuration
cp .env.example .env

# Edit .env with your settings:
# - For local embedding: keep the default Ollama settings
# - For cloud embedding: set EMBEDDING_API_URL and EMBEDDING_API_KEY
nano .env
```

### Step 3: Start Services

```bash
# Local embedding mode (requires 12GB+ RAM)
docker compose up -d

# Cloud embedding mode (requires 3GB+ RAM)
# Comment out the 'embedding' service in docker-compose.yml first
docker compose up -d
```

### Step 4: Verify

```bash
curl http://localhost:5000/health
```

📖 Full installation guide: Quick Start
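For cloud embedding mode, the "comment out the embedding service" tweak from Step 3 might look like the sketch below. This is a hypothetical compose fragment assembled from the images listed under Docker Images; the service names and port mapping are assumptions, and the repo's actual docker-compose.yml is authoritative.

```yaml
# Hypothetical sketch only - the repo's docker-compose.yml is authoritative.
# Service names are assumptions; images come from the "Docker Images" table.
services:
  memory-api:
    image: syuesw/memory-api
    ports:
      - "5000:5000"   # health endpoint: http://localhost:5000/health
  qdrant:
    image: qdrant/qdrant
  # Cloud embedding mode: leave the local embedding service commented out
  # embedding:
  #   image: ollama/ollama
```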
## Usage

### Add Memory

```
Remember that I prefer dark mode
```

### Recall Memory

```
What did I say about my preferred mode?
```

### Check Status

```
/memory_status
```

📖 Complete usage guide: Usage Guide
## Configuration

### Plugin Config (WebUI: Settings → Plugins → Memory)

```json
{
  "baseUrl": "http://localhost:5000",
  "autoMemory": true,
  "maxResults": 5,
  "isolationScope": ["user"]
}
```

| Option | Default | Description |
|--------|---------|-------------|
| `baseUrl` | `http://localhost:5000` | Memory API URL |
| `autoMemory` | `true` | Automatically store and recall memories |
| `maxResults` | `5` | Maximum number of search results |
| `isolationScope` | `["user"]` | Isolation levels (multi-select) |
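For reference, the same options expressed as a TypeScript shape. The interface and helper names here are illustrative assumptions, not part of the plugin API; the field names and defaults mirror the JSON example and table above.

```typescript
// Illustrative config shape; fields and defaults mirror the table above.
// The interface/helper names are assumptions, not the plugin's real API.
interface MemoryPluginConfig {
  baseUrl: string;          // Memory API URL
  autoMemory: boolean;      // automatically store/recall memories
  maxResults: number;       // maximum number of search results
  isolationScope: string[]; // isolation levels (multi-select)
}

const DEFAULTS: MemoryPluginConfig = {
  baseUrl: "http://localhost:5000",
  autoMemory: true,
  maxResults: 5,
  isolationScope: ["user"],
};

// Merge a partial user config over the defaults.
function withDefaults(partial: Partial<MemoryPluginConfig>): MemoryPluginConfig {
  return { ...DEFAULTS, ...partial };
}

console.log(withDefaults({ maxResults: 10 }).baseUrl); // http://localhost:5000
```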
### Isolation Levels (Multi-Select)

> Note: If no levels are selected (`[]`), memories are globally shared (no isolation).
| Level | Prefix | Example |
|-------|--------|---------|
| By User | u | u:alice |
| By Chat | c | c:work-chat |
| By Channel | ch | ch:telegram |
| By Agent | a | a:main |
| By Session | s | s:session123 |
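As a hedged illustration of how the selected levels could combine, the sketch below joins `prefix:value` pairs into one composite key, with an empty selection yielding a shared (unisolated) scope. The `scopeKey` helper and the `|` separator are assumptions for illustration only, not the plugin's actual storage format.

```typescript
// Illustrative sketch only: how multi-select isolation levels might combine
// into one memory key. Prefixes come from the table above; the helper name
// and the "|" separator are assumptions, not the plugin's real code.
type Level = "user" | "chat" | "channel" | "agent" | "session";

const PREFIX: Record<Level, string> = {
  user: "u",
  chat: "c",
  channel: "ch",
  agent: "a",
  session: "s",
};

function scopeKey(levels: Level[], ctx: Partial<Record<Level, string>>): string {
  // No levels selected ([]) -> empty key, i.e. globally shared memories.
  return levels
    .filter((level) => ctx[level] !== undefined)
    .map((level) => `${PREFIX[level]}:${ctx[level]}`)
    .join("|");
}

console.log(scopeKey(["user"], { user: "alice" }));
// u:alice
console.log(scopeKey(["user", "chat"], { user: "alice", chat: "work-chat" }));
// u:alice|c:work-chat
```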
📖 Detailed configuration: Configuration Guide
📖 Isolation explained: Memory Isolation
## Chat Commands

| Command | Description |
|---------|-------------|
| `/memory_status` | Show plugin status |
| `/memory_help` | Show help information |

📖 All commands: Chat Commands
## Architecture

```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   OpenClaw AI   │────▶│  Memory Plugin  │────▶│   Memory API    │
│ (Your Assistant)│     │  (This Plugin)  │     │  (Self-hosted)  │
└─────────────────┘     └─────────────────┘     └────────┬────────┘
                                                         │
               ┌─────────────────────────────────────────┼─────────────────────────────────────────┐
               │                                         │                                         │
               ▼                                         ▼                                         ▼
       ┌──────────────┐                          ┌──────────────┐                          ┌──────────────┐
       │    Qdrant    │                          │  Embedding   │                          │    Cloud     │
       │ (Vector DB)  │                          │  (Optional)  │                          │  (Optional)  │
       └──────────────┘                          └──────────────┘                          └──────────────┘
```

Modular Components (All Replaceable):

| Component | Default | Alternatives |
|-----------|---------|--------------|
| Vector DB | Qdrant | Pinecone, Weaviate, Milvus, Chroma |
| Embedding | Ollama (qwen3-embedding:0.6b) | Aliyun DashScope, OpenAI, Cohere |
| Memory API | Go service | Custom implementation |
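To illustrate the "all replaceable" idea, here is a toy sketch of a vector store sitting behind a small interface, so the default (Qdrant) could be swapped for any other backend. The interface and class names are assumptions for illustration; the plugin's real abstractions live in `src/`.

```typescript
// Toy sketch of the pluggable-component idea. These names are illustrative
// assumptions, not the plugin's actual interfaces.
interface VectorStore {
  upsert(id: string, vector: number[]): void;
  search(query: number[], k: number): string[]; // ids of the k nearest vectors
}

// Trivial in-memory stand-in (the real default backend is Qdrant).
class InMemoryStore implements VectorStore {
  private items = new Map<string, number[]>();

  upsert(id: string, vector: number[]): void {
    this.items.set(id, vector);
  }

  search(query: number[], k: number): string[] {
    // Rank stored vectors by cosine similarity to the query.
    const cosine = (a: number[], b: number[]): number => {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
    };
    return [...this.items.entries()]
      .sort((x, y) => cosine(y[1], query) - cosine(x[1], query))
      .slice(0, k)
      .map(([id]) => id);
  }
}

const store: VectorStore = new InMemoryStore();
store.upsert("dark-mode", [1, 0]);
store.upsert("coffee", [0, 1]);
console.log(store.search([0.9, 0.1], 1)); // [ 'dark-mode' ]
```

Because callers only depend on `VectorStore`, swapping in a Qdrant-, Pinecone-, or Milvus-backed implementation changes no calling code.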
## Docker Images

| Image | Description |
|-------|-------------|
| `syuesw/memory-api` | Memory API service (Go) |
| `qdrant/qdrant` | Vector database (official) |
| `ollama/ollama` | Embedding service (local mode, official) |
## Resource Requirements

### Local Embedding Mode

| Resource | Minimum | Recommended |
|----------|---------|-------------|
| RAM | 12 GB | 16 GB |
| Disk | 10 GB | 20 GB |
| CPU | 4 cores | 8 cores |

### Cloud Embedding Mode

| Resource | Minimum | Recommended |
|----------|---------|-------------|
| RAM | 3 GB | 4 GB |
| Disk | 2 GB | 5 GB |
| CPU | 2 cores | 4 cores |
## Documentation

| Document | Description |
|----------|-------------|
| 🚀 Quick Start | Installation and setup (5 minutes) |
| 📖 Usage Guide | Detailed usage examples and features |
| 💬 Chat Commands | Commands for chat windows |
| ⚙️ Configuration | All configuration options |
| 🔒 Memory Isolation | Understand isolation levels |
| 🔧 Troubleshooting | Common issues and solutions |
| 📝 API Reference | Complete API documentation |
## Project Structure

```
openclaw-memory/
├── index.ts               # Plugin entry point
├── src/
│   ├── tools.ts           # Tool implementations
│   └── runtime.ts         # Runtime helpers
├── memory-api/            # Go memory service
├── docker-compose.yml     # Service orchestration
├── .env.example           # Environment template
├── install.sh             # Plugin installer
├── README.md              # This file
├── README_CN.md           # Chinese documentation
├── docs/                  # Documentation
│   ├── QUICKSTART.md      # Quick start guide
│   ├── USAGE.md           # Usage guide
│   ├── CONFIGURATION.md   # Configuration guide
│   ├── ISOLATION.md       # Memory isolation guide
│   ├── TROUBLESHOOTING.md # Troubleshooting guide
│   ├── COMMANDS.md        # Chat commands
│   ├── api/               # API documentation
│   │   ├── API.md
│   │   └── API_CN.md
│   └── guides/            # User guides (duplicates, kept for compatibility)
├── LICENSE                # MIT License
└── package.json           # NPM package config
```
## License

MIT License - see the LICENSE file for details.

See THIRD-PARTY-NOTICES.md for third-party license information and disclaimers.
