@sesamespace/hivemind
v0.8.9
Hivemind
Cognitive architecture for AI agents with multi-layered memory, context isolation, and multi-machine fleet distribution.
Quick Start
One-Command Install (recommended)
curl -sL api.sesame.space/api/v1/hivemind/install | bash -s -- <your-sesame-api-key>

This installs all dependencies, configures the agent from Sesame, and starts it.
Manual Install
# Install the CLI
npm install -g @sesamespace/hivemind
# Initialize from your Sesame API key
hivemind init <your-sesame-api-key>
# Start the agent
hivemind start --config config/default.toml
# (Optional) Install as a service that survives reboots
hivemind service install
Development Install
git clone https://github.com/baileydavis2026/hivemind.git
cd hivemind
pnpm install
pnpm build
cd packages/memory && cargo build --release && cd ../..
What You Need
- macOS (Apple Silicon recommended)
- Sesame API key — get one at sesame.space
- OpenRouter API key — get one at openrouter.ai (or provision via Sesame vault)
Architecture
┌─────────────────────────────────────────┐
│ Agent Runtime (TypeScript) │
│ ├── Sesame WebSocket (messaging) │
│ ├── Context Manager (project isolation) │
│ ├── LLM Client (OpenRouter) │
│ └── Memory Client │
│ └── Memory Daemon (Rust/LanceDB) │
│ └── Ollama (embeddings) │
└─────────────────────────────────────────┘
Memory Layers
| Layer | Name | Purpose |
|-------|------|---------|
| L1 | Working Memory | Current conversation (in-memory) |
| L2 | Episodic Memory | All interactions (LanceDB vectors) |
| L3 | Semantic Memory | Promoted knowledge (high-access patterns) |
| L4 | External Memory | Git, files, APIs (on-demand) |
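To make the L2→L3 relationship concrete, here is a minimal sketch of episodic search with access-based promotion. This is illustrative only: the real daemon is Rust/LanceDB, and every name and threshold below (`Episode`, `PROMOTION_THRESHOLD`, `searchAndPromote`) is an assumption, not the actual API.

```typescript
// Hypothetical sketch of L2 episodic memory with promotion to L3.
// Names and the promotion threshold are illustrative assumptions.
interface Episode {
  text: string;
  embedding: number[];
  accessCount: number;
}

const PROMOTION_THRESHOLD = 5; // assumed: promote after 5 retrievals

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Search L2 by similarity, bump access counts on the hits, and return
// any episode that crossed the threshold as a candidate for L3.
function searchAndPromote(
  l2: Episode[],
  query: number[],
  topK: number
): { hits: Episode[]; promoted: Episode[] } {
  const hits = [...l2]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, topK);
  for (const h of hits) h.accessCount++;
  const promoted = hits.filter((h) => h.accessCount >= PROMOTION_THRESHOLD);
  return { hits, promoted };
}
```

The point of the sketch is the shape of the flow: retrieval itself generates the access statistics that drive promotion, so frequently recalled episodes migrate upward without a separate curation pass.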
Dashboard
Local web UI for debugging memory and LLM request formation.
http://localhost:9485
- Request Inspector: View every LLM call with full prompt breakdown (identity files, L3 knowledge, L2 episodes with scores, L1 history, config snapshot, token estimates)
- Memory Browser: Search L2 episodes, view/delete L3 entries per context
- Context Overview: List contexts with episode counts
Bound to localhost only, no auth required. Starts automatically with the agent.
Agent Introspection Commands
Use the hm: prefix in DMs (not group chats, where it may collide with other agents' commands):
| Command | Description |
|---------|-------------|
| hm:status | Agent health, memory daemon, model, active context |
| hm:config | Full configuration details |
| hm:contexts | All contexts with L1/L2 info |
| hm:memory stats | Episode counts and L3 entries per context |
| hm:memory search <query> | Search L2 memories with similarity scores |
| hm:memory l3 [context] | View promoted L3 knowledge entries |
| hm:help | List all available commands |
These bypass the LLM — deterministic, direct daemon queries.
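The fast path described above can be sketched as a prefix dispatcher that intercepts introspection commands before anything reaches the LLM. This is a conceptual sketch under assumed names (`dispatch`, the handler stubs); the real agent's handler code is not shown here.

```typescript
// Hypothetical sketch of the hm: fast path: commands are matched
// before the message reaches the LLM, so replies are deterministic.
type Handler = (args: string) => string;

const handlers: Record<string, Handler> = {
  // Illustrative stubs; the real agent queries the memory daemon here.
  "status": () => "agent: ok",
  "memory search": (q) => `searching L2 for: ${q}`,
};

function dispatch(message: string): string | null {
  if (!message.startsWith("hm:")) return null; // not introspection → goes to LLM
  const body = message.slice(3).trim();
  // Longest-prefix match so "memory search" wins over a bare "memory".
  const key = Object.keys(handlers)
    .filter((k) => body === k || body.startsWith(k + " "))
    .sort((a, b) => b.length - a.length)[0];
  if (!key) return "unknown command; try hm:help";
  return handlers[key](body.slice(key.length).trim());
}
```

Because the dispatcher runs before the model, these commands stay cheap and reproducible even when the LLM itself is slow or unavailable.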
CLI Commands
| Command | Description |
|---------|-------------|
| hivemind init <key> | Initialize agent from Sesame API key |
| hivemind start | Start the agent |
| hivemind service install | Install as launchd service (auto-start on boot) |
| hivemind service status | Check service status |
| hivemind service logs | View recent logs |
| hivemind service uninstall | Remove launchd services |
| hivemind fleet | Manage worker fleet |
Configuration
Config is layered (later overrides earlier):
1. config/default.toml — shipped defaults
2. config/local.toml — generated by hivemind init
3. .env — secrets (SESAME_API_KEY, LLM_API_KEY, AGENT_NAME)
4. Environment variables — override everything
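The precedence rule can be sketched as a left-to-right merge, where each later layer wins on key conflicts. The example keys below are illustrative assumptions, not Hivemind's actual config schema, and the sketch does a shallow merge only (real TOML layering would merge nested tables).

```typescript
// Hypothetical sketch of layered config resolution:
// default.toml → local.toml → .env → environment variables,
// with later layers overriding earlier ones on key conflicts.
type Config = Record<string, unknown>;

function resolve(...layers: Config[]): Config {
  // Shallow merge; keys are illustrative, not the real schema.
  return layers.reduce((acc, layer) => ({ ...acc, ...layer }), {});
}

const defaults = { model: "default-model", dashboardPort: 9485 };
const local = { model: "tuned-model" };       // from hivemind init
const envVars = { dashboardPort: 9000 };       // highest precedence
const cfg = resolve(defaults, local, envVars);
```

A layered merge like this keeps shipped defaults intact on upgrade while letting machine-local and secret values live in files that are never committed.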
Team Charter
Agent behavior in group chats is governed by config/TEAM-CHARTER.md. Edit it to change how agents interact in shared spaces. Changes take effect on agent restart.
Development
pnpm install # Install deps
pnpm build # Build all packages
# Run tests (49 total)
npx tsx packages/runtime/src/__tests__/fleet.test.ts
npx tsx packages/runtime/src/__tests__/integration.test.ts
npx tsx packages/runtime/src/__tests__/fleet-integration.test.ts
Publishing
./scripts/build-npm.sh # Build flat npm package
cd dist/npm && npm publish --access public # Publish to npm
License
Private — Big Canyon Farms
