
@ekai/memory

v0.0.1


Neuroscience-inspired cognitive memory kernel

Readme

Memory SDK (Ekai)

Neuroscience-inspired, agent-centric memory kernel. Sectorized storage with PBWM gating — the agent reflects on conversations and decides what to learn. Memory is first-person, not a passive database about users.

Quickstart

By default, memory is embedded inside the OpenRouter integration and served on port 4010. No separate service needed.

npm install -w @ekai/memory
npm run build -w @ekai/memory

See Usage Modes below for SDK, mountable router, and standalone options.

SDK Usage

import { Memory } from '@ekai/memory';

// Provider config is global — shared across all agents
const mem = new Memory({ provider: 'openai', apiKey: 'sk-...' });

// Register an agent (soul and relevancePrompt are optional)
mem.addAgent('my-bot', { name: 'My Bot', soul: 'You are helpful' });
mem.addAgent('chef-bot', { name: 'Chef', relevancePrompt: 'Only store memories about cooking and recipes' });

// Get a scoped instance — all data ops go through this
const bot = mem.agent('my-bot');
await bot.add(messages, { userId: 'alice' });
await bot.search('preferences', { userId: 'alice' });
bot.users();                            // agent's known users
bot.memories();                         // all agent memories
bot.memories({ userId: 'alice' });      // memories about alice
bot.memories({ scope: 'global' });      // non-user-scoped memories
bot.delete(id);

Environment Variables

| Variable | Default | Required |
|----------|---------|----------|
| GOOGLE_API_KEY | — | Yes (if using Gemini provider) |
| OPENAI_API_KEY | — | Yes (if using OpenAI provider) |
| OPENROUTER_API_KEY | — | Yes (if using OpenRouter provider) |
| MEMORY_EMBED_PROVIDER | gemini | No |
| MEMORY_EXTRACT_PROVIDER | gemini | No |
| MEMORY_DB_PATH | ./memory.db | No |
| MEMORY_CORS_ORIGIN | * | No (standalone mode only) |

Usage Modes

1. SDK (recommended)

import { Memory } from '@ekai/memory';

// Provider config is global
const mem = new Memory({ provider: 'openai', apiKey: 'sk-...' });

// Register agents (soul is optional)
mem.addAgent('my-bot', { name: 'My Bot', soul: 'You are helpful' });
mem.addAgent('support-bot', { name: 'Support Bot' });
mem.getAgents();

// Scope to an agent for data ops
const bot = mem.agent('my-bot');
await bot.add(messages, { userId: 'alice' });
await bot.search('query', { userId: 'alice' });
bot.users();
bot.memories({ userId: 'alice' });
bot.delete(id);

2. Mountable router

Mount memory endpoints into an existing Express app:

import { Memory, createMemoryRouter } from '@ekai/memory';

const memory = new Memory({ provider: 'openai', apiKey: 'sk-...' });
app.use(createMemoryRouter(memory._store, memory._extractFn));

This is how the OpenRouter integration embeds memory on port 4010.

3. Standalone server

Run memory as its own HTTP server (useful for development or isolated deployments):

npm run start -w @ekai/memory

How It Works

graph TB
  classDef input fill:#eceff1,stroke:#546e7a,stroke-width:2px
  classDef process fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
  classDef sector fill:#fff3e0,stroke:#f57c00,stroke-width:2px
  classDef store fill:#fce4ec,stroke:#c2185b,stroke-width:2px
  classDef engine fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
  classDef output fill:#e8f5e9,stroke:#388e3c,stroke-width:2px

  IN["POST /v1/ingest<br>messages + userId"]:::input
  RG{"Relevance Gate<br>(if relevancePrompt)"}:::engine
  EXT["Agent Reflection (LLM)<br>first-person, multi-fact"]:::process

  EP["Episodic"]:::sector
  SE["Semantic[]<br>+ domain"]:::sector
  PR["Procedural"]:::sector
  RE["Reflective"]:::sector

  EMB["Embed"]:::process
  CON["Consolidate"]:::process
  DB["SQLite"]:::store
  AU["agent_users"]:::store

  SEARCH["POST /v1/search<br>query + userId"]:::input
  SCOPE["user_scope filter"]:::engine
  PBWM["PBWM Gate"]:::engine
  WM["Working Memory (cap 8)"]:::engine
  OUT["Response"]:::output
  SUM["GET /v1/summary"]:::input

  IN --> RG -->|relevant| EXT
  RG -->|irrelevant| OUT
  EXT --> EP & SE & PR & RE
  SE --> CON
  EP & CON & PR & RE --> EMB --> DB
  IN -.-> AU

  SEARCH --> SCOPE --> PBWM --> WM --> OUT
  DB --> SCOPE
  SUM --> DB

Four Sectors

| Sector | What it stores | Example |
|--------|---------------|---------|
| Episodic | Events, conversations | "I discussed architecture with Sha on Monday" |
| Semantic | Facts as triples + domain | Sha / prefers / dark mode (domain: user) |
| Procedural | Multi-step workflows | When deploying: test -> build -> push |
| Reflective | Agent self-observations | "I tend to overcomplicate solutions" |
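The four sectors can be pictured as distinct record shapes. The interfaces below are an illustrative sketch based on the data model later in this README; the field names are assumptions, not the package's exported types:

```typescript
// Illustrative sketch of the four sector record shapes.
// Field names are assumptions, not the package's exported types.
interface EpisodicMemory { id: string; sector: 'episodic'; content: string; userScope?: string }
interface SemanticMemory {
  id: string; sector: 'semantic';
  subject: string; predicate: string; object: string;
  domain: 'user' | 'world' | 'self'; userScope?: string;
}
interface ProceduralMemory { id: string; sector: 'procedural'; trigger: string; steps: string[] }
interface ReflectiveMemory { id: string; sector: 'reflective'; observation: string }

type AnyMemory = EpisodicMemory | SemanticMemory | ProceduralMemory | ReflectiveMemory;

// A semantic fact is a subject/predicate/object triple plus a domain.
const fact: SemanticMemory = {
  id: 'm1',
  sector: 'semantic',
  subject: 'Sha',
  predicate: 'prefers',
  object: 'dark mode',
  domain: 'user',
  userScope: 'sha',
};
const memories: AnyMemory[] = [fact];
console.log(`${fact.subject} / ${fact.predicate} / ${fact.object}`);
```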

Domain & User Scoping

graph LR
  classDef usr fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
  classDef wld fill:#fff3e0,stroke:#f57c00,stroke-width:2px
  classDef slf fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px

  U["user domain<br>scoped to userId<br><i>Sha prefers dark mode</i>"]:::usr
  W["world domain<br>visible to all<br><i>TypeScript supports generics</i>"]:::wld
  S["self domain<br>visible to all<br><i>I use GPT-4 for extraction</i>"]:::slf

When userId is passed: user-domain facts are only visible to that user. world/self facts are shared.

Attribution

Every memory tracks its origin: origin_type (conversation/document/api), origin_actor (who), origin_ref (source reference).

Consolidation

Each extracted semantic triple goes through consolidation:

  • Merge — same fact exists -> strengthen it
  • Supersede — different value for same slot -> close old, insert new
  • Insert — new fact

Predicate matching uses embeddings (>=0.9 cosine): "cofounded" ~ "is co-founder of" -> same slot.
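The merge/supersede/insert decision can be sketched with toy embeddings. The 0.9 cosine threshold and the three outcomes come from this README; the helper names and vectors are illustrative, not the package's API:

```typescript
// Toy sketch of per-triple consolidation (merge / supersede / insert).
// The 0.9 cosine threshold mirrors the README; vectors are made up.
interface Triple { subject: string; predicate: string; object: string; vec: number[] }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// "Same slot" = same subject + predicate embeddings with cosine >= 0.9.
function consolidate(existing: Triple[], incoming: Triple): 'merge' | 'supersede' | 'insert' {
  for (const t of existing) {
    const sameSlot = t.subject === incoming.subject && cosine(t.vec, incoming.vec) >= 0.9;
    if (!sameSlot) continue;
    // Same fact -> strengthen; different value for the slot -> supersede.
    return t.object === incoming.object ? 'merge' : 'supersede';
  }
  return 'insert'; // no matching slot: new fact
}

const kb: Triple[] = [{ subject: 'Sha', predicate: 'cofounded', object: 'Ekai', vec: [1, 0.1] }];
// "is co-founder of" embeds close to "cofounded" -> same slot
console.log(consolidate(kb, { subject: 'Sha', predicate: 'is co-founder of', object: 'Ekai', vec: [1, 0.12] }));  // merge
console.log(consolidate(kb, { subject: 'Sha', predicate: 'is co-founder of', object: 'Acme', vec: [1, 0.12] }));  // supersede
console.log(consolidate(kb, { subject: 'Sha', predicate: 'lives in', object: 'Berlin', vec: [0, 1] }));           // insert
```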

API

POST /v1/ingest

Ingest a conversation. The full conversation (user and assistant turns) is sent to the LLM for agent-centric reflection.

{
  "messages": [
    { "role": "user", "content": "I prefer dark mode and use TypeScript" },
    { "role": "assistant", "content": "Noted!" }
  ],
  "agent": "my-bot",
  "userId": "sha"
}
{ "stored": 3, "ids": ["...", "...", "..."], "agent": "my-bot" }

If the agent has a relevancePrompt and the content is irrelevant, the response short-circuits:

{ "stored": 0, "ids": [], "filtered": true, "reason": "Content is about geography, not cooking" }

POST /v1/search

Search with PBWM gating. Pass userId for user-scoped retrieval.

{ "query": "what does Sha prefer?", "agent": "my-bot", "userId": "sha" }
{
  "workingMemory": [
    { "sector": "semantic", "content": "Sha prefers dark mode", "score": 0.87, "details": { "subject": "Sha", "predicate": "prefers", "object": "dark mode", "domain": "user" } }
  ],
  "perSector": { "episodic": [], "semantic": [...], "procedural": [] },
  "agentId": "my-bot"
}

GET /v1/summary

Per-sector counts + recent memories.

GET /v1/summary?agent=my-bot&limit=20
{
  "summary": [
    { "sector": "episodic", "count": 3, "lastCreatedAt": 1700000000 },
    { "sector": "semantic", "count": 12, "lastCreatedAt": 1700100000 },
    { "sector": "procedural", "count": 1, "lastCreatedAt": 1700050000 }
  ],
  "recent": [{ "id": "...", "sector": "semantic", "preview": "dark mode", "details": {...} }],
  "agent": "my-bot"
}

GET /v1/agents

List all registered agents.

{
  "agents": [{ "id": "my-bot", "name": "My Bot", "soulMd": "You are helpful", "createdAt": 1700000000 }]
}

POST /v1/agents

Register a new agent. name, soul, and relevancePrompt are optional.

{ "id": "my-bot", "name": "My Bot", "soul": "You are helpful", "relevancePrompt": "Only store cooking-related memories" }
{ "agent": { "id": "my-bot", "name": "My Bot", "soulMd": "You are helpful", "relevancePrompt": "Only store cooking-related memories", "createdAt": 1700000000 } }

GET /v1/agents/:slug

Get a single agent by ID.

{ "agent": { "id": "my-bot", "name": "My Bot", "relevancePrompt": "...", "createdAt": 1700000000 } }

PUT /v1/agents/:slug

Update agent properties. Set relevancePrompt to null to remove it.

{ "name": "Updated Name", "relevancePrompt": null }

DELETE /v1/agents/:slug

Delete an agent and all its memories. Cannot delete the default agent.

{ "deleted": 5, "agent": "my-bot" }

PUT /v1/memory/:id

Update a memory's content and/or user scope. Body: { "content": "...", "sector?": "episodic", "agent?": "my-bot", "userScope?": "sha" }. Set userScope to null to make a memory shared/global.

DELETE /v1/memory/:id

Delete a single memory. Query: ?agent=my-bot.

DELETE /v1/memory

Delete all memories for an agent. Query: ?agent=my-bot.

GET /v1/users

List all users the agent has interacted with.

GET /v1/users?agent=my-bot
{
  "users": [{ "userId": "sha", "firstSeen": 1700000000, "lastSeen": 1700100000, "interactionCount": 5 }],
  "agent": "my-bot"
}

GET /v1/users/:id/memories

Get memories scoped to a specific user. Query: ?agent=my-bot&limit=20.

GET /v1/graph/triples

Query semantic triples by entity. Query: ?entity=Sha&direction=outgoing&agent=my-bot&predicate=...&maxResults=100&userId=....

DELETE /v1/graph/triple/:id

Delete a single semantic triple. Query: ?agent=my-bot.

GET /v1/graph/visualization

Graph visualization data (nodes + edges). Query: ?entity=Sha&maxDepth=2&maxNodes=50&agent=my-bot&includeHistory=true&userId=....

All Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | /v1/agents | List agents |
| POST | /v1/agents | Create agent |
| GET | /v1/agents/:slug | Get single agent |
| PUT | /v1/agents/:slug | Update agent |
| DELETE | /v1/agents/:slug | Delete agent + memories |
| POST | /v1/ingest | Ingest conversation |
| POST | /v1/search | Search with PBWM gating |
| GET | /v1/summary | Sector counts + recent |
| PUT | /v1/memory/:id | Update a memory |
| DELETE | /v1/memory/:id | Delete one memory |
| DELETE | /v1/memory | Delete all for agent |
| GET | /v1/users | List agent's users |
| GET | /v1/users/:id/memories | User-scoped memories |
| GET | /v1/graph/triples | Query semantic triples by entity |
| GET | /v1/graph/visualization | Graph visualization data (dashboard) |
| DELETE | /v1/graph/triple/:id | Delete a triple |
| GET | /health | Health check |

All endpoints accept an agent parameter (query string or request body). In the default deployment, they are served on the OpenRouter port (4010).

Retrieval Pipeline

graph LR
  classDef i fill:#eceff1,stroke:#546e7a,stroke-width:2px
  classDef p fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
  classDef e fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
  classDef o fill:#e8f5e9,stroke:#388e3c,stroke-width:2px

  Q["Query + userId"]:::i
  EMB["3 embeddings<br>(per sector)"]:::p
  ANN["sqlite-vec KNN<br>(cosine distance)"]:::p
  F["similarity >= 0.2<br>+ user_scope"]:::e
  G["PBWM gate<br>sigmoid(1.0r + 0.4e + 0.05c - 0.02n)"]:::e
  W["Working Memory<br>top-4/sector, cap 8"]:::o

  Q --> EMB --> ANN --> F --> G --> W
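The gate and working-memory steps above can be sketched as follows. The weights (1.0r + 0.4e + 0.05c - 0.02n) and the top-4/sector, cap-8 rule come from this README; the meaning of the candidate fields and the helper names are assumptions:

```typescript
// Sketch of the PBWM gate and working-memory cap from the pipeline above.
// Weights and the top-4/sector, cap-8 rule are from the README; the
// candidate fields (r, e, c, n) are assumed inputs to the gate.
interface Candidate { sector: string; content: string; r: number; e: number; c: number; n: number }

const sigmoid = (x: number) => 1 / (1 + Math.exp(-x));

// Gate score per the README's formula: sigmoid(1.0r + 0.4e + 0.05c - 0.02n).
const gate = (m: Candidate) => sigmoid(1.0 * m.r + 0.4 * m.e + 0.05 * m.c - 0.02 * m.n);

function workingMemory(candidates: Candidate[], perSector = 4, cap = 8): Candidate[] {
  // Group candidates by sector.
  const bySector = new Map<string, Candidate[]>();
  for (const m of candidates) {
    const bucket = bySector.get(m.sector) ?? [];
    bucket.push(m);
    bySector.set(m.sector, bucket);
  }
  // Keep the top-4 per sector, then apply the global cap of 8.
  const kept: Candidate[] = [];
  for (const bucket of bySector.values()) {
    bucket.sort((a, b) => gate(b) - gate(a));
    kept.push(...bucket.slice(0, perSector));
  }
  return kept.sort((a, b) => gate(b) - gate(a)).slice(0, cap);
}
```

With, say, six semantic and six episodic candidates, this keeps the four highest-scoring from each sector and returns eight in total.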

Data Model

erDiagram
    agents ||--o{ memory : has
    agents ||--o{ semantic_memory : has
    agents ||--o{ procedural_memory : has
    agents ||--o{ reflective_memory : has
    agents ||--o{ agent_users : has
    memory ||--|| memory_vec : "vec index"
    procedural_memory ||--|| procedural_vec : "vec index"
    semantic_memory ||--|| semantic_vec : "vec index"
    reflective_memory ||--|| reflective_vec : "vec index"

    memory { text id PK; text sector; text content; text user_scope; text origin_type }
    semantic_memory { text id PK; text subject; text predicate; text object; text domain; text user_scope }
    procedural_memory { text id PK; text trigger; json steps; text user_scope; text origin_type }
    reflective_memory { text id PK; text observation; text origin_type; text origin_actor }
    memory_vec { text memory_id; float embedding }
    procedural_vec { text memory_id; float embedding }
    semantic_vec { text memory_id; float embedding }
    reflective_vec { text memory_id; float embedding }
    agent_users { text agent_id PK; text user_id PK; int interaction_count }
    agents { text id PK; text name; text soul_md; text relevance_prompt; int created_at }

All main tables share: embedding (JSON), created_at, last_accessed, agent_id, source, origin_type, origin_actor, origin_ref. Vec tables (vec0 virtual tables via sqlite-vec) store cosine-distance-indexed copies of the embeddings for ANN search. The schema is migration-free: old databases are simply re-created.

Integration

When used via @ekai/openrouter, memories are injected as:

<memory>
What I know:
- Sha prefers dark mode

What I remember:
- I discussed architecture with Sha on Monday

How I do things:
- When deploying: test -> build -> push

My observations:
- Sha responds better when I lead with the conclusion
</memory>

Notes

  • Supports Gemini, OpenAI, and OpenRouter providers for extraction/embedding
  • Provider config via constructor: new Memory({ provider: 'openai', apiKey: '...' })
  • Agents are first-class: addAgent() required before data ops (auto-created for default)
  • user_scope is opt-in — no userId = all memories returned
  • Vector search via sqlite-vec (cosine distance, vec0 virtual tables) — single-file embedded architecture