

@one710/recollect


Recollect gives your AI agents "infinite memory" while keeping your context window lean and your bills low.

It's a provider-agnostic memory layer that automatically compacts long-running conversations into high-density summaries, protecting your core instructions while ensuring the agent never "forgets" the goal.

🚀 Why Use Recollect?

  • Universal Compatibility: Works with any message schema. OpenAI, Anthropic, Gemini, or your own custom format—Recollect handles them all as generic dictionaries.
  • Recursive Summarization: As history grows, Recollect merges old turns into a rolling thread of "checkpoint summaries," preserving intent without bloating tokens.
  • Instruction Guardrails: Never lose your system prompt. Recollect intelligently protects "pinned" roles (like system or developer) from being summarized away.
  • Configurable Summary Roles: Need summaries to be system messages? Or developer roles for o1/o3 models? Or even user messages? You decide.
  • Fast Persistence: Built-in SQLite support for production-ready persistence, or a high-speed In-Memory adapter for ephemeral workers.

📦 Installation

```shell
npm install @one710/recollect
```

If you want persistent storage (recommended):

```shell
npm install sqlite3
```

🛠️ Quick Start

```typescript
import { MemoryLayer } from "@one710/recollect";

const memory = new MemoryLayer({
  maxTokens: 4096, // Maximum context budget
  // Required: use any tokenizer or a simple length check
  countTokens: (msg) => JSON.stringify(msg).length / 4,
  // High-density summarizer callback
  summarize: async ({ summaryPrompt }) => {
    // Call your LLM here (OpenAI, Anthropic, local, etc.)
    return "The user is asking about building agentic systems...";
  },
});

const sessionId = "agent:researcher-1";
const runId = "run-2026-03-19-evt-123"; // one id per agent run/event

// Add arbitrary message shapes
await memory.addMessage(sessionId, runId, {
  role: "user",
  content: "Analyze the latest trends in autonomous agents.",
});

// Retrieve the compact, ready-to-send prompt
const messages = await memory.getPromptMessages(sessionId);
```
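The `countTokens` callback above uses a rough characters-per-token heuristic. Factored out, it might look like the sketch below; `estimateTokens` is a name introduced here, and the 4-characters-per-token ratio is only an approximation — swap in your provider's real tokenizer for accurate budgeting:

```typescript
// Heuristic token estimate: roughly 4 characters per token for English
// text. An approximation for illustration, not a real tokenizer.
function estimateTokens(message: Record<string, unknown>): number {
  return Math.ceil(JSON.stringify(message).length / 4);
}

// A short user message comes out to a handful of tokens.
const shortMessage = { role: "user", content: "hi" };
const tokenEstimate = estimateTokens(shortMessage);
```

Because `countTokens` is called on every message, keeping it cheap (no network calls) matters more than exactness; the budget check only needs to be roughly right.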

🌍 Universal Provider Support

Because Recollect treats messages as generic objects, you can use it with any provider:

OpenAI / Anthropic

```typescript
await memory.addMessage(id, null, {
  role: "assistant",
  content: "Understood.",
});
```

Multimodal / Complex Content

```typescript
await memory.addMessage(id, null, {
  role: "user",
  content: [{ type: "image_url", image_url: { url: "..." } }],
});
```

🧵 Run-Aware Compaction (runId)

Recollect supports run-scoped compaction using a dedicated runId field in storage.

  • Use a unique runId per agent run/event.
  • Pass the same runId to addMessage/addMessages for all messages generated in that run.
  • When compaction triggers for that run, Recollect keeps that run as the tail and compacts older history first.
  • This helps avoid splitting in-progress tool chains (e.g. tool call/result pairs) during compaction.

Example

```typescript
const sessionId = "slack:C123:thread-abc";
const runId = crypto.randomUUID();

await memory.addMessages(sessionId, runId, [
  {
    type: "message",
    role: "user",
    content: [{ type: "input_text", text: "check this" }],
  },
  { type: "function_call", callId: "call_1", name: "my_tool", arguments: "{}" },
  {
    type: "function_call_result",
    callId: "call_1",
    name: "my_tool",
    output: "ok",
  },
]);
```

If you don't need run scoping for a call, pass `null` for `runId`.

🗄️ Storage Model (Updated)

The SQLite `messages` table now stores:

  • `sessionId`
  • `runId` (nullable, dedicated column)
  • `data` (JSON payload)

The public prompt/history methods (getMessages, getPromptMessages) return message payloads only; run metadata stays in storage internals.
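Expressed as a TypeScript type, a stored row would have roughly this shape (an assumed illustration — the actual column representation is internal to the sqlite adapter):

```typescript
// Assumed shape of one row in the messages table, for illustration only.
interface StoredMessage {
  sessionId: string;               // conversation/session key
  runId: string | null;            // dedicated, nullable run column
  data: Record<string, unknown>;   // the raw message payload, stored as JSON
}

const exampleRow: StoredMessage = {
  sessionId: "agent:researcher-1",
  runId: null,
  data: { role: "user", content: "hello" },
};
```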

🔧 Core API Signatures

```typescript
addMessage(sessionId: string, runId: string | null, message: Record<string, any>): Promise<void>
addMessages(sessionId: string, runId: string | null, messages: Record<string, any>[]): Promise<void>
getMessages(sessionId: string, runId?: string | null): Promise<Record<string, any>[]>
getPromptMessages(sessionId: string): Promise<Record<string, any>[]>
compactNow(sessionId: string, runId?: string | null): Promise<void>
compactIfNeeded(sessionId: string, options?: { mode?: "manual" | "auto-pre" | "auto-post" | "ingest"; reason?: string; runId?: string | null; force?: boolean }): Promise<void>
```
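To make the run-scoped part of this contract concrete, here is a minimal in-memory stand-in that mirrors the `addMessages`/`getMessages` signatures. It is purely illustrative — not the library's implementation — and only shows how an optional `runId` filters retrieval while returned payloads omit run metadata:

```typescript
// Minimal stand-in mirroring the addMessages/getMessages contract.
// Illustrative only; the real MemoryLayer also compacts and summarizes.
type Msg = Record<string, any>;

class TinyStore {
  private rows: { sessionId: string; runId: string | null; data: Msg }[] = [];

  async addMessages(sessionId: string, runId: string | null, messages: Msg[]): Promise<void> {
    for (const data of messages) this.rows.push({ sessionId, runId, data });
  }

  async getMessages(sessionId: string, runId?: string | null): Promise<Msg[]> {
    return this.rows
      .filter((r) => r.sessionId === sessionId)
      // Omitting runId returns the full session; passing one scopes the query.
      .filter((r) => (runId === undefined ? true : r.runId === runId))
      .map((r) => r.data); // payloads only; run metadata stays internal
  }
}
```

Note how `runId === undefined` (argument omitted) and `runId === null` (explicitly unscoped) behave differently, matching the `runId?: string | null` signature.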

⚙️ Advanced Configuration

| Option | Type | Default | Description |
| :-- | :-- | :-- | :-- |
| `maxTokens` | `number` | Required | The token budget before compaction triggers. |
| `countTokens` | `TokenCounter` | Required | `(message: any) => number`. Your specific token logic. |
| `summarize` | `SummarizeCallable` | Required | Async function that performs the summarization. |
| `summaryRole` | `string` | `"system"` | Role assigned to generated summary messages. |
| `threshold` | `number` | `0.9` | Trigger compaction at 90% of `maxTokens`. |
| `keepRecentUserTurns` | `number` | `4` | Number of recent user turns to keep unsummarized. |
| `keepRecentMessagesMin` | `number` | `8` | Minimum messages to keep at the tail of the history. |
| `renderMessage` | `MessageRenderer` | `JSON.stringify` | Custom formatting for the summarizer's input. |
| `storage` | `Adapter` | SQLite | Persistent `sqlite3` or ephemeral `InMemory`. |
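A configuration exercising several of these options might look like the following sketch. The values are illustrative, not recommendations, and the `summarize` stub just returns a fixed string in place of a real LLM call:

```typescript
// Illustrative MemoryLayer options; tune these for your own workload.
const config = {
  maxTokens: 8192, // total context budget
  countTokens: (msg: unknown) => Math.ceil(JSON.stringify(msg).length / 4),
  summarize: async () => "Summary of earlier turns.", // stub summarizer
  summaryRole: "developer",  // e.g. for models that expect developer-role context
  threshold: 0.8,            // compact at 80% of maxTokens instead of 90%
  keepRecentUserTurns: 6,    // keep more raw recent turns than the default 4
  keepRecentMessagesMin: 10, // never summarize below this tail size
};
```

Lowering `threshold` trades more frequent summarization calls for more headroom before hitting the hard budget; raising the `keepRecent*` values keeps more verbatim history at the cost of tokens.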

🧪 Development

```shell
npm install
npm run build
npm test
```

📜 License

MIT © one710