
@letta-ai/agentic-learning

v0.4.3

Drop-in SDK to add an AI memory layer to any application. Works with OpenAI, Anthropic, Gemini, Claude, Vercel AI SDK.

Learning SDK - AI Memory Layer for Any Application

Add continual learning and long-term memory to any LLM agent with one line of code. This SDK enables agents to learn from every conversation and recall context across sessions—making any agent across any platform stateful.

import OpenAI from 'openai';
import { learning } from '@letta-ai/agentic-learning';

const client = new OpenAI();

await learning({ agent: 'my_agent' }, async () => {
    // LLM is now stateful!
    const response = await client.chat.completions.create(...);
});

Installation

npm install @letta-ai/agentic-learning

Quick Start

# Set your API keys
export OPENAI_API_KEY="your-openai-key"
export LETTA_API_KEY="your-letta-key"

import { learning } from '@letta-ai/agentic-learning';
import OpenAI from 'openai';

const client = new OpenAI();

// Add continual learning with one line
await learning({ agent: "my_assistant" }, async () => {
    // All LLM calls inside this block have learning enabled
    const response = await client.chat.completions.create({
        model: "gpt-5",
        messages: [{ role: "user", content: "My name is Alice" }]
    });

    // Agent remembers prior context
    const response2 = await client.chat.completions.create({
        model: "gpt-5",
        messages: [{ role: "user", content: "What's my name?" }]
    });
    // Returns: "Your name is Alice"
});

That's it. This SDK automatically:

  • ✅ Learns from every conversation
  • ✅ Recalls relevant context when needed
  • ✅ Remembers across sessions
  • ✅ Works with your existing LLM code

Supported Providers

| Provider | Package | Status | Example |
|----------|---------|--------|---------|
| OpenAI Chat | openai>=4.0.0 | ✅ Stable | openai_example.ts |
| OpenAI Responses | openai>=4.0.0 | ✅ Stable | openai_responses_example.ts |
| Anthropic | @anthropic-ai/sdk>=0.30.0 | ✅ Stable | anthropic_example.ts |
| Claude Agent SDK | @anthropic-ai/claude-agent-sdk>=0.1.0 | ✅ Stable | claude_example.ts |
| Gemini | @google/generative-ai>=0.21.0 | ✅ Stable | gemini_example.ts |
| Vercel AI SDK | ai>=3.0.0 | ✅ Stable | vercel_example.ts |

Create an issue to request support for another provider, or contribute a PR.

How It Works

This SDK adds stateful memory to your existing LLM code with zero architectural changes:

Benefits:

  • 🔌 Drop-in integration - Works with your existing LLM Provider SDK code
  • 🧠 Automatic memory - Relevant context retrieved and injected into prompts
  • 💾 Persistent across sessions - Conversations remembered even after restarts
  • 💰 Cost-effective - Only relevant context injected, reducing token usage
  • ⚡ Fast retrieval - Semantic search powered by Letta's optimized infrastructure
  • 🏢 Production-ready - Built on Letta's proven memory management platform

Architecture:

1. 🎯 Wrap      2. 📝 Capture       3. 🔍 Retrieve   4. 🤖 Respond
   your code       conversations      relevant         with full
   in learning     automatically      memories         context

┌─────────────┐
│  Your Code  │
│  learning() │
└──────┬──────┘
       │
       ▼
┌─────────────┐    ┌──────────────┐
│ Interceptor │───▶│ Letta Server │  (Stores conversations,
│  (Inject)   │◀───│  (Memory)    │   retrieves context)
└──────┬──────┘    └──────────────┘
       │
       ▼
┌─────────────┐
│  LLM API    │  (Sees enriched prompts)
│ OpenAI/etc  │
└─────────────┘
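
The interceptor step in the diagram above can be sketched in miniature. This is an illustrative sketch only, not the SDK's actual implementation: `withLearning`, `LlmCall`, and the in-memory `store` array are hypothetical stand-ins for the SDK's internal interceptor and the Letta server (which does semantic retrieval rather than prepending everything).

```typescript
// Illustrative sketch of the wrap → capture → retrieve → respond loop.
// Hypothetical names; the real SDK's interceptor is internal to the package.

type Message = { role: string; content: string };
type LlmCall = (messages: Message[]) => Promise<string>;

// Stand-in for the Letta server's memory store.
const store: Message[] = [];

function withLearning(call: LlmCall): LlmCall {
  return async (messages) => {
    // Retrieve: prepend stored context (the real SDK selects only
    // relevant memories via semantic search).
    const enriched = [...store, ...messages];
    // Respond: forward the enriched prompt to the underlying LLM call.
    const reply = await call(enriched);
    // Capture: persist the new exchange for future sessions.
    store.push(...messages, { role: 'assistant', content: reply });
    return reply;
  };
}
```

Because the wrapper has the same signature as the call it wraps, your calling code does not change — which is the "zero architectural changes" property described above.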

Key Features

Memory Across Sessions

// First session
await learning({ agent: "sales_bot" }, async () => {
    const response = await client.chat.completions.create({
        model: "gpt-5",
        messages: [{ role: "user", content: "I'm interested in Product X" }]
    });
});

// Later session - agent remembers automatically
await learning({ agent: "sales_bot" }, async () => {
    const response = await client.chat.completions.create({
        model: "gpt-5",
        messages: [{ role: "user", content: "Tell me more about that product" }]
    });
    // Agent knows you're asking about Product X
});

Search Agent Memory

import { AgenticLearning } from '@letta-ai/agentic-learning';

const learningClient = new AgenticLearning();

// Search past conversations
const messages = await learningClient.memory.search({
    agent: "my_agent",
    query: "What are my project requirements?"
});

Advanced Features

Capture-Only Mode

// Store conversations without injecting memory (useful for logging)
await learning({ agent: "my_agent", captureOnly: true }, async () => {
    const response = await client.chat.completions.create(...);
});

Custom Memory Blocks

// Configure which memory blocks to use
await learning({ agent: "sales_bot", memory: ["customer", "product_preferences"] }, async () => {
    const response = await client.chat.completions.create(...);
});

Local Development

Using Local Letta Server

import OpenAI from 'openai';
import { AgenticLearning, learning } from '@letta-ai/agentic-learning';

const client = new OpenAI();

// Connect to local server
const learningClient = new AgenticLearning({
    baseUrl: "http://localhost:8283"
});

await learning({ agent: 'my_agent', client: learningClient }, async () => {
    const response = await client.chat.completions.create(...);
});

Run Letta locally with Docker:

docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_key" \
  letta/letta:latest

See the self-hosting guide for more options.

Development Setup

# Clone repository
git clone https://github.com/letta-ai/agentic-learning-sdk.git
cd agentic-learning-sdk/typescript

# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Run Claude tests (separate runner)
npm run test:claude

# Watch mode
npm run dev

Examples

See the examples/ directory for complete working examples:

cd ../examples
npm install
npx tsx openai_example.ts

Requirements

  • Node.js 18+
  • Letta API key (sign up at letta.com)
  • At least one LLM provider SDK
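
Since both API keys must be present (see the Quick Start), a small pre-flight check can catch a missing one before any client is constructed. `missingEnvVars` is a hypothetical helper, not part of the SDK; the variable names match the Quick Start above.

```typescript
// Hypothetical pre-flight helper: returns the names of any required
// environment variables that are unset or empty.
function missingEnvVars(
  env: Record<string, string | undefined>,
  required: string[] = ['LETTA_API_KEY', 'OPENAI_API_KEY']
): string[] {
  return required.filter((name) => !env[name]);
}

const missing = missingEnvVars(process.env);
if (missing.length > 0) {
  console.warn(`Set these before running: ${missing.join(', ')}`);
}
```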

License

Apache 2.0 - See LICENSE for details.

Built with Letta - the leading platform for building stateful AI agents with long-term memory.