
@memorilabs/memori

v0.0.5

The official TypeScript SDK for Memori

Memori Labs


Getting Started

Install the Memori SDK and your preferred LLM client using your package manager of choice:

npm install @memorilabs/memori

(Note: Memori currently supports openai and @anthropic-ai/sdk as peer dependencies).
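Because the LLM clients are peer dependencies, install the client you plan to use alongside the SDK. For example:

```shell
# Install the SDK together with the OpenAI client
npm install @memorilabs/memori openai

# Or, if you use Anthropic's client instead
npm install @memorilabs/memori @anthropic-ai/sdk
```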

Quickstart Example

import 'dotenv/config';
import { OpenAI } from 'openai';
import { Memori } from '@memorilabs/memori';

// Environment check
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
if (!OPENAI_API_KEY) {
  console.error('Error: OPENAI_API_KEY must be set in .env');
  process.exit(1);
}

// 1. Initialize the LLM Client
const client = new OpenAI({ apiKey: OPENAI_API_KEY });

// 2. Initialize Memori and Register the Client
const memori = new Memori()
  .llm.register(client)
  .attribution('typescript-sdk-test-user', 'test-process-1');

async function main() {
  console.log('--- Step 1: Teaching the AI ---');
  const factPrompt = 'My favorite color is blue and I live in Paris.';
  console.log(`User: ${factPrompt}`);

  // This call automatically triggers Persistence and Augmentation in the background.
  const response1 = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: factPrompt }],
  });

  console.log(`AI:   ${response1.choices[0].message.content}`);

  console.log('\n(Waiting 5 seconds for backend processing...)\n');
  await new Promise((resolve) => setTimeout(resolve, 5000));

  console.log('--- Step 2: Testing Recall ---');
  const questionPrompt = 'What is my favorite color?';
  console.log(`User: ${questionPrompt}`);

  // This call automatically triggers Recall, injecting the Paris/Blue facts into the prompt.
  const response2 = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: questionPrompt }],
  });

  console.log(`AI:   ${response2.choices[0].message.content}`);
}

main().catch(console.error);

Key Features

  • Zero-Latency Memory: Background processing ensures your LLM calls are never slowed down.
  • Advanced Augmentation: Automatically extracts and structures facts, preferences, and relationships.
  • Cloud-Hosted: Fully managed infrastructure via the Memori Cloud API.
  • LLM Agnostic: Native support for the official OpenAI and Anthropic SDKs via interceptors.
  • Automatic Prompt Injection: Seamlessly fetches relevant memories and injects them into the system context.

Attribution

To get the most out of Memori, attribute your LLM interactions to an entity (a person, place, or thing, such as a user) and a process (your agent, LLM interaction, or program).

If you do not provide any attribution, Memori cannot make memories for you.

memori.attribution('user-123', 'my-app');

Session Management

Memori uses sessions to group your LLM interactions together. For example, if you have an agent that executes multiple steps, you will want those steps recorded in a single session.

By default, Memori handles setting the session for you, but you can start a new session or override the current session as follows:

memori.resetSession();

or

const sessionId = memori.session.id;

// ... Later ...

memori.setSession(sessionId);
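To make the grouping concrete, here is a toy model of how interactions might be stamped with a session id. This is an illustrative sketch, not the SDK's internals; `SessionTracker` and its log are invented for this example.

```typescript
// Illustrative model of session grouping (not the actual SDK internals).
let counter = 0;

class SessionTracker {
  sessionId: string = this.newId();
  private log: { sessionId: string; prompt: string }[] = [];

  private newId(): string {
    return `session-${++counter}`;
  }

  // Every recorded interaction is stamped with the current session id.
  record(prompt: string): void {
    this.log.push({ sessionId: this.sessionId, prompt });
  }

  // Start a fresh session; later interactions land in the new group.
  resetSession(): void {
    this.sessionId = this.newId();
  }

  // Re-attach to an earlier session by id.
  setSession(id: string): void {
    this.sessionId = id;
  }

  interactions(sessionId: string): string[] {
    return this.log
      .filter((e) => e.sessionId === sessionId)
      .map((e) => e.prompt);
  }
}
```

Resetting changes which group subsequent interactions join, while `setSession` lets you resume an earlier group, which is the behavior the two snippets above expose.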

Supported LLMs

  • Anthropic Claude (@anthropic-ai/sdk)
  • OpenAI (openai)
  • Gemini (@google/genai)

Memori Advanced Augmentation

Memories are tracked at several different levels:

  • entity: think person, place, or thing; like a user
  • process: think your agent, LLM interaction or program
  • session: the current interactions between the entity, process and the LLM
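One way to picture these levels (a hypothetical sketch, not the actual storage model) is as nested scope keys, where recall walks from the broadest scope (entity) down to the narrowest (session):

```typescript
// Hypothetical sketch of level-scoped memories (not the SDK's storage model).
type Scope = { entity: string; process: string; session: string };

const memories = new Map<string, string[]>();

function remember(
  scope: Scope,
  level: 'entity' | 'process' | 'session',
  fact: string,
): void {
  // Entity-level memories are shared across processes and sessions;
  // session-level memories are the narrowest scope.
  const key =
    level === 'entity'
      ? scope.entity
      : level === 'process'
        ? `${scope.entity}/${scope.process}`
        : `${scope.entity}/${scope.process}/${scope.session}`;
  const bucket = memories.get(key) ?? [];
  bucket.push(fact);
  memories.set(key, bucket);
}

function recall(scope: Scope): string[] {
  // Recall gathers memories from broadest to narrowest scope.
  return [
    ...(memories.get(scope.entity) ?? []),
    ...(memories.get(`${scope.entity}/${scope.process}`) ?? []),
    ...(memories.get(`${scope.entity}/${scope.process}/${scope.session}`) ?? []),
  ];
}
```

Under this model, an entity-level fact stays visible in every new session, while session-level memories disappear when the session changes.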

Memori's Advanced Augmentation enhances memories at each of these levels with:

  • attributes
  • events
  • facts
  • people
  • preferences
  • relationships
  • rules
  • skills

Memori knows who your user is and what tasks your agent handles, and it creates unparalleled context between the two. Augmentation occurs asynchronously in the background, incurring no latency.

By default, Memori Advanced Augmentation is available without an account but is rate limited. When you need increased limits, sign up for Memori Advanced Augmentation.

Memori Advanced Augmentation is always free for developers!

Once you've obtained an API key, simply set the following environment variable:

export MEMORI_API_KEY=[api_key]

Managing Your Quota

You can check your quota and manage your account by logging in at https://memorilabs.ai/. If you have reached your IP address quota, sign up and get an API key for increased limits.

If your API key exceeds its quota limits, we will email you to let you know.

Contributing

We welcome contributions from the community! Please see our Contributing Guidelines for details on:

  • Setting up your development environment
  • Code style and standards
  • Submitting pull requests
  • Reporting issues

Support


License

Apache 2.0 - see LICENSE