
ts-memo-cache v2.0.0

A smart memoization and caching utility for TypeScript with asynchronous function support. It wraps any function with caching capabilities and supports TTL, cache invalidation, and multiple caching strategies.

ts-memo-cache

Purpose

ts-memo-cache is an open-source TypeScript library designed to simplify caching in your projects by providing smart memoization capabilities. It allows you to wrap any function with memoization, supports configurable TTL (Time-To-Live), cache invalidation, and offers multiple caching strategies such as in-memory and LRU caches. This utility is especially useful for improving application performance by avoiding redundant computations.

Features

  • Memoization: Wrap any function to cache its result based on its arguments.
  • Configurable TTL: Set cache expiration to ensure data freshness.
  • Cache Invalidation: Easily remove outdated entries.
  • Multiple Caching Strategies: Use a simple in-memory cache or an advanced LRU cache.
  • Type Safety: Fully typed to ensure input/output types are maintained.
  • Asynchronous Support: Works with asynchronous (Promise‑returning) functions.

Installation

npm install ts-memo-cache

Asynchronous Function Support

ts-memo-cache now supports asynchronous (Promise‑returning) functions. This means you can cache results of network calls, file I/O, or any async operations while still benefiting from memoization.

  • Uses a key resolver (by default, JSON.stringify) to generate a unique key based on the function's arguments.
  • Checks if the key exists in the cache:
    • If found, the cached Promise is returned immediately.
    • If not, the function is executed, and the resulting Promise is cached.
  • Attaches a .catch() handler to the Promise so that if the Promise rejects (i.e., the operation fails), the cache entry is cleared and a later call can retry.
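The steps above can be sketched as a small standalone wrapper. This is an illustrative sketch of the promise-caching pattern described, not the library's actual implementation; the names memoizeAsync and keyResolver are assumptions for the example.

```typescript
type AsyncFn<A extends unknown[], R> = (...args: A) => Promise<R>;

// Illustrative sketch: cache the Promise itself, keyed by the arguments,
// and evict the entry if the Promise rejects.
function memoizeAsync<A extends unknown[], R>(
  fn: AsyncFn<A, R>,
  keyResolver: (...args: A) => string = (...args) => JSON.stringify(args)
): AsyncFn<A, R> {
  const cache = new Map<string, Promise<R>>();
  return (...args: A) => {
    const key = keyResolver(...args);
    const hit = cache.get(key);
    if (hit) return hit; // found: return the cached Promise immediately
    const promise = fn(...args); // not found: execute and cache the Promise
    cache.set(key, promise);
    // On rejection, clear the entry so the next call retries instead of
    // returning a cached failure.
    promise.catch(() => cache.delete(key));
    return promise;
  };
}
```

Caching the Promise (rather than the resolved value) also deduplicates concurrent calls: two calls with the same arguments share one in-flight request.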

Usage

In-Memory Cache

import { memoize } from 'ts-memo-cache';
import { MemoryCache } from 'ts-memo-cache/caches/MemoryCache';

function expensiveCalculation(n: number): number {
  // Simulate a heavy computation
  return n * n;
}

// Wrap the function with memoization
const memoizedCalculation = memoize(expensiveCalculation, { ttl: 5000, cache: new MemoryCache<string, number>() });

console.log(memoizedCalculation(10)); // Computes and caches the result.
console.log(memoizedCalculation(10)); // Returns cached result.
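The ttl option above expires entries after a set number of milliseconds. As a rough sketch of how such a TTL check could work (illustrative only, not the library's internals; TtlCache is a made-up name for the example), each entry can store an expiry timestamp and be treated as missing once stale:

```typescript
interface TtlEntry<V> {
  value: V;
  expiresAt: number; // absolute timestamp in ms
}

// Illustrative TTL cache: entries past their expiry are lazily evicted on read.
class TtlCache<K, V> {
  private map = new Map<K, TtlEntry<V>>();
  constructor(private ttlMs: number) {}

  set(key: K, value: V): void {
    this.map.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: K): V | undefined {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.map.delete(key); // stale: evict and report a miss
      return undefined;
    }
    return entry.value;
  }
}
```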

LRU Cache Strategy

import { memoize } from 'ts-memo-cache';
import { LRUCache } from 'ts-memo-cache/caches/LRUCache';

function computeResult(n: number): number {
  console.log("Computing result...");
  return n * 2;
}

// Create a memoized function with an LRU cache that holds a maximum of 50 entries
const memoizedCompute = memoize(computeResult, { ttl: 3000, cache: new LRUCache<string, number>(50) });

console.log(memoizedCompute(5)); // Computes and caches the result
console.log(memoizedCompute(5)); // Returns the cached result
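An LRU cache keeps at most a fixed number of entries and evicts the least recently used one when full. A minimal sketch of that idea (illustrative, not the library's internal implementation; SimpleLRU is a made-up name) can rely on a Map's insertion order:

```typescript
// Illustrative LRU: a Map iterates in insertion order, so re-inserting an
// entry on access moves it to the "most recently used" end.
class SimpleLRU<K, V> {
  private map = new Map<K, V>();
  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Refresh recency by deleting and re-inserting.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Evict the least recently used entry (first in insertion order).
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }
}
```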

Asynchronous Usage

import { memoize } from 'ts-memo-cache';
import { MemoryCache } from 'ts-memo-cache/caches/MemoryCache';

// Simulate a delayed API call that returns data asynchronously.
async function fetchData(apiUrl: string): Promise<string> {
  console.log(`Fetching data from ${apiUrl}`);
  // Simulate a delay (e.g., network latency)
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(`Data from ${apiUrl}`);
    }, 1000);
  });
}

// Wrap the asynchronous function with memoization.
// Here, the result of fetchData is cached so that identical calls return the cached Promise.
const memoizedFetchData = memoize(fetchData, {
  ttl: 30000, // Cache result for 30 seconds
  cache: new MemoryCache<string, Promise<string>>()
});

async function runDemo() {
  // First call: the API is fetched, and the result is cached.
  console.time("First call");
  const result1 = await memoizedFetchData("https://api.example.com/data");
  console.timeEnd("First call");
  console.log(result1);

  // Second call: returns the cached result immediately.
  console.time("Second call");
  const result2 = await memoizedFetchData("https://api.example.com/data");
  console.timeEnd("Second call");
  console.log(result2);
}

runDemo();

In this demo:

  • First call: fetchData is executed, simulating an API call with a delay of 1 second.
  • Second call: The same API URL is used, so the memoized function returns the cached Promise, resulting in an immediate response.

Contributing

Contributions are welcome! Please:

  • Fork the repository.
  • Create a branch for your changes.
  • Write tests for any new features.
  • Submit a pull request with detailed changes.

License

This project is licensed under the MIT License.