
@sakib11/smart-cache

A flexible, zero-dependency caching library that automatically caches results of async functions (API calls, DB queries, computations) with multiple pluggable storage backends.

Features

  • Pluggable storage backends — in-memory (default), Redis, filesystem, or bring your own
  • wrap() any async function — transparently cache return values with a single line
  • Configurable TTL — per-entry and global defaults
  • Stale-while-revalidate — return cached data instantly while refreshing in the background
  • LRU eviction — in-memory backend evicts least-recently-used entries when capacity is reached
  • Request deduplication — concurrent calls for the same key share a single execution
  • Batch operations — getMany, setMany, deleteMany
  • Pattern invalidation — delete keys by glob pattern (user:*)
  • Cache statistics — hits, misses, hit rate, stale hits, errors
  • Event hooks — onHit, onMiss, onSet, onDelete, onStale, onError, onEvict
  • Graceful degradation — storage failures fall through to the original function
  • Smart serialization — handles Date, Map, Set, BigInt, RegExp, Error, typed arrays
  • Custom serializer support — plug in msgpack, protobuf, or any format
  • Full TypeScript support — generics, strict types, .d.ts declarations
  • Dual module output — ESM and CommonJS
  • Zero runtime dependencies — ioredis is an optional peer dependency
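The "smart serialization" point exists because plain JSON.stringify silently loses types like Map and Set. A minimal sketch of the tag-and-rebuild idea behind type-preserving serialization (illustrative only — not the library's internals, which also cover Date, BigInt, RegExp, Error, and typed arrays):

```typescript
// Tag Map/Set on the way out, rebuild them on the way in.
function serialize(value: unknown): string {
  return JSON.stringify(value, (_key, v) => {
    if (v instanceof Map) return { __type: "Map", entries: [...v.entries()] };
    if (v instanceof Set) return { __type: "Set", values: [...v.values()] };
    return v;
  });
}

function deserialize(text: string): unknown {
  return JSON.parse(text, (_key, v) => {
    if (v && v.__type === "Map") return new Map(v.entries);
    if (v && v.__type === "Set") return new Set(v.values);
    return v;
  });
}
```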

Installation

npm install @sakib11/smart-cache

For Redis support:

npm install @sakib11/smart-cache ioredis

Quick Start

import { SmartCache } from "@sakib11/smart-cache";

const cache = new SmartCache();

// Wrap any async function
const getUser = cache.wrap(
  async (id: string) => {
    const res = await fetch(`https://api.example.com/users/${id}`);
    return res.json();
  },
  { ttl: 60, key: (id) => `user:${id}` }
);

const user = await getUser("42"); // Cache miss — calls the API
const same = await getUser("42"); // Cache hit  — instant, no API call
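Request deduplication means that if getUser("42") is called ten times before the first call resolves, the underlying fetch runs once and all callers share the result. A minimal sketch of that idea (illustrative, not the library's implementation):

```typescript
// Concurrent calls for the same key await one shared in-flight promise
// instead of re-executing fn; the entry is cleared once the call settles.
function dedupe<T>(fn: (key: string) => Promise<T>) {
  const inFlight = new Map<string, Promise<T>>();
  return (key: string): Promise<T> => {
    const existing = inFlight.get(key);
    if (existing) return existing;
    const p = fn(key).finally(() => inFlight.delete(key));
    inFlight.set(key, p);
    return p;
  };
}
```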

Storage Backends

In-Memory (default)

Zero-config, no dependencies. Supports LRU eviction and background TTL cleanup.

import { SmartCache, MemoryStorage } from "@sakib11/smart-cache";

const cache = new SmartCache({
  storage: new MemoryStorage({
    maxEntries: 1000,        // Evict LRU entries beyond this limit (0 = unlimited)
    cleanupIntervalMs: 30000 // Background expired-entry sweep interval
  }),
  defaultTTL: 300, // 5 minutes
});
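The maxEntries behavior can be pictured with JavaScript's Map, which preserves insertion order: a read re-inserts the key to mark it most-recent, and a write past capacity deletes the oldest key. A simplified sketch of LRU eviction (not the backend's actual code):

```typescript
// Simplified LRU using Map insertion order: first key = least recently used.
class TinyLRU<V> {
  private map = new Map<string, V>();
  constructor(private maxEntries: number) {}

  get(key: string): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }

  set(key: string, value: V): void {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Evict the least-recently-used entry (first in insertion order).
      const oldest = this.map.keys().next().value as string;
      this.map.delete(oldest);
    }
  }
}
```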

Redis

Distributed caching via ioredis. Requires ioredis as a peer dependency.

import { SmartCache, RedisStorage } from "@sakib11/smart-cache";

const cache = new SmartCache({
  storage: new RedisStorage({
    host: "localhost",
    port: 6379,
    password: "secret",
    db: 0,
    keyPrefix: "myapp:", // Namespace keys in Redis
  }),
  defaultTTL: 600,
});

Or pass an existing ioredis client:

import Redis from "ioredis";
import { SmartCache, RedisStorage } from "@sakib11/smart-cache";

const redis = new Redis("redis://localhost:6379");

const cache = new SmartCache({
  storage: new RedisStorage({ client: redis }),
});

Filesystem

Persistent local caching that survives process restarts. Each key maps to a JSON file on disk.

import { SmartCache, FileStorage } from "@sakib11/smart-cache";

const cache = new SmartCache({
  storage: new FileStorage({
    directory: "./cache",         // Default: ".smart-cache"
    extension: ".cache.json",     // Default: ".cache.json"
  }),
  defaultTTL: 3600, // 1 hour
});

Custom Storage

Implement the StorageAdapter interface to use any backend (SQLite, DynamoDB, Memcached, etc.):

import { SmartCache, StorageAdapter } from "@sakib11/smart-cache";

const myStorage: StorageAdapter = {
  async get(key)           { /* ... */ },
  async set(key, val, ttl) { /* ... */ },
  async delete(key)        { /* ... */ },
  async has(key)           { /* ... */ },
  async keys(pattern?)     { /* ... */ },
  async clear()            { /* ... */ },
  async size()             { /* ... */ },
  // optional:
  async disconnect()       { /* ... */ },
};

const cache = new SmartCache({ storage: myStorage });

API Reference

new SmartCache(config?)

| Option | Type | Default | Description |
|---|---|---|---|
| storage | StorageAdapter | MemoryStorage | Storage backend |
| defaultTTL | number | 300 | Default TTL in seconds (0 = no expiry) |
| staleWhileRevalidate | number | 0 | SWR window in seconds (0 = disabled) |
| prefix | string | "" | Key prefix for namespacing |
| serializer | Serializer | JSON | Custom serializer |
| logLevel | LogLevel | "silent" | One of "debug", "info", "warn", "error", "silent" |
| logger | Logger | built-in | Custom logger instance |
| gracefulDegradation | boolean | true | Swallow storage errors instead of throwing |
| onHit | function | — | Called on cache hit |
| onMiss | function | — | Called on cache miss |
| onSet | function | — | Called when an entry is stored |
| onDelete | function | — | Called when an entry is deleted |
| onStale | function | — | Called when a stale entry is returned (SWR) |
| onError | function | — | Called on storage errors |
| onEvict | function | — | Called when an entry is evicted (LRU/expired) |

cache.wrap(fn, options?)

Wrap an async function so its results are cached.

const cachedFn = cache.wrap(fn, {
  ttl: 60,                          // Override default TTL
  staleWhileRevalidate: 30,         // Override default SWR
  key: (...args) => `custom:${args[0]}`, // Custom cache key
  name: "getUser",                  // Name for logs/stats
});

Manual Operations

await cache.set("key", value, { ttl: 120 });
const val = await cache.get<MyType>("key"); // undefined on miss
await cache.has("key");                     // boolean
await cache.delete("key");                  // boolean
await cache.invalidate("user:*");           // number of keys deleted
await cache.keys("user:*");                 // string[]
await cache.clear();                        // remove all entries
await cache.size();                         // number
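cache.invalidate("user:*") matches keys against a glob where * stands for any run of characters. One way to express that matching (illustrative; the library's actual matching rules may differ):

```typescript
// Illustrative glob matching for patterns like "user:*":
// escape regex metacharacters, then translate each * into ".*".
function matchesGlob(pattern: string, key: string): boolean {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  const regex = new RegExp(`^${escaped.replace(/\*/g, ".*")}$`);
  return regex.test(key);
}
```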

Batch Operations

const results = await cache.getMany<User>(["user:1", "user:2", "user:3"]);
// Map<string, User | undefined>

await cache.setMany([
  { key: "user:1", value: alice, ttl: 60 },
  { key: "user:2", value: bob },
]);

const deleted = await cache.deleteMany(["user:1", "user:2"]);
// number

Statistics

const stats = cache.getStats();
// { hits, misses, hitRate, staleHits, errors, entries: 0 }

const accurate = await cache.getStatsAsync();
// Same shape, but `entries` is accurately queried from storage

cache.resetStats();

Events

cache.on("hit", ({ key, latencyMs }) => { /* ... */ });
cache.on("miss", ({ key }) => { /* ... */ });
cache.on("set", ({ key, ttl }) => { /* ... */ });
cache.on("delete", ({ key }) => { /* ... */ });
cache.on("stale", ({ key }) => { /* ... */ });
cache.on("error", ({ key, error }) => { /* ... */ });
cache.on("evict", ({ key, reason }) => { /* ... */ });

cache.off("hit", handler); // Remove a listener

Lifecycle

// Gracefully shut down (stop timers, close Redis connections, etc.)
await cache.disconnect();

Stale-While-Revalidate

Return cached data immediately even if it's slightly stale, while refreshing in the background:

const cache = new SmartCache({
  defaultTTL: 60,             // Data is "fresh" for 60s
  staleWhileRevalidate: 300,  // Serve stale data for up to 5 more minutes
});

const getProducts = cache.wrap(fetchProducts, { name: "products" });

// t=0s   — MISS:  calls fetchProducts, caches result
// t=30s  — HIT:   returns cached data (still fresh)
// t=90s  — STALE: returns cached data instantly, refreshes in background
// t=90s+ — HIT:   returns the refreshed data
// t=400s — MISS:  past SWR window, calls fetchProducts again
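The timeline above follows from a freshness check against two windows: fresh within the TTL, stale but servable within the SWR window, expired after both. A sketch of that decision (illustrative, assuming age is seconds since the entry was written):

```typescript
type CacheState = "fresh" | "stale" | "expired";

// fresh   -> serve from cache
// stale   -> serve from cache, refresh in the background
// expired -> cache miss, call the wrapped function
function classify(ageSeconds: number, ttl: number, swr: number): CacheState {
  if (ageSeconds < ttl) return "fresh";
  if (ageSeconds < ttl + swr) return "stale";
  return "expired";
}
```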

Framework Integration Examples

Express

import express from "express";
import { SmartCache } from "@sakib11/smart-cache";

const app = express();
const cache = new SmartCache({ defaultTTL: 60 });

const getUser = cache.wrap(
  async (id: string) => db.users.findById(id),
  { key: (id) => `user:${id}` }
);

app.get("/users/:id", async (req, res) => {
  const user = await getUser(req.params.id);
  res.json(user);
});

NestJS

import { Injectable } from "@nestjs/common";
import { SmartCache } from "@sakib11/smart-cache";

@Injectable()
export class UserService {
  private cache = new SmartCache({ prefix: "users", defaultTTL: 120 });

  private getById = this.cache.wrap(
    async (id: string) => this.prisma.user.findUnique({ where: { id } }),
    { key: (id) => id }
  );

  async findUser(id: string) {
    return this.getById(id);
  }
}

Next.js (App Router)

import { SmartCache, FileStorage } from "@sakib11/smart-cache";

// Persistent cache that survives hot reloads in development
const cache = new SmartCache({
  storage: new FileStorage({ directory: ".next-cache" }),
  defaultTTL: 300,
});

const getPost = cache.wrap(
  async (slug: string) => {
    const res = await fetch(`https://api.example.com/posts/${slug}`);
    return res.json();
  },
  { key: (slug) => `post:${slug}` }
);

export default async function PostPage({ params }: { params: { slug: string } }) {
  const post = await getPost(params.slug);
  return <article>{post.title}</article>;
}

Default Configuration

import { DEFAULT_CONFIG } from "@sakib11/smart-cache";

console.log(DEFAULT_CONFIG);
// {
//   defaultTTL: 300,           // 5 minutes
//   staleWhileRevalidate: 0,   // disabled
//   prefix: "",                // no prefix
//   logLevel: "silent",        // no logging
//   gracefulDegradation: true, // storage errors are swallowed
// }

Logging

Enable logging to see cache hits, misses, and errors:

const cache = new SmartCache({ logLevel: "debug" });
// [smart-cache:debug] MISS {"key":"user:42"}
// [smart-cache:debug] SET  {"key":"user:42","ttl":300}
// [smart-cache:debug] HIT  {"key":"user:42","latencyMs":0}

Or bring your own logger (e.g. pino, winston):

import pino from "pino";

const cache = new SmartCache({
  logger: pino({ name: "cache" }),
});

TypeScript

Full generic support:

interface User {
  id: string;
  name: string;
  email: string;
}

const getUser = cache.wrap<[string], User>(
  async (id: string) => fetchUser(id),
  { key: (id) => `user:${id}` }
);

const user = await getUser("42");
//    ^? User

License

MIT