@sakib11/smart-cache
v1.0.0
A flexible, zero-dependency caching library that automatically caches results of async functions (API calls, DB queries, computations) with multiple pluggable storage backends.
Features
- Pluggable storage backends — in-memory (default), Redis, filesystem, or bring your own
- wrap() any async function — transparently cache return values with a single line
- Configurable TTL — per-entry and global defaults
- Stale-while-revalidate — return cached data instantly while refreshing in the background
- LRU eviction — in-memory backend evicts least-recently-used entries when capacity is reached
- Request deduplication — concurrent calls for the same key share a single execution
- Batch operations — getMany, setMany, deleteMany
- Pattern invalidation — delete keys by glob pattern (user:*)
- Cache statistics — hits, misses, hit rate, stale hits, errors
- Event hooks — onHit, onMiss, onSet, onDelete, onStale, onError, onEvict
- Graceful degradation — storage failures fall through to the original function
- Smart serialization — handles Date, Map, Set, BigInt, RegExp, Error, typed arrays
- Custom serializer support — plug in msgpack, protobuf, or any format
- Full TypeScript support — generics, strict types, .d.ts declarations
- Dual module output — ESM and CommonJS
- Zero runtime dependencies — ioredis is an optional peer dependency
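The request-deduplication behaviour in the feature list is easy to picture with a minimal sketch: concurrent callers for the same key all await one shared in-flight promise. This illustrates the technique, not the library's internals; `dedupe`, `inflight`, and `slow` are names invented for the example.

```typescript
// Sketch of request deduplication: concurrent calls for the same key
// share one in-flight promise instead of each invoking the function.
const inflight = new Map<string, Promise<unknown>>();

async function dedupe<T>(key: string, fn: () => Promise<T>): Promise<T> {
  const pending = inflight.get(key);
  if (pending) return pending as Promise<T>; // join the in-flight call
  const p = fn().finally(() => inflight.delete(key)); // clean up once settled
  inflight.set(key, p);
  return p;
}

// Three concurrent calls, one execution.
let executions = 0;
const slow = async (): Promise<string> => {
  executions++;
  await new Promise((resolve) => setTimeout(resolve, 10));
  return "result";
};

const results = await Promise.all([
  dedupe("user:42", slow),
  dedupe("user:42", slow),
  dedupe("user:42", slow),
]);
console.log(executions, results[0]); // → 1 result
```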
Installation
npm install @sakib11/smart-cache

For Redis support:

npm install @sakib11/smart-cache ioredis

Quick Start
import { SmartCache } from "@sakib11/smart-cache";
const cache = new SmartCache();
// Wrap any async function
const getUser = cache.wrap(
async (id: string) => {
const res = await fetch(`https://api.example.com/users/${id}`);
return res.json();
},
{ ttl: 60, key: (id) => `user:${id}` }
);
const user = await getUser("42"); // Cache miss — calls the API
const same = await getUser("42"); // Cache hit — instant, no API call

Storage Backends
In-Memory (default)
Zero-config, no dependencies. Supports LRU eviction and background TTL cleanup.
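LRU eviction can be sketched with a plain Map, whose insertion order makes "oldest" cheap to find. TinyLRU below is a toy illustration of the idea, not MemoryStorage's actual code:

```typescript
// Toy LRU cache exploiting Map's insertion-order iteration
// (illustrative only; not the library's MemoryStorage implementation).
class TinyLRU<V> {
  private map = new Map<string, V>();

  constructor(private maxEntries: number) {}

  get(key: string): V | undefined {
    if (!this.map.has(key)) return undefined;
    const val = this.map.get(key)!;
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, val);
    return val;
  }

  set(key: string, val: V): void {
    this.map.delete(key);
    this.map.set(key, val);
    if (this.map.size > this.maxEntries) {
      // The first key in iteration order is the least recently used.
      const oldest = this.map.keys().next().value as string;
      this.map.delete(oldest);
    }
  }
}

const lru = new TinyLRU<number>(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a");    // touch "a", so "b" becomes least recently used
lru.set("c", 3); // capacity exceeded: evicts "b"
console.log(lru.get("b"), lru.get("a")); // → undefined 1
```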
import { SmartCache, MemoryStorage } from "@sakib11/smart-cache";
const cache = new SmartCache({
storage: new MemoryStorage({
maxEntries: 1000, // Evict LRU entries beyond this limit (0 = unlimited)
cleanupIntervalMs: 30000 // Background expired-entry sweep interval
}),
defaultTTL: 300, // 5 minutes
});

Redis
Distributed caching via ioredis. Requires ioredis as a peer dependency.
import { SmartCache, RedisStorage } from "@sakib11/smart-cache";
const cache = new SmartCache({
storage: new RedisStorage({
host: "localhost",
port: 6379,
password: "secret",
db: 0,
keyPrefix: "myapp:", // Namespace keys in Redis
}),
defaultTTL: 600,
});

Or pass an existing ioredis client:
import Redis from "ioredis";
import { SmartCache, RedisStorage } from "@sakib11/smart-cache";
const redis = new Redis("redis://localhost:6379");
const cache = new SmartCache({
storage: new RedisStorage({ client: redis }),
});

Filesystem
Persistent local caching that survives process restarts. Each key maps to a JSON file on disk.
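A common way a filesystem backend maps arbitrary keys (which may contain characters like / or :) to safe file names is to hash them. The keyToPath helper below is a hypothetical sketch of that idea; the library's actual naming scheme may differ:

```typescript
import { createHash } from "node:crypto";
import { join } from "node:path";

// Hash the key so characters like "/" or ":" cannot escape the cache
// directory (a sketch; keyToPath is an invented name, and the library's
// real file-naming scheme may differ).
function keyToPath(directory: string, key: string, extension = ".cache.json"): string {
  const digest = createHash("sha256").update(key).digest("hex").slice(0, 32);
  return join(directory, digest + extension);
}

console.log(keyToPath("./cache", "user:42"));
// e.g. "cache/<32 hex chars>.cache.json"
```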
import { SmartCache, FileStorage } from "@sakib11/smart-cache";
const cache = new SmartCache({
storage: new FileStorage({
directory: "./cache", // Default: ".smart-cache"
extension: ".cache.json", // Default: ".cache.json"
}),
defaultTTL: 3600, // 1 hour
});

Custom Storage
Implement the StorageAdapter interface to use any backend (SQLite, DynamoDB, Memcached, etc.):
import { SmartCache, StorageAdapter } from "@sakib11/smart-cache";
const myStorage: StorageAdapter = {
async get(key) { /* ... */ },
async set(key, val, ttl) { /* ... */ },
async delete(key) { /* ... */ },
async has(key) { /* ... */ },
async keys(pattern?) { /* ... */ },
async clear() { /* ... */ },
async size() { /* ... */ },
// optional:
async disconnect() { /* ... */ },
};
const cache = new SmartCache({ storage: myStorage });

API Reference
new SmartCache(config?)
| Option | Type | Default | Description |
|---|---|---|---|
| storage | StorageAdapter | MemoryStorage | Storage backend |
| defaultTTL | number | 300 | Default TTL in seconds (0 = no expiry) |
| staleWhileRevalidate | number | 0 | SWR window in seconds (0 = disabled) |
| prefix | string | "" | Key prefix for namespacing |
| serializer | Serializer | JSON | Custom serializer |
| logLevel | LogLevel | "silent" | One of "debug", "info", "warn", "error", "silent" |
| logger | Logger | built-in | Custom logger instance |
| gracefulDegradation | boolean | true | Swallow storage errors instead of throwing |
| onHit | function | — | Called on cache hit |
| onMiss | function | — | Called on cache miss |
| onSet | function | — | Called when an entry is stored |
| onDelete | function | — | Called when an entry is deleted |
| onStale | function | — | Called when a stale entry is returned (SWR) |
| onError | function | — | Called on storage errors |
| onEvict | function | — | Called when an entry is evicted (LRU/expired) |
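The gracefulDegradation option can be pictured as a try/catch around every storage call: if the backend fails, the error is swallowed (after firing onError) and the original function runs instead. A conceptual sketch, with getOrCompute and brokenStorage as invented names:

```typescript
// Conceptual sketch of graceful degradation: a storage read failure is
// swallowed and the original function is called directly.
async function getOrCompute<T>(
  storage: { get(key: string): Promise<T | undefined> },
  key: string,
  fn: () => Promise<T>,
): Promise<T> {
  try {
    const cached = await storage.get(key);
    if (cached !== undefined) return cached;
  } catch {
    // In the library, the onError hook would fire here.
  }
  return fn(); // fall through to the original function
}

const brokenStorage = {
  async get(): Promise<string | undefined> {
    throw new Error("redis down");
  },
};

const value = await getOrCompute(brokenStorage, "user:42", async () => "fresh");
console.log(value); // → fresh
```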
cache.wrap(fn, options?)
Wrap an async function so its results are cached.
const cachedFn = cache.wrap(fn, {
ttl: 60, // Override default TTL
staleWhileRevalidate: 30, // Override default SWR
key: (...args) => `custom:${args[0]}`, // Custom cache key
name: "getUser", // Name for logs/stats
});

Manual Operations
await cache.set("key", value, { ttl: 120 });
const val = await cache.get<MyType>("key"); // undefined on miss
await cache.has("key"); // boolean
await cache.delete("key"); // boolean
await cache.invalidate("user:*"); // number of keys deleted
await cache.keys("user:*"); // string[]
await cache.clear(); // remove all entries
await cache.size(); // number

Batch Operations
const results = await cache.getMany<User>(["user:1", "user:2", "user:3"]);
// Map<string, User | undefined>
await cache.setMany([
{ key: "user:1", value: alice, ttl: 60 },
{ key: "user:2", value: bob },
]);
const deleted = await cache.deleteMany(["user:1", "user:2"]);
// number

Statistics
const stats = cache.getStats();
// { hits, misses, hitRate, staleHits, errors, entries: 0 }
const statsAsync = await cache.getStatsAsync();
// Same but `entries` is accurately queried from storage
cache.resetStats();

Events
cache.on("hit", ({ key, latencyMs }) => { /* ... */ });
cache.on("miss", ({ key }) => { /* ... */ });
cache.on("set", ({ key, ttl }) => { /* ... */ });
cache.on("delete", ({ key }) => { /* ... */ });
cache.on("stale", ({ key }) => { /* ... */ });
cache.on("error", ({ key, error }) => { /* ... */ });
cache.on("evict", ({ key, reason }) => { /* ... */ });
cache.off("hit", handler); // Remove a listener

Lifecycle
// Gracefully shut down (stop timers, close Redis connections, etc.)
await cache.disconnect();

Stale-While-Revalidate
Return cached data immediately even if it's slightly stale, while refreshing in the background:
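Conceptually, an entry's age decides whether it is fresh, stale-but-servable, or expired. The classify helper below is an invented illustration of that decision, using the same ttl = 60 / swr = 300 numbers as the example that follows:

```typescript
// The freshness decision behind stale-while-revalidate (a conceptual
// sketch; classify is an invented name, not the library's internals).
type Freshness = "fresh" | "stale" | "expired";

function classify(ageSeconds: number, ttl: number, swr: number): Freshness {
  if (ageSeconds <= ttl) return "fresh";       // serve from cache
  if (ageSeconds <= ttl + swr) return "stale"; // serve, refresh in background
  return "expired";                            // treat as a miss
}

console.log(classify(30, 60, 300));  // → fresh
console.log(classify(90, 60, 300));  // → stale
console.log(classify(400, 60, 300)); // → expired
```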
const cache = new SmartCache({
defaultTTL: 60, // Data is "fresh" for 60s
staleWhileRevalidate: 300, // Serve stale data for up to 5 more minutes
});
const getProducts = cache.wrap(fetchProducts, { name: "products" });
// t=0s — MISS: calls fetchProducts, caches result
// t=30s — HIT: returns cached data (still fresh)
// t=90s — STALE: returns cached data instantly, refreshes in background
// t=90s+ — HIT: returns the refreshed data
// t=400s — MISS: past SWR window, calls fetchProducts again

Framework Integration Examples
Express
import express from "express";
import { SmartCache } from "@sakib11/smart-cache";
const app = express();
const cache = new SmartCache({ defaultTTL: 60 });
const getUser = cache.wrap(
async (id: string) => db.users.findById(id),
{ key: (id) => `user:${id}` }
);
app.get("/users/:id", async (req, res) => {
const user = await getUser(req.params.id);
res.json(user);
});

NestJS
import { Injectable } from "@nestjs/common";
import { SmartCache } from "@sakib11/smart-cache";
@Injectable()
export class UserService {
private cache = new SmartCache({ prefix: "users", defaultTTL: 120 });
private getById = this.cache.wrap(
async (id: string) => this.prisma.user.findUnique({ where: { id } }),
{ key: (id) => id }
);
async findUser(id: string) {
return this.getById(id);
}
}

Next.js (App Router)
import { SmartCache, FileStorage } from "@sakib11/smart-cache";
// Persistent cache that survives hot reloads in development
const cache = new SmartCache({
storage: new FileStorage({ directory: ".next-cache" }),
defaultTTL: 300,
});
const getPost = cache.wrap(
async (slug: string) => {
const res = await fetch(`https://api.example.com/posts/${slug}`);
return res.json();
},
{ key: (slug) => `post:${slug}` }
);
export default async function PostPage({ params }: { params: { slug: string } }) {
const post = await getPost(params.slug);
return <article>{post.title}</article>;
}

Default Configuration
import { DEFAULT_CONFIG } from "@sakib11/smart-cache";
console.log(DEFAULT_CONFIG);
// {
// defaultTTL: 300, // 5 minutes
// staleWhileRevalidate: 0, // disabled
// prefix: "", // no prefix
// logLevel: "silent", // no logging
// gracefulDegradation: true, // storage errors are swallowed
// }

Logging
Enable logging to see cache hits, misses, and errors:
const cache = new SmartCache({ logLevel: "debug" });
// [smart-cache:debug] MISS {"key":"user:42"}
// [smart-cache:debug] SET {"key":"user:42","ttl":300}
// [smart-cache:debug] HIT {"key":"user:42","latencyMs":0}

Or bring your own logger (e.g. pino, winston):
import pino from "pino";
const cache = new SmartCache({
logger: pino({ name: "cache" }),
});

TypeScript
Full generic support:
interface User {
id: string;
name: string;
email: string;
}
const getUser = cache.wrap<[string], User>(
async (id: string) => fetchUser(id),
{ key: (id) => `user:${id}` }
);
const user = await getUser("42");
// ^? User