lru-stampede-guard
A high-performance, lightweight async deduplication cache for Node.js.
lru-stampede-guard prevents cache stampedes (also known as dog-piling). When multiple concurrent requests ask for the same resource, the expensive factory function (database query, API call, etc.) is executed only once. All other callers await the same in-flight promise.
Built on top of the extremely fast and minimal tiny-lru.
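For example, a burst of concurrent fetches for the same key results in a single factory execution. A minimal sketch of that behavior (the calls counter and loadUser helper are illustrative, not part of the library):
import { StampedeGuard } from "lru-stampede-guard";
const cache = new StampedeGuard({ max: 100, ttl: 5000 });
let calls = 0;
const loadUser = async (id: string) => {
calls++; // count how often the factory actually runs
return { id, name: "Alice" };
};
async function demo() {
// 50 concurrent fetches for the same key share one in-flight promise
await Promise.all(
Array.from({ length: 50 }, () => cache.fetch("user:1", () => loadUser("1")))
);
console.log(calls); // 1 – the factory ran only once
}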
Features
- Lightweight – Minimal overhead, powered by tiny-lru
- Stampede Protection – Deduplicates concurrent async requests
- Self-Healing – Failed promises are automatically evicted so retries work (see the sketch after this list)
- TypeScript First – Written in TypeScript with full type definitions
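Because failed promises are evicted, a retry for the same key runs the factory again instead of replaying a cached rejection. A minimal sketch of that behavior (the flaky helper is an illustrative assumption):
import { StampedeGuard } from "lru-stampede-guard";
const cache = new StampedeGuard({ max: 100, ttl: 30000 });
let attempts = 0;
const flaky = async () => {
attempts++;
if (attempts === 1) throw new Error("transient failure");
return "ok";
};
async function demo() {
// First call rejects; the failed promise is evicted from the cache
await cache.fetch("job:1", flaky).catch(() => {});
// The retry executes the factory again rather than returning the cached failure
console.log(await cache.fetch("job:1", flaky)); // "ok"
}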
Installation
npm install lru-stampede-guard
Usage
CommonJS (Node.js)
const { StampedeGuard } = require("lru-stampede-guard");
// Initialize the cache
const cache = new StampedeGuard({
max: 1000, // Maximum items in memory
ttl: 60000, // Time-to-live: 60 seconds (ms)
});
// Mock database function
const getUserFromDb = async (id) => {
console.log("⚡ DB Query Executed");
return { id, name: "Alice" };
};
async function handler(userId) {
// fetch(key, factoryFn)
// If 50 requests hit this at once, the DB query runs ONLY ONCE
const user = await cache.fetch(`user:${userId}`, () => getUserFromDb(userId));
return user;
}
TypeScript
import { StampedeGuard } from "lru-stampede-guard";
interface User {
id: number;
name: string;
}
const cache = new StampedeGuard({
max: 500,
ttl: 10000,
});
async function getUser(id: number): Promise<User> {
// Pass a generic to fetch<T>() for strict typing
return cache.fetch<User>(`user:${id}`, async () => {
return await db.query<User>("SELECT * FROM users WHERE id = ?", [id]);
});
}
API Reference
new StampedeGuard(options?)
Creates a new cache instance.
Options
- options.max (number) – Maximum number of items in the cache. Default: 1000
- options.ttl (number) – Time-to-live in milliseconds. Default: 0 (never expires)
fetch(key, factoryFn, ttl?)
Returns the cached value if it exists.
If the value is missing, the factoryFn is executed and its promise is cached.
Concurrent calls for the same key will await the same promise.
Parameters
- key (string) – Unique cache key
- factoryFn (function) – Async function that returns the value
- ttl (number, optional) – Overrides the default TTL for this entry
Returns
Promise<T>
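As a usage sketch, the optional third argument shortens the lifetime of a single hot entry (the fetchRates helper and the 5-second value are illustrative assumptions):
import { StampedeGuard } from "lru-stampede-guard";
const cache = new StampedeGuard({ max: 1000, ttl: 60000 });
// Hypothetical upstream call
async function fetchRates(base: string): Promise<Record<string, number>> {
return { EUR: 0.92, GBP: 0.79 };
}
async function getUsdRates() {
// Per-call ttl (5000 ms) overrides the instance default (60000 ms) for this entry only
return cache.fetch("rates:usd", () => fetchRates("usd"), 5000);
}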
delete(key)
Manually removes a single item from the cache.
cache.delete("user:123");
clear()
Clears all items from the cache.
cache.clear();
Why Use This?
The Problem (Without This Library)
If your server takes 500ms to fetch a user profile and 100 requests arrive within that window, your backend will trigger 100 identical database queries.
This causes:
- Unnecessary load
- CPU spikes
- Database exhaustion
- Increased latency
The Solution (With lru-stampede-guard)
- The first request triggers the database query
- The next 99 requests detect an in-flight promise
- All requests await the same result
- The database is hit once
Result: faster responses, lower load
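The underlying technique is simple: cache the in-flight promise itself, not just the resolved value. The following is a minimal sketch of that general pattern (it is not the library's source and omits LRU eviction and TTL handling):
// Map of in-flight (and recently resolved) promises keyed by cache key
const inFlight = new Map<string, Promise<unknown>>();
function dedupe<T>(key: string, factory: () => Promise<T>): Promise<T> {
const existing = inFlight.get(key);
if (existing) return existing as Promise<T>; // callers 2..N await the same promise
const promise = factory().catch((err) => {
inFlight.delete(key); // evict failures so a later retry can run the factory again
throw err;
});
inFlight.set(key, promise);
return promise;
}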
License
MIT
