herdlock
Promise deduplication library to prevent thundering herd problems. Wrap any async function to ensure concurrent calls with the same key share a single in-flight promise.
Features
- Promise Deduplication: Concurrent calls with the same key share one promise
- Pluggable Caching: Bring your own cache (Redis, node-cache, Map, etc.)
- Retry with Backoff: Configurable retry attempts with fixed or exponential delay
- Error Handling: Custom error handlers for fallback values or error transformation
- Full TypeScript Support: Preserves function signatures and return types
Installation
npm install herdlock
Quick Start
import { herdlock } from 'herdlock';
// Wrap any async function
const lockedFetchUser = herdlock(fetchUser, {
key: (userId) => `user:${userId}`,
});
// Concurrent calls with same key share one promise
const [user1, user2, user3] = await Promise.all([
lockedFetchUser(123),
lockedFetchUser(123),
lockedFetchUser(123),
]);
// fetchUser(123) only called once!
API
herdlock(fn, options)
Wraps an async function with promise deduplication.
Parameters:
- fn: The async function to wrap
- options: Configuration options
Returns: A wrapped function with the same signature
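Because the wrapper preserves the original function's argument and return types, its signature can be sketched roughly as follows (illustrative only; the package's actual type declaration may differ):
// Rough shape of the wrapper (sketch, not the published typings)
declare function herdlock<TArgs extends unknown[], TReturn>(
  fn: (...args: TArgs) => Promise<TReturn>,
  options: HerdLockOptions<TArgs, TReturn>,
): (...args: TArgs) => Promise<TReturn>;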
Options
interface HerdLockOptions<TArgs, TReturn> {
// Required: Generate cache key from function arguments
key: (...args: TArgs) => string;
// Optional: Cache adapter for storing/retrieving results
// When provided, checks cache first, stores result after execution
cache?: CacheAdapter<TReturn>;
// Optional: Retry configuration
retry?: {
attempts: number;
delay?: number | ((attempt: number, error: Error) => number);
when?: (error: Error) => boolean;
};
// Optional: Error handler
onError?: (error: Error, context: { args: TArgs; attempt: number }) => TReturn | void;
}
Examples
Basic Deduplication
const lockedFetch = herdlock(fetchData, {
key: (id) => `data:${id}`,
});
// These run in parallel but only one fetch executes
await Promise.all([
lockedFetch(1),
lockedFetch(1),
lockedFetch(1),
]);
With Caching
// Create a cache adapter with TTL configured
const cache = createRedisCache({ ttlMs: 60000 });
const lockedFetch = herdlock(fetchUser, {
key: (userId) => `user:${userId}`,
cache, // Checks cache first, stores result after
});
// First call: cache miss → execute → store result
await lockedFetch(123);
// Second call: cache hit → return cached (no execution)
await lockedFetch(123);
With Retry
const lockedFetch = herdlock(fetchData, {
key: (id) => `data:${id}`,
retry: {
attempts: 3,
delay: (attempt) => Math.pow(2, attempt) * 1000, // Exponential backoff
when: (error) => error.message.includes('timeout'), // Only retry timeouts
},
});
With Error Handling
const lockedFetch = herdlock(fetchUser, {
key: (userId) => `user:${userId}`,
cache,
onError: (error, { args }) => {
// Return a fallback value (will be cached)
return { id: args[0], name: 'Unknown', error: true };
// Or return void to propagate the error
// Or throw a different error
},
});
Static Keys
// Ignore arguments, always use the same key
const lockedGetConfig = herdlock(getConfig, {
key: () => 'config',
});
Key Uniqueness
Keys are global across all wrapped functions. Include a unique prefix (such as the function name) to avoid collisions:
// GOOD: Unique prefixes prevent collision
const lockedFetchUser = herdlock(fetchUser, {
key: (id) => `fetchUser:${id}`,
});
const lockedFetchPost = herdlock(fetchPost, {
key: (id) => `fetchPost:${id}`,
});
// BAD: Same key format could cause unintended deduplication
const lockedFetchUser = herdlock(fetchUser, {
key: (id) => `${id}`, // Could collide with other functions!
});
Cache Adapters
The cache adapter is responsible for TTL management. Configure TTL when creating the adapter, and get() should return undefined for expired entries.
interface CacheAdapter<T> {
get(key: string): Promise<T | undefined> | T | undefined;
set(key: string, value: T): Promise<void> | void;
}
In-Memory Map with TTL
function createMemoryCache<T>(ttlMs?: number): CacheAdapter<T> {
const cache = new Map<string, { value: T; expires?: number }>();
return {
get(key) {
const entry = cache.get(key);
if (!entry) return undefined;
if (entry.expires && Date.now() > entry.expires) {
cache.delete(key);
return undefined;
}
return entry.value;
},
set(key, value) {
cache.set(key, {
value,
expires: ttlMs ? Date.now() + ttlMs : undefined,
});
},
};
}
// Usage
const cache = createMemoryCache<User>(60000); // 1 minute TTL
ioredis
import Redis from 'ioredis';
function createRedisCache<T>(options: { ttlMs?: number } = {}): CacheAdapter<T> {
const redis = new Redis();
const { ttlMs } = options;
return {
async get(key) {
const data = await redis.get(key);
return data ? JSON.parse(data) : undefined;
},
async set(key, value) {
if (ttlMs) {
await redis.set(key, JSON.stringify(value), 'PX', ttlMs);
} else {
await redis.set(key, JSON.stringify(value));
}
},
};
}
// Usage
const cache = createRedisCache<User>({ ttlMs: 60000 });
node-cache
import NodeCache from 'node-cache';
function createNodeCache<T>(ttlSeconds = 0): CacheAdapter<T> {
const cache = new NodeCache({ stdTTL: ttlSeconds });
return {
get(key) {
return cache.get<T>(key);
},
set(key, value) {
cache.set(key, value);
},
};
}
// Usage
const cache = createNodeCache<User>(60); // 60 seconds TTL
How It Works
- Key Generation: Generate a key from the function arguments
- Cache Check: If cache is configured, check it first
- Deduplication: If a promise with the same key is in-flight, return it
- Execution: Execute the function (with retries if configured)
- Cleanup: Remove from in-flight tracking immediately on settlement
- Caching: Store the result in cache if configured
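Put together, the flow above can be sketched in a few lines (a simplified illustration using the CacheAdapter interface from the Cache Adapters section below, not the library's actual source; retry and onError handling are omitted):
// Simplified sketch of the deduplication flow (illustrative only)
const inFlight = new Map<string, Promise<unknown>>();

async function dedupe<T>(
  key: string,
  fn: () => Promise<T>,
  cache?: CacheAdapter<T>,
): Promise<T> {
  // Cache check: return a stored result without executing
  if (cache) {
    const cached = await cache.get(key);
    if (cached !== undefined) return cached;
  }
  // Deduplication: concurrent callers with the same key share one promise
  const existing = inFlight.get(key);
  if (existing) return existing as Promise<T>;
  const promise = fn()
    // Cleanup: stop tracking as soon as the call settles
    .finally(() => inFlight.delete(key))
    // Caching: store the result for later calls
    .then(async (result) => {
      if (cache) await cache.set(key, result);
      return result;
    });
  inFlight.set(key, promise);
  return promise;
}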
Concurrent Requests:
Client 1 ──┐
Client 2 ──┼──► Share in-flight promise ──► Execute once ──► All get result
Client 3 ──┘
Sequential Requests (with cache):
Call 1 ──► Cache miss ──► Execute ──► Cache result
Call 2 ──► Cache hit ──► Return cached (no execution)
Utility Functions
import { isInFlight, getInFlightCount, clearInFlight } from 'herdlock';
// Check if a promise is in-flight
isInFlight('user:123'); // boolean
// Get count of in-flight promises
getInFlightCount(); // number
// Clear all in-flight promises (useful for testing)
clearInFlight();
Development
Testing
# Unit tests (fast, no dependencies)
npm test
# Integration tests (requires Docker)
npm run test:integration
# All tests
npm run test:all
Integration tests use real Redis in a Docker container to verify cache adapter behavior, TTL expiration, and cross-instance caching.
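As an example, unit tests can reset deduplication state between cases with clearInFlight (a sketch assuming a Vitest-style runner, which is an assumption and not part of this package):
import { beforeEach, expect, test, vi } from 'vitest';
import { herdlock, clearInFlight, getInFlightCount } from 'herdlock';

beforeEach(() => {
  // Start each test with no shared in-flight promises
  clearInFlight();
});

test('concurrent calls with the same key execute once', async () => {
  const fetchUser = vi.fn(async (id: number) => ({ id }));
  const locked = herdlock(fetchUser, { key: (id) => `user:${id}` });
  await Promise.all([locked(1), locked(1), locked(1)]);
  expect(fetchUser).toHaveBeenCalledTimes(1);
  expect(getInFlightCount()).toBe(0); // tracking is cleaned up after settlement
});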
Building
npm run build
Releasing
# Runs tests, builds, updates changelog, bumps version, publishes to npm
npm run release patch|minor|major
License
MIT
