@shkumbinhsn/cache
A type-safe, Redis-backed caching library for Bun with distributed locking, stale-while-revalidate support, and automatic serialization.
Features
- Type-safe caching - Full TypeScript support with typed query keys
- Cache versioning - Instant global invalidation by bumping version on deploy
- Conditional caching - Skip caching based on result (e.g., empty arrays, errors)
- Distributed locking - Prevents cache stampede with Redis-based locks
- Stale-while-revalidate - Serve stale data while refreshing in background
- Graceful degradation - Falls back to direct execution when Redis is down
- Metrics hooks - Built-in onHit/onMiss/onSet callbacks for observability
- Custom serialization - Support for complex data types like Dates, Maps, etc.
- Bun-native - Built specifically for Bun runtime with native Redis support
Installation
bun install @shkumbinhsn/cache
Quick Start
import { CacheClient, cacheOptions, createBunRedisAdapter } from "@shkumbinhsn/cache";
import { RedisClient } from "bun";
// Create Redis client with Bun adapter
const bunRedis = new RedisClient("redis://localhost:6379");
const redis = createBunRedisAdapter(bunRedis);
// Initialize cache client with graceful degradation
const cache = new CacheClient(redis, {
fallbackOnError: true, // Falls back to fn() if Redis is down
onError: (err, op) => console.error(`Cache ${op} failed:`, err),
});
// Create a cached function
const getUser = cache.createFunction((id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => {
// Your expensive operation here
return await db.users.findById(id);
},
ttl: 3600, // 1 hour
})
);
// Use it - automatically cached
const user = await getUser("123");
API Reference
CacheClient
createFunction<TArgs, TReturn>(optionsFn)
Creates a cached function with helper methods.
const getUser = cache.createFunction((id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => fetchUser(id),
ttl: 3600,
staleTime: 300, // Revalidate after 5 minutes
})
);
// Basic usage
const user = await getUser("123");
// Methods available on cached functions:
await getUser.invalidate("123"); // Clear cache for specific args
await getUser.prefetch("123"); // Pre-populate cache
await getUser.getData("123"); // Get cached data (null if miss)
await getUser.getStatus("123"); // Check { exists, ttl, isStale }
await getUser.setData(["123"], data); // Manually set cache (args as array)
await getUser.ensure("123"); // Get or fetch (same as calling the function)
fetch<T>(options)
Fetch with caching using an options object.
const userOptions = (id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => fetchUser(id),
ttl: 3600,
});
const user = await cache.fetch(userOptions("123"));
getData<T>(options | key)
Get cached data without triggering a fetch.
const userOptions = (id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => fetchUser(id),
ttl: 3600,
});
// With full options
const user = await cache.getData(userOptions("123"));
// With just the key (type-safe via DataTag)
const user = await cache.getData(userOptions("123").key);
setData<T>(options | key, updater)
Update cached data with a value or updater function.
const userOptions = (id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => fetchUser(id),
ttl: 3600,
});
// Set directly
await cache.setData(userOptions("123"), { id: "123", name: "John" });
// Update with function (receives old data or null)
await cache.setData(userOptions("123"), (old) => ({
...old!,
name: old!.name + " Updated",
}));
// Using just the key (type-safe via DataTag)
await cache.setData(userOptions("123").key, newData);
invalidate(options | key)
Invalidate a specific cache entry.
await cache.invalidate(userOptions("123"));
await cache.invalidate(userOptions("123").key);
invalidateKey(key: string)
Invalidate by raw cache key string.
await cache.invalidateKey("cache:user:123");
CacheClientOptions
const cache = new CacheClient(redis, {
// Falls back to executing fn() when Redis is unavailable
fallbackOnError: true,
// Optional error handler for logging/monitoring
onError: (error, operation) => {
console.error(`Cache ${operation} failed:`, error);
},
// Cache version - bump on deploy for instant global invalidation
version: "v2", // All keys become: v2:cache:user:123
// Metrics hooks for observability
onHit: (key) => metrics.increment('cache.hit'),
onMiss: (key) => metrics.increment('cache.miss'),
onSet: (key) => metrics.increment('cache.set'),
});
Cache Options
interface CacheOptions<TReturn> {
key: DataTag<TReturn>; // Typed query key
fn: () => Promise<TReturn>; // Function to cache
ttl: number; // Time-to-live in seconds
staleTime?: number; // Stale threshold in seconds
serialize?: (value: TReturn) => string; // Custom serializer
deserialize?: (value: string) => TReturn; // Custom deserializer
lockTimeout?: number; // Lock acquisition timeout (ms)
lockRetryDelay?: number; // Lock retry delay (ms)
shouldCache?: (result: TReturn) => boolean; // Skip caching if it returns false
}
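The lockTimeout and lockRetryDelay options control how long a concurrent caller waits for the distributed lock on a cache miss. A minimal sketch, assuming a hypothetical buildReport() that is slow enough to make waiting worthwhile (the numbers are illustrative, not library defaults):
const getReport = cache.createFunction((id: string) =>
  cacheOptions({
    key: ["report", id] as const,
    fn: async () => buildReport(id), // hypothetical expensive operation
    ttl: 600,
    lockTimeout: 10000, // wait up to 10 seconds to acquire the lock
    lockRetryDelay: 100, // retry the lock every 100ms
  })
);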
DistributedLock
For manual distributed locking:
import { DistributedLock, withLock } from "@shkumbinhsn/cache";
// Using the lock class directly
const lock = new DistributedLock(redis, "resource-key", {
timeout: 5000,
retryDelay: 50,
});
if (await lock.acquire()) {
try {
// Critical section
} finally {
await lock.release();
}
}
// Or use the helper
const result = await withLock(redis, "resource-key", async () => {
// Your operation
return "result";
});
Advanced Usage
Custom Serializers
Handle complex types like Dates:
const getDate = cache.createFunction((timestamp: number) =>
cacheOptions({
key: ["date", timestamp] as const,
fn: async () => new Date(timestamp),
ttl: 3600,
serialize: (d: Date) => d.toISOString(),
deserialize: (s: string) => new Date(s),
})
);
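The same hooks cover other non-JSON types such as Maps. A sketch along the same lines, assuming a hypothetical loadScores() that returns a Map:
const getScores = cache.createFunction((gameId: string) =>
  cacheOptions({
    key: ["scores", gameId] as const,
    fn: async (): Promise<Map<string, number>> => loadScores(gameId), // hypothetical loader
    ttl: 600,
    serialize: (m: Map<string, number>) => JSON.stringify([...m]),
    deserialize: (s: string) => new Map<string, number>(JSON.parse(s)),
  })
);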
Stale-While-Revalidate
Serve stale data while refreshing:
const getData = cache.createFunction((id: string) =>
cacheOptions({
key: ["data", id] as const,
fn: async () => fetchExpensiveData(id),
ttl: 3600, // Cache for 1 hour
staleTime: 300, // Mark stale after 5 minutes
})
);
// First call - fetches and caches
const data1 = await getData("123");
// After 5 minutes - returns stale data, refreshes in background
const data2 = await getData("123");
Cache Versioning
Bump the version on deploy for instant global invalidation:
// Old deployment
const cacheV1 = new CacheClient(redis, { version: "v1" });
// New deployment - old caches are ignored, expire naturally
const cacheV2 = new CacheClient(redis, { version: "v2" });
// All keys become: v2:cache:user:123
Conditional Caching
Skip caching based on the result:
const search = cache.createFunction((query: string) =>
cacheOptions({
key: ["search", query] as const,
fn: async () => searchAPI(query),
ttl: 300,
shouldCache: (result) => result.length > 0, // Don't cache empty results
})
);
// Empty results won't be cached
const results = await search("xyz"); // Executes every time if empty
Cache Options Helper
Use cacheOptions() for reusable option factories with type-safe keys:
import { cacheOptions } from "@shkumbinhsn/cache";
const userOptions = (id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async () => fetchUser(id),
ttl: 3600,
});
// Use with direct methods
const user = await cache.fetch(userOptions("123"));
const data = await cache.getData(userOptions("123").key);
await cache.invalidate(userOptions("123"));
Type-Safe Query Keys
Query keys carry type information via DataTag:
interface User {
id: string;
name: string;
}
const userOptions = (id: string) =>
cacheOptions({
key: ["user", id] as const,
fn: async (): Promise<User> => ({ id, name: "John" }),
ttl: 3600,
});
// Type is inferred from the key's DataTag
const data: User | null = await cache.getData(userOptions("123").key);
Error Handling
The library throws specific error types:
import { CacheError, LockAcquisitionError, SerializationError } from "@shkumbinhsn/cache";
try {
const data = await cache.fetch(options);
} catch (error) {
if (error instanceof LockAcquisitionError) {
// Failed to acquire distributed lock
} else if (error instanceof SerializationError) {
// JSON parse/stringify failed
} else if (error instanceof CacheError) {
// General cache error
}
}
Redis Client Interface
The library expects a Redis client matching this interface:
interface RedisClient {
get(key: string): Promise<string | null>;
set(key: string, value: string, options?: { ex?: number; px?: number; nx?: boolean }): Promise<string | null>;
del(key: string): Promise<number>;
ttl(key: string): Promise<number>;
exists(key: string): Promise<number>;
}
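Because only these five methods are required, it is straightforward to swap in a stand-in for tests. A minimal in-memory sketch (not part of the library; expiry handling is simplified):
const store = new Map<string, { value: string; expiresAt: number | null }>();

const liveEntry = (key: string) => {
  const entry = store.get(key);
  if (!entry) return null;
  if (entry.expiresAt !== null && entry.expiresAt <= Date.now()) {
    store.delete(key); // lazily evict expired entries
    return null;
  }
  return entry;
};

const memoryRedis = {
  async get(key: string) {
    return liveEntry(key)?.value ?? null;
  },
  async set(key: string, value: string, options?: { ex?: number; px?: number; nx?: boolean }) {
    if (options?.nx && liveEntry(key)) return null; // NX: only set if absent
    const ms = options?.px ?? (options?.ex !== undefined ? options.ex * 1000 : null);
    store.set(key, { value, expiresAt: ms !== null ? Date.now() + ms : null });
    return "OK";
  },
  async del(key: string) {
    return store.delete(key) ? 1 : 0;
  },
  async ttl(key: string) {
    const entry = liveEntry(key);
    if (!entry) return -2; // Redis convention: missing key
    if (entry.expiresAt === null) return -1; // no expiry set
    return Math.ceil((entry.expiresAt - Date.now()) / 1000);
  },
  async exists(key: string) {
    return liveEntry(key) ? 1 : 0;
  },
};

const testCache = new CacheClient(memoryRedis, {});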
Using with Bun's RedisClient
Bun's native Redis client uses a different API, so use the provided adapter:
import { createBunRedisAdapter } from "@shkumbinhsn/cache";
import { RedisClient } from "bun";
const bunRedis = new RedisClient("redis://localhost:6379");
const redis = createBunRedisAdapter(bunRedis);
Using with ioredis or node-redis
These clients are directly compatible:
import Redis from "ioredis";
const redis = new Redis("redis://localhost:6379");
// Use directly with CacheClient
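For example, reusing the userOptions factory from the Cache Options Helper section, the ioredis instance is passed straight to the constructor:
const cache = new CacheClient(redis, { fallbackOnError: true });
const user = await cache.fetch(userOptions("123"));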
Compatible with:
- Bun's RedisClient (with adapter)
- ioredis (direct)
- redis (direct)
- Any client implementing the interface
Releasing
This project uses Changesets for versioning and automated releases with OIDC authentication.
Adding Changes
When making changes, add a changeset to document them:
bunx changeset
Select the appropriate semver bump (patch/minor/major) and write a summary.
Release Process
- Changesets creates a "Version Packages" PR automatically
- Review and merge the PR when ready to release
- The release workflow automatically:
  - Publishes to npm with provenance
  - Creates GitHub releases
  - Uses OIDC for secure authentication (no long-lived tokens)
Manual Release (maintainers only)
# Version packages
bun run version
# Publish to npm
bun run release
Testing
# Run tests
bun test
# Type check
bunx tsc --noEmit
License
MIT
