@asaidimu/utils-cache
An intelligent, configurable in-memory cache library for Node.js and browser environments, designed for optimal performance, data consistency, and developer observability.
🚀 Quick Links
- Overview & Features
- Installation & Setup
- Usage Documentation
- Project Architecture
- Development & Contributing
- Additional Information
📦 Overview & Features
Detailed Description
@asaidimu/utils-cache provides a robust, in-memory caching solution designed for applications that require efficient data retrieval, resilience against network failures, and state persistence across sessions or processes. It implements common caching patterns like stale-while-revalidate and Least Recently Used (LRU) eviction, along with advanced features such as automatic retries for failed fetches, an extensible persistence mechanism, and a comprehensive event system for real-time monitoring.
Unlike simpler caches, Cache manages data freshness intelligently, allowing you to serve stale data immediately while a fresh copy is being fetched in the background. Its pluggable persistence layer enables you to save and restore the cache state, making it ideal for client-side applications that need to maintain state offline or server-side applications that need rapid startup with pre-populated data. With built-in metrics and events, @asaidimu/utils-cache offers deep insights into cache performance and lifecycle, ensuring both speed and data integrity.
Key Features
- Configurable In-Memory Store: Provides fast access to cached data with an underlying Map structure.
- Stale-While-Revalidate (SWR): Serve existing data immediately while fetching new data in the background, minimizing perceived latency and improving user experience.
- Automatic Retries with Exponential Backoff: Configurable retry attempts and an exponentially increasing delay between retries for fetchFunction failures, enhancing resilience to transient network issues.
- Pluggable Persistence: Seamlessly integrates with any SimplePersistence implementation (e.g., LocalStorage, IndexedDB via @asaidimu/utils-persistence, or a custom backend) to save and restore cache state across application restarts or sessions.
- Debounced Persistence Writes: Optimizes write frequency to the underlying persistence layer, reducing I/O operations and improving performance.
- Remote Update Handling: Automatically synchronizes cache state when the persistence layer is updated externally by other instances or processes.
- Custom Serialization/Deserialization: Provides options to serialize and deserialize complex data types (e.g., Date, Map, custom classes) for proper storage and retrieval.
- Configurable Eviction Policies:
  - Time-Based (TTL): Automatically evicts entries that haven't been accessed for a specified cacheTime, managing memory efficiently.
  - Size-Based (LRU): Evicts least recently used items when the maxSize limit is exceeded, preventing unbounded memory growth.
- Comprehensive Event System: Subscribe to granular, scoped cache events (e.g., 'cache:read:hit', 'cache:fetch:start', 'cache:data:set') for real-time logging, debugging, analytics, and advanced reactivity. Wildcard subscriptions (e.g., 'cache:read:*') are supported for capturing related events.
- Performance Metrics: Built-in tracking for hits, misses, fetches, errors, evictions, and staleHits, providing insights into cache efficiency with calculated hit rates.
- Flexible Query Management: Register asynchronous fetchFunctions for specific keys, allowing the Cache instance to intelligently manage their data lifecycle, including fetching, caching, and invalidation.
- Imperative Control: Offers direct methods for invalidate (making data stale), prefetch (loading data proactively), refresh (forcing a re-fetch), setData (manual data injection), and remove operations.
- TypeScript Support: Fully typed API for enhanced developer experience, compile-time safety, and autocompletion.
🛠️ Installation & Setup
Prerequisites
- Node.js (v14.x or higher)
- npm, yarn, or bun
Installation Steps
Install @asaidimu/utils-cache and its peer dependency @asaidimu/events:
bun add @asaidimu/utils-cache @asaidimu/events
# or
npm install @asaidimu/utils-cache @asaidimu/events
# or
yarn add @asaidimu/utils-cache @asaidimu/events
Configuration
Cache is initialized with a CacheOptions object, allowing you to customize its behavior globally. Individual queries registered via registerQuery can override these options for specific data keys.
import { Cache } from '@asaidimu/utils-cache';
// Example persistence layer (install separately, e.g., @asaidimu/utils-persistence)
import { IndexedDBPersistence } from '@asaidimu/utils-persistence'; // Example
const myCache = new Cache({
staleTime: 5 * 60 * 1000, // Data considered stale after 5 minutes (5 * 60 * 1000ms)
cacheTime: 30 * 60 * 1000, // Data evicted if not accessed for 30 minutes
retryAttempts: 2, // Retry fetch up to 2 times on failure
retryDelay: 2000, // 2-second initial delay between retries (doubles each attempt)
maxSize: 500, // Maximum 500 entries in cache (LRU eviction)
enableMetrics: true, // Enable performance tracking
// Persistence options (optional but recommended for stateful caches)
persistence: new IndexedDBPersistence('my-app-db'), // Plug in your persistence layer
persistenceId: 'my-app-cache-v1', // Unique ID for this cache instance in persistence
persistenceDebounceTime: 1000, // Debounce persistence writes by 1 second
// Custom serializers/deserializers for non-JSON-serializable data (optional)
serializeValue: (value: any) => {
if (value instanceof Map) return { _type: 'Map', data: Array.from(value.entries()) };
if (value instanceof Date) return { _type: 'Date', data: value.toISOString() };
return value;
},
deserializeValue: (value: any) => {
if (typeof value === 'object' && value !== null) {
if (value._type === 'Map') return new Map(value.data);
if (value._type === 'Date') return new Date(value.data);
}
return value;
},
});
// Negative option values are automatically clamped to 0 with a console warning.
const invalidCache = new Cache({ staleTime: -100, cacheTime: -1, maxSize: -5 });
// console.warn output for each negative value will appear
Verification
To verify that Cache is installed and initialized correctly, you can run a simple test:
import { Cache } from '@asaidimu/utils-cache';
const cache = new Cache();
console.log('Cache initialized successfully!');
// Register a simple query
cache.registerQuery('hello', async () => {
console.log('Fetching "hello" data...');
return 'world';
});
// Try to fetch data
cache.get('hello').then(data => {
console.log(`Fetched 'hello': ${data}`); // Expected: Fetching "hello" data... \n Fetched 'hello': world
}).catch(error => {
console.error('Error fetching:', error);
});
📖 Usage Documentation
Basic Usage
The core of Cache involves registering queries (data fetching functions) and then retrieving data using those queries.
import { Cache } from '@asaidimu/utils-cache';
const myCache = new Cache({
staleTime: 5000, // Data becomes stale after 5 seconds
cacheTime: 60000, // Data will be garbage collected if not accessed for 1 minute
});
// 1. Register a query with a unique string key and an async function to fetch the data.
myCache.registerQuery('user/123', async () => {
console.log('--- Fetching user data from API... ---');
// Simulate network delay
await new Promise(resolve => setTimeout(resolve, 1000));
return { id: 123, name: 'Alice', email: '[email protected]' };
}, { staleTime: 2000 }); // Override staleTime for this specific query to 2 seconds
// 2. Retrieve data from the cache using `get()`.
async function getUserData(label: string) {
console.log(`\n${label}: Requesting user/123`);
const userData = await myCache.get('user/123'); // Default: stale-while-revalidate
console.log(`${label}: User data received:`, userData);
}
// First call: Data is not in cache (miss). Triggers fetch.
getUserData('Initial Call');
// Subsequent calls (within staleTime): Data is returned instantly from cache. No fetch.
setTimeout(() => getUserData('Cached Call'), 500);
// Call after query's staleTime: Data is returned instantly, but a background fetch is triggered.
setTimeout(() => getUserData('Stale & Background Fetch'), 2500);
// Example of waiting for fresh data
async function getFreshUserData() {
console.log('\n--- Requesting FRESH user data (waiting for fetch)... ---');
try {
const freshUserData = await myCache.get('user/123', { waitForFresh: true });
console.log('Fresh user data received:', freshUserData);
} catch (error) {
console.error('Failed to get fresh user data:', error);
}
}
// This will wait for the background fetch triggered by the previous call (if still ongoing) or trigger a new one.
setTimeout(() => getFreshUserData(), 3000);
API Usage
new Cache(defaultOptions?: CacheOptions)
Creates a new Cache instance with global default options.
import { Cache } from '@asaidimu/utils-cache';
const cache = new Cache({
staleTime: 5 * 60 * 1000, // 5 minutes
cacheTime: 30 * 60 * 1000, // 30 minutes
maxSize: 1000,
});
cache.registerQuery<T>(key: string, fetchFunction: () => Promise<T>, options?: CacheOptions): void
Registers a data fetching function associated with a unique key. This fetchFunction will be called when data for the key is not in cache, is stale, or explicitly invalidated/refreshed.
- key: A unique string identifier for the data.
- fetchFunction: An async function that returns a Promise resolving to the data of type T.
- options: Optional CacheOptions to override the instance's default options for this specific query (e.g., a shorter staleTime for frequently changing data).
cache.registerQuery('products/featured', async () => {
const response = await fetch('https://api.example.com/products/featured');
if (!response.ok) throw new Error('Failed to fetch featured products');
return response.json();
}, {
staleTime: 60 * 1000, // This query's data is stale after 1 minute
retryAttempts: 5, // It will retry fetching up to 5 times
});
cache.get<T>(key: string, options?: { waitForFresh?: boolean; throwOnError?: boolean }): Promise<T | undefined>
Retrieves data for a given key.
- If data is fresh, returns it immediately.
- If data is stale (and waitForFresh is false or unset), returns it immediately and triggers a background refetch (stale-while-revalidate).
- If data is not in cache (miss), it triggers a fetch.

- waitForFresh: If true, the method will await the fetchFunction to complete and return fresh data. If false (default), it will return existing stale data immediately if available, otherwise undefined while a fetch is ongoing in the background.
- throwOnError: If true, and the fetchFunction fails after all retries, the promise returned by get will reject with the error. If false (default), it will return undefined on fetch failure, or the last successfully fetched data if available.
// Basic usage (stale-while-revalidate)
const post = await cache.get('posts/latest');
// Wait for fresh data, throw if fetch fails
try {
const userProfile = await cache.get('user/profile', { waitForFresh: true, throwOnError: true });
console.log('Latest user profile:', userProfile);
} catch (error) {
console.error('Could not get fresh user profile due to an error:', error);
}
cache.peek<T>(key: string): T | undefined
Retrieves data from the cache without triggering any fetches, updating lastAccessed time, or accessCount. Useful for quick synchronous checks.
const cachedValue = cache.peek('some-config-key');
if (cachedValue) {
console.log('Value is in cache:', cachedValue);
} else {
console.log('Value not found in cache.');
}
cache.has(key: string): boolean
Checks if a non-stale, non-loading entry exists in the cache for the given key.
if (cache.has('config/app')) {
console.log('App config is ready and fresh.');
} else {
console.log('App config is missing, stale, or currently loading.');
}
cache.invalidate(key: string, refetch = true): Promise<void>
Marks a specific cache entry as stale, forcing the next get call for that key to trigger a refetch. Optionally triggers an immediate background refetch.
- key: The cache key to invalidate.
- refetch: If true (default), triggers an immediate background fetch for the invalidated key using its registered fetchFunction.
// After updating a user, invalidate their profile data to ensure next fetch is fresh
await cache.invalidate('user/123/profile');
// Invalidate and don't refetch until `get` is explicitly called later
await cache.invalidate('admin/dashboard/stats', false);
cache.invalidatePattern(pattern: RegExp, refetch = true): Promise<void>
Invalidates all cache entries whose keys match the given regular expression. Similar to invalidate, it optionally triggers immediate background refetches for all matched keys.
- pattern: A RegExp object to match against cache keys.
- refetch: If true (default), triggers immediate background fetches for all matched keys.
// Invalidate all product-related data (e.g., after a mass product update)
await cache.invalidatePattern(/^products\//); // Matches 'products/1', 'products/list', etc.
// Invalidate all items containing 'temp' in their key, without immediate refetch
await cache.invalidatePattern(/temp/, false);
cache.prefetch(key: string): Promise<void>
Triggers a background fetch for a key if it's not already in cache or is stale. Useful for loading data proactively before it's explicitly requested.
// On application startup or route change, prefetch common data
cache.prefetch('static-content/footer');
cache.prefetch('user/notifications/unread');
cache.refresh<T>(key: string): Promise<T | undefined>
Forces a re-fetch of data for a given key, bypassing staleness checks and any existing fetch promises. This ensures you always get the latest data. Returns the fresh data or undefined if the fetch ultimately fails.
// After an API call modifies a resource, force update its cached version
const updatedUser = await cache.refresh('user/current');
console.log('User data refreshed:', updatedUser);
cache.setData<T>(key: string, data: T): void
Manually sets or updates data in the cache for a given key. This immediately updates the cache entry, marks it as fresh (by setting lastUpdated to Date.now()), and triggers persistence if configured. It bypasses any registered fetchFunction.
// Manually update a shopping cart item count after a local UI interaction
cache.setData('cart/item-count', 5);
// Directly inject data fetched from another source or computed locally
const localConfig = { theme: 'dark', fontSize: 'medium' };
cache.setData('app/settings', localConfig);
cache.remove(key: string): boolean
Removes a specific entry from the cache. Returns true if an entry was found and removed, false otherwise. Also clears any ongoing fetches for that key and triggers persistence.
// When a user logs out, remove their specific session data
cache.remove('user/session');
cache.on<EType extends CacheEventType>(event: EType, listener: (ev: Extract<CacheEvent, { type: EType }>) => void): () => void
Subscribes a listener function to specific cache events. Returns an unsubscribe function.
- event: The type of event to listen for (e.g., 'cache:read:hit', 'cache:fetch:error'). Wildcards like 'cache:read:*' are supported. See CacheEventType in types.ts for all available types.
- listener: A callback function that receives the specific event payload for the subscribed event type.
import { Cache, CacheEvent, CacheEventType } from '@asaidimu/utils-cache';
const myCache = new Cache();
const unsubscribeHit = myCache.on('cache:read:hit', (e) => {
console.log(`[CacheEvent] HIT for ${e.key} (isStale: ${e.isStale})`);
});
myCache.on('cache:read:miss', (e) => {
console.log(`[CacheEvent] MISS for ${e.key}`);
});
myCache.on('cache:fetch:error', (e) => {
console.error(`[CacheEvent] ERROR for ${e.key} (attempt ${e.attempt}):`, e.error.message);
});
myCache.on('cache:persistence:save:success', (e) => {
console.log(`[CacheEvent] Persistence: Cache state saved successfully for ID: ${e.key}`);
});
myCache.on('cache:persistence:load:error', (e) => {
console.error(`[CacheEvent] Persistence: Failed to load cache state for ID: ${e.key}`, e.error);
});
// For demonstration, register a query and trigger events
myCache.registerQuery('demo-item', async () => {
console.log('--- Fetching demo-item ---');
await new Promise(r => setTimeout(r, 200));
return 'demo-data';
}, { staleTime: 100 });
myCache.get('demo-item'); // Triggers miss, fetch, set_data
setTimeout(() => myCache.get('demo-item'), 50); // Triggers hit
setTimeout(() => myCache.get('demo-item'), 150); // Triggers stale hit, background fetch
// To unsubscribe from a specific event later:
unsubscribeHit();
cache.getStats(): { size: number; metrics: CacheMetrics; hitRate: number; staleHitRate: number; entries: Array<{ key: string; lastAccessed: number; lastUpdated: number; accessCount: number; isStale: boolean; isLoading?: boolean; error?: boolean }> }
Returns current cache statistics and detailed metrics.
- size: Number of active entries in the cache.
- metrics: An object containing raw counts (hits, misses, fetches, errors, evictions, staleHits).
- hitRate: Ratio of hits to total requests (hits + misses).
- staleHitRate: Ratio of stale hits to total hits.
- entries: An array of objects providing details for each cached item (key, lastAccessed, lastUpdated, accessCount, isStale, isLoading, error status).
const stats = myCache.getStats();
console.log('Cache Size:', stats.size);
console.log('Metrics:', stats.metrics);
console.log('Overall Hit Rate:', (stats.hitRate * 100).toFixed(2) + '%');
console.log('Entries details:', stats.entries);
cache.clear(): Promise<void>
Clears all data from the in-memory cache, resets metrics, and attempts to clear the associated persisted state via the persistence layer.
console.log('Clearing cache...');
await myCache.clear();
console.log('Cache cleared. Current size:', myCache.getStats().size);
cache.destroy(): void
Shuts down the cache instance, clearing all data, stopping the automatic garbage collection timer, unsubscribing from persistence updates, and clearing all internal maps. Call this when the cache instance is no longer needed (e.g., on application shutdown or component unmount) to prevent memory leaks and ensure proper cleanup.
myCache.destroy();
console.log('Cache instance destroyed. All timers stopped and data cleared.');
Configuration Examples
The CacheOptions interface provides extensive control over the cache's behavior:
import { CacheOptions, SimplePersistence, SerializableCacheState } from '@asaidimu/utils-cache';
// A mock persistence layer for demonstration purposes.
// In a real application, you'd use an actual implementation like IndexedDBPersistence.
class MockPersistence implements SimplePersistence<SerializableCacheState> {
private store = new Map<string, SerializableCacheState>();
private subscribers = new Map<string, Array<(data: SerializableCacheState) => void>>();
async get(id: string): Promise<SerializableCacheState | undefined> {
console.log(`[MockPersistence] Getting state for ID: ${id}`);
return this.store.get(id);
}
async set(id: string, data: SerializableCacheState): Promise<void> {
console.log(`[MockPersistence] Setting state for ID: ${id}`);
this.store.set(id, data);
// Simulate remote update notification to all subscribed instances
this.subscribers.get(id)?.forEach(cb => cb(data));
}
async clear(id?: string): Promise<void> {
console.log(`[MockPersistence] Clearing state ${id ? 'for ID: ' + id : '(all)'}`);
if (id) {
this.store.delete(id);
} else {
this.store.clear();
}
}
subscribe(id: string, callback: (data: SerializableCacheState) => void): () => void {
console.log(`[MockPersistence] Subscribing to ID: ${id}`);
if (!this.subscribers.has(id)) {
this.subscribers.set(id, []);
}
this.subscribers.get(id)?.push(callback);
// Return unsubscribe function
return () => {
const callbacks = this.subscribers.get(id);
if (callbacks) {
this.subscribers.set(id, callbacks.filter(cb => cb !== callback));
}
console.log(`[MockPersistence] Unsubscribed from ID: ${id}`);
};
}
}
const fullOptions: CacheOptions = {
staleTime: 1000 * 60 * 5, // 5 minutes: After this time, data is stale; a background fetch is considered.
cacheTime: 1000 * 60 * 60, // 1 hour: Items idle (not accessed) for this long are eligible for garbage collection.
retryAttempts: 3, // Max 3 fetch attempts (initial + 2 retries) on network/fetch failures.
retryDelay: 1000, // 1 second initial delay for retries (doubles each subsequent attempt).
maxSize: 2000, // Keep up to 2000 entries; LRU eviction kicks in beyond this limit.
enableMetrics: true, // Enable performance tracking (hits, misses, fetches, etc.).
persistence: new MockPersistence(), // Provide an instance of your persistence layer implementation.
persistenceId: 'my-unique-cache-instance', // A unique identifier for this cache instance within the persistence store.
persistenceDebounceTime: 750, // Wait 750ms after a cache change before writing to persistence to batch writes.
// Custom serializers/deserializers for data that isn't natively JSON serializable (e.g., Maps, Dates, custom classes).
serializeValue: (value: any) => {
// Example: Convert Date objects to ISO strings for JSON serialization
if (value instanceof Date) {
return { _type: 'Date', data: value.toISOString() };
}
// Example: Convert Map objects to an array for JSON serialization
if (value instanceof Map) {
return { _type: 'Map', data: Array.from(value.entries()) };
}
return value; // Return as is for other types
},
deserializeValue: (value: any) => {
// Example: Convert ISO strings back to Date objects
if (typeof value === 'object' && value !== null && value._type === 'Date') {
return new Date(value.data);
}
// Example: Convert array back to Map objects
if (typeof value === 'object' && value !== null && value._type === 'Map') {
return new Map(value.data);
}
return value; // Return as is for other types
},
};
const configuredCache = new Cache(fullOptions);
Common Use Cases
Caching API Responses with SWR (Stale-While-Revalidate)
This is the default and most common pattern, where you prioritize immediate responsiveness while ensuring data freshness in the background.
import { Cache } from '@asaidimu/utils-cache';
const apiCache = new Cache({
staleTime: 5 * 60 * 1000, // Data considered stale after 5 minutes
cacheTime: 30 * 60 * 1000, // Idle data garbage collected after 30 minutes
retryAttempts: 3, // Retry fetching on network failures
});
// Register a query for a list of blog posts
apiCache.registerQuery('blog/posts', async () => {
console.log('--- Fetching ALL blog posts from API... ---');
const response = await fetch('https://api.example.com/blog/posts');
if (!response.ok) throw new Error('Failed to fetch blog posts');
return response.json();
});
// Function to display blog posts
async function displayBlogPosts(source: string) {
console.log(`\nDisplaying blog posts from: ${source}`);
const posts = await apiCache.get('blog/posts'); // Uses SWR by default
if (posts) {
console.log(`Received ${posts.length} posts (first 2):`, posts.slice(0, 2).map((p: any) => p.title));
} else {
console.log('No posts yet, waiting for initial fetch...');
}
}
displayBlogPosts('Initial Load'); // First `get`: cache miss, triggers fetch.
setTimeout(() => displayBlogPosts('After 1 sec (cached)'), 1000); // Second `get`: cache hit, returns instantly.
setTimeout(() => displayBlogPosts('After 6 mins (stale & background fetch)'), 6 * 60 * 1000); // After `staleTime`: returns cached, triggers background fetch.
Using waitForFresh for Critical Data
For scenarios where serving outdated data is unacceptable (e.g., user permissions, critical configuration).
import { Cache } from '@asaidimu/utils-cache';
const criticalCache = new Cache({ retryAttempts: 5, retryDelay: 1000 });
criticalCache.registerQuery('user/permissions', async () => {
console.log('--- Fetching user permissions from API... ---');
// Simulate potential network flakiness
if (Math.random() > 0.7) {
throw new Error('Network error during permission fetch!');
}
await new Promise(resolve => setTimeout(resolve, 500));
return { canEdit: true, canDelete: false, roles: ['user', 'editor'] };
});
async function checkPermissionsBeforeAction() {
console.log('\nAttempting to get FRESH user permissions...');
try {
// We MUST have the latest permissions before proceeding with a sensitive action
const permissions = await criticalCache.get('user/permissions', { waitForFresh: true, throwOnError: true });
console.log('User permissions received:', permissions);
// Proceed with action based on permissions
} catch (error) {
console.error('CRITICAL: Failed to load user permissions:', error);
// Redirect to error page, show critical alert, or disable functionality
}
}
checkPermissionsBeforeAction();
// You might call this repeatedly in a test scenario to see retries and eventual success/failure
setInterval(() => checkPermissionsBeforeAction(), 3000);
Real-time Monitoring with Events
Utilize the comprehensive event system to log, monitor, or react to cache lifecycle events.
import { Cache } from '@asaidimu/utils-cache';
const monitorCache = new Cache({ enableMetrics: true });
monitorCache.registerQuery('stock/AAPL', async () => {
const price = Math.random() * 100 + 150;
console.log(`--- Fetching AAPL price: ${price.toFixed(2)} ---`);
return { symbol: 'AAPL', price: parseFloat(price.toFixed(2)), timestamp: Date.now() };
}, { staleTime: 1000 }); // Very short staleTime for frequent fetches
// Subscribe to various cache events
monitorCache.on('cache:fetch:start', (e) => {
console.log(`[EVENT] Fetching ${e.key} (attempt ${e.attempt})`);
});
monitorCache.on('cache:read:hit', (e) => {
console.log(`[EVENT] Cache hit for ${e.key}. Stale: ${e.isStale}`);
});
monitorCache.on('cache:read:miss', (e) => {
console.log(`[EVENT] Cache miss for ${e.key}`);
});
monitorCache.on('cache:data:evict', (e) => {
console.log(`[EVENT] Evicted ${e.key} due to ${e.reason}`);
});
monitorCache.on('cache:data:set', (e) => {
console.log(`[EVENT] Data for ${e.key} manually set. Old price: ${e.oldData?.price}, New price: ${e.newData.price}`);
});
monitorCache.on('cache:persistence:save:success', (e) => {
console.log(`[EVENT] Persistence: ${e.message || 'Save successful'}`);
});
// Continuously try to get data (will trigger fetches due to short staleTime)
setInterval(() => {
monitorCache.get('stock/AAPL');
}, 500);
// Manually set data to trigger 'set_data' and 'persistence' events
setTimeout(() => {
monitorCache.setData('stock/AAPL', { symbol: 'AAPL', price: 160.00, timestamp: Date.now() });
}, 3000);
// Log cache statistics periodically
setInterval(() => {
const stats = monitorCache.getStats();
console.log(`\n--- CACHE STATS ---`);
console.log(`Size: ${stats.size}, Hits: ${stats.metrics.hits}, Misses: ${stats.metrics.misses}, Fetches: ${stats.metrics.fetches}`);
console.log(`Hit Rate: ${(stats.hitRate * 100).toFixed(2)}%, Stale Hit Rate: ${(stats.staleHitRate * 100).toFixed(2)}%`);
console.log(`Active entries: ${stats.entries.map(e => `${e.key} (stale:${e.isStale})`).join(', ')}`);
console.log(`-------------------\n`);
}, 5000); // Log stats every 5 seconds
🏗️ Project Architecture
The @asaidimu/utils-cache library is structured to provide a clear separation of concerns, making it modular, testable, and extensible.
Directory Structure
src/cache/
├── cache.ts # Main Cache class implementation
├── index.ts # Entry point for the module (re-exports Cache)
├── types.ts # TypeScript interfaces and types for options, entries, events, etc.
└── cache.test.ts # Unit tests for the Cache class
package.json          # Package metadata and dependencies for this specific module
Core Components
- Cache class (cache.ts): The central component of the library. It orchestrates all caching logic, including:
  - Managing the in-memory Map (this.cache) that stores CacheEntry objects.
  - Handling data fetching, retries, and staleness checks.
  - Implementing time-based (TTL) and size-based (LRU) garbage collection.
  - Integrating with the pluggable persistence layer.
  - Emitting detailed cache events.
  - Tracking performance metrics.
- CacheOptions (types.ts): An interface defining the configurable parameters for a Cache instance or individual queries. This includes staleTime, cacheTime, retryAttempts, maxSize, persistence settings, and custom serialization/deserialization functions.
- CacheEntry (types.ts): Represents a single item stored within the cache. It encapsulates the actual data, lastUpdated and lastAccessed timestamps, accessCount, and flags like isLoading or error status.
- QueryConfig (types.ts): Stores the fetchFunction and the resolved CacheOptions (merged with instance defaults) for each registered query, enabling tailored behavior per data key.
- CacheMetrics (types.ts): Defines the structure for tracking cache performance statistics, including hits, misses, fetches, errors, and evictions.
- SimplePersistence<SerializableCacheState> (from @asaidimu/utils-persistence): An external interface that Cache relies on for persistent storage. It requires implementations of get(), set(), clear(), and optionally subscribe() methods to handle data serialization and deserialization for the specific storage medium (e.g., IndexedDB, LocalStorage, or a remote backend).
- CacheEvent / CacheEventType (types.ts): A union type defining all possible events emitted by the cache (e.g., 'cache:read:hit', 'cache:fetch:start', 'cache:data:evict'). This enables a fine-grained, scoped observability model for the cache's lifecycle.
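For orientation, here is a rough TypeScript sketch of the shapes these components describe. The field names follow the descriptions above and the getStats() output, but the interface names and exact types are illustrative assumptions; the authoritative definitions live in src/cache/types.ts.

// Illustrative sketch only — see src/cache/types.ts for the real definitions.
interface CacheEntrySketch<T> {
  data?: T;             // the cached value
  lastUpdated: number;  // timestamp of the last successful fetch or setData()
  lastAccessed: number; // timestamp of the last read (drives TTL and LRU eviction)
  accessCount: number;  // number of reads for this entry
  isLoading?: boolean;  // true while a fetch is in flight
  error?: unknown;      // last fetch error, if any
}
interface CacheMetricsSketch {
  hits: number;
  misses: number;
  fetches: number;
  errors: number;
  evictions: number;
  staleHits: number;
}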
Data Flow
Initialization:
- The Cache constructor sets up global default options, initializes performance metrics, and starts the automatic garbage collection timer.
- If a persistence layer is configured, it attempts to load a previously saved state using persistence.get(), emitting 'cache:persistence:load:success' or 'cache:persistence:load:error'.
- It then subscribes to persistence.subscribe() (if available) to listen for remote state changes from the underlying storage, ensuring cache consistency across multiple instances or processes.
registerQuery:
- When registerQuery(key, fetchFunction, options) is called, the fetchFunction and its specific options (merged with the global defaultOptions) are stored internally in the this.queries map. This prepares the cache to handle requests for that key.
get Request:
- When get(key, options) is invoked, Cache first checks this.cache for an existing CacheEntry for the key.
- Cache Hit: If an entry exists, lastAccessed and accessCount are updated, a 'cache:read:hit' event is emitted, and metrics are incremented. The entry's staleness is evaluated based on staleTime.
  - If waitForFresh is true OR the entry is stale/loading, it proceeds to fetchAndWait.
  - If waitForFresh is false (default) and the entry is stale, the cached data is returned immediately, and a background fetch is triggered to update the data.
  - If waitForFresh is false and the entry is fresh, the cached data is returned immediately.
- Cache Miss: If no entry exists, a 'cache:read:miss' event is emitted. A placeholder CacheEntry (marked isLoading) is created, and a fetch is immediately triggered to retrieve the data.
fetch / fetchAndWait:
- These methods ensure that only one fetchFunction runs concurrently for a given key by tracking ongoing fetches in this.fetching.
- They delegate the actual data retrieval and retry logic to performFetchWithRetry.
performFetchWithRetry:
- This is where the registered fetchFunction is executed. It attempts to call the fetchFunction multiple times (up to retryAttempts) with exponential backoff (retryDelay).
- Before each attempt, a 'cache:fetch:start' event is emitted, and fetches metrics are updated.
- On Success: The CacheEntry is updated with the new data and lastUpdated timestamp, and its isLoading status is set to false. The cache then calls schedulePersistState() to save the updated state and enforceSizeLimit() to maintain the maxSize.
- On Failure: If the fetchFunction fails, a 'cache:fetch:error' event is emitted, and errors metrics are updated. If retryAttempts remain, it waits (delay) and retries. After all attempts, the CacheEntry is updated with the last error, isLoading is set to false, and schedulePersistState() is called.
schedulePersistState:
- This method debounces write operations to the persistence layer. It prevents excessive writes by waiting for a configurable persistenceDebounceTime before serializing the current cache state (using serializeCache and serializeValue) and writing it via persistence.set(). Appropriate persistence events ('cache:persistence:save:success' / 'cache:persistence:save:error') are emitted.
handleRemoteStateChange:
- This callback is invoked by the persistence layer's subscribe mechanism when an external change to the persisted state is detected. It deserializes the remote state (using deserializeValue) and intelligently updates the local this.cache to reflect these external changes, emitting a 'cache:persistence:sync' event.
garbageCollect:
- Running on a setInterval timer (gcTimer), this method periodically scans this.cache. It removes any CacheEntry that has not been lastAccessed for longer than its (or the global) cacheTime, emitting 'cache:data:evict' events.
enforceSizeLimit:
- Triggered after successful data updates (fetch success or setData). If the cache size exceeds maxSize, it evicts the Least Recently Used (LRU) entries until the maxSize is satisfied, emitting 'cache:data:evict' events.
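The deduplication and retry behavior described above can be observed through the public API alone. The following is a small sketch under illustrative assumptions (the 'dedup-demo' key, timings, and retry settings are arbitrary): if concurrent fetches for the same key are shared as described, the fetchFunction should run only once for the three parallel get() calls.

import { Cache } from '@asaidimu/utils-cache';

// retryDelay doubles after each failed attempt: 500ms, then 1000ms, then 2000ms, ...
const demoCache = new Cache({ retryAttempts: 3, retryDelay: 500 });

let invocations = 0;
demoCache.registerQuery('dedup-demo', async () => {
  invocations += 1;
  await new Promise(resolve => setTimeout(resolve, 100)); // simulated latency
  return { value: 42 };
});

// Three concurrent reads of the same key should share a single in-flight fetch.
await Promise.all([
  demoCache.get('dedup-demo', { waitForFresh: true }),
  demoCache.get('dedup-demo', { waitForFresh: true }),
  demoCache.get('dedup-demo', { waitForFresh: true }),
]);
console.log(`fetchFunction ran ${invocations} time(s)`); // expected: 1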
Extension Points
The design of @asaidimu/utils-cache provides several powerful extension points for customization and integration:
- SimplePersistence interface: This is the primary mechanism for integrating Cache with various storage backends. By implementing this interface, you can use Cache with localStorage, IndexedDB (e.g., via @asaidimu/utils-persistence), a custom database, a server-side cache, or any other persistent storage solution. A rough sketch follows this list.
- serializeValue / deserializeValue options: These functions within CacheOptions allow you to define custom logic for how your specific data types are converted to and from a serializable format (e.g., JSON-compatible strings or objects) before being passed to and received from the persistence layer. This is crucial for handling Date objects, Maps, Sets, or custom class instances.
- Event listeners (on): The comprehensive event system, powered by @asaidimu/events, allows you to subscribe to a wide range of cache lifecycle events. This enables powerful integrations for:
  - Logging: Detailed logging of cache activity (hits, misses, errors, evictions).
  - Analytics: Feeding cache performance metrics into an analytics platform.
  - UI Reactivity: Updating UI components in response to cache changes (e.g., showing a "stale data" indicator or a "refreshing" spinner).
  - Debugging: Gaining deep insights into cache behavior during development.
  - External Synchronization: Triggering side effects or synchronizing with other systems based on cache events.
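As a rough illustration of the SimplePersistence extension point, the sketch below outlines a browser-only localStorage adapter. It mirrors the MockPersistence shape shown earlier; the class name and the storage-event wiring are illustrative assumptions, and a maintained implementation (e.g., from @asaidimu/utils-persistence) is preferable in practice.

import { Cache, SimplePersistence, SerializableCacheState } from '@asaidimu/utils-cache';

// Browser-only sketch, not a production implementation.
class LocalStoragePersistence implements SimplePersistence<SerializableCacheState> {
  async get(id: string): Promise<SerializableCacheState | undefined> {
    const raw = localStorage.getItem(id);
    return raw ? JSON.parse(raw) : undefined;
  }
  async set(id: string, data: SerializableCacheState): Promise<void> {
    localStorage.setItem(id, JSON.stringify(data));
  }
  async clear(id?: string): Promise<void> {
    if (id) localStorage.removeItem(id);
    else localStorage.clear();
  }
  // Surfaces writes made by other tabs via the browser 'storage' event.
  subscribe(id: string, callback: (data: SerializableCacheState) => void): () => void {
    const handler = (event: StorageEvent) => {
      if (event.key === id && event.newValue) callback(JSON.parse(event.newValue));
    };
    window.addEventListener('storage', handler);
    return () => window.removeEventListener('storage', handler);
  }
}

const persistedCache = new Cache({
  persistence: new LocalStoragePersistence(),
  persistenceId: 'my-app-cache-v1',
});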
🤝 Development & Contributing
We welcome contributions to @asaidimu/utils-cache! Whether it's a bug fix, a new feature, or an improvement to the documentation, your help is appreciated.
Development Setup
To set up the development environment for @asaidimu/utils-cache:
- Clone the monorepo:
  git clone https://github.com/asaidimu/erp-utils.git
  cd erp-utils
- Navigate to the cache package:
  cd src/cache
- Install dependencies:
  npm install # or yarn install # or bun install
- Build the project:
  npm run build # or yarn build # or bun run build
Scripts
The following npm scripts are typically available in this project's setup:
- npm run build: Compiles TypeScript source files from src/ to JavaScript output in dist/.
- npm run test: Runs the test suite using Vitest.
- npm run test:watch: Runs tests in watch mode for continuous feedback during development.
- npm run lint: Runs ESLint to check for code style and potential errors.
- npm run format: Formats code using Prettier according to the project's style guidelines.
Testing
Tests are written using Vitest. To run tests:
npm test
# or
yarn test
# or
bun test
We aim for high test coverage. Please ensure that new features or bug fixes come with appropriate unit and/or integration tests to maintain code quality and prevent regressions.
Contributing Guidelines
Please follow these steps to contribute:
- Fork the repository on GitHub.
- Create a new branch for your feature or bug fix: git checkout -b feature/my-awesome-feature or bugfix/resolve-issue-123.
- Make your changes, ensuring they adhere to the existing code style and architecture.
- Write or update tests to cover your changes and ensure existing functionality is not broken.
- Ensure all tests pass locally by running npm test.
- Run lint and format checks (npm run lint and npm run format) and fix any reported issues.
- Write clear, concise commit messages following the Conventional Commits specification (e.g., feat: add new caching strategy, fix: correct staleTime calculation).
- Push your branch to your fork.
- Open a Pull Request to the main branch of the original repository. Provide a detailed description of your changes and why they are necessary.
Issue Reporting
Found a bug, have a feature request, or need clarification? Please open an issue on our GitHub Issues page.
When reporting a bug, please include:
- A clear and concise description of the issue.
- Detailed steps to reproduce the behavior.
- The expected behavior.
- Any relevant screenshots or code snippets.
- Your environment details (Node.js version, OS, browser, package version).
📚 Additional Information
Troubleshooting
- "No query registered for key: [key]" Error:
- Cause: This error occurs if you try to
get(),prefetch(), orrefresh()akeythat has not been previously associated with afetchFunctionusingcache.registerQuery(). - Solution: Ensure you call
cache.registerQuery(key, fetchFunction)for everykeyyou intend to use with the cache before attempting to retrieve data.
- Cause: This error occurs if you try to
- Data not persisting:
- Cause: The cache state is not being correctly saved or loaded from the underlying storage.
- Solution:
persistenceinstance: Double-check that you are passing a validSimplePersistenceinstance to theCacheconstructor'spersistenceoption.persistenceId: Ensure you've provided a uniquepersistenceIdif multiple cache instances share the same persistence layer.- Serialization: Verify that your data types are correctly handled by
serializeValueanddeserializeValueoptions, especially for non-JSON-serializable types likeMaps,Dateobjects, or custom classes. - Persistence Layer: Confirm your
SimplePersistenceimplementation correctly handlesget(),set(),clear(), andsubscribe()operations for the specific storage medium (e.g., local storage quota, IndexedDB permissions). - Event Errors: Check for persistence event errors in your browser's or Node.js console (e.g.,
cache.on('cache:persistence:save:error', ...)).
- Cache not evicting data:
- Cause: Eviction policies might be disabled or configured with very long durations.
- Solution:
cacheTime: EnsurecacheTimeinCacheOptionsis set to a finite, non-zero positive number (in milliseconds).Infinityor0forcacheTimedisables time-based garbage collection.maxSize: EnsuremaxSizeis set to a finite, non-zero positive number.Infinitydisables size-based LRU eviction, and0means the cache will always be empty (evicting immediately).- Garbage Collection Interval: The garbage collection runs periodically. While generally sufficient, verify that
cacheTimeisn't so large that you rarely hit the GC interval.
- Event listeners not firing:
- Cause: The listener might be removed, or the expected event is not actually occurring.
- Solution:
- Correct Event Type: Ensure you are subscribing to the exact
CacheEventTypeyou expect (e.g.,'cache:read:hit','cache:fetch:error'). enableMetrics: If you expect metric-related events or updates, ensureenableMetricsis not set tofalsein yourCacheOptions.- Unsubscribe Function: Ensure you are not accidentally calling the
unsubscribefunction returned byon()prematurely.
- Correct Event Type: Ensure you are subscribing to the exact
FAQ
Q: How does staleTime differ from cacheTime?
A: staleTime determines when a cached data entry is considered "stale." Once Date.now() - entry.lastUpdated exceeds staleTime, the data is marked stale. If waitForFresh is false (default), get() will return the stale data immediately while triggering a background refetch. cacheTime, on the other hand, determines how long an item can remain unaccessed (idle) before it's eligible for garbage collection and removal from the cache. An item can be stale but still within its cacheTime.
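To make the distinction concrete, here is a small illustrative configuration (the durations are arbitrary) with an expected timeline in the comments:

import { Cache } from '@asaidimu/utils-cache';

const faqCache = new Cache({
  staleTime: 10_000, // after 10s without an update, reads still succeed but trigger a background refetch
  cacheTime: 60_000, // after 60s without any access, the entry becomes eligible for garbage collection
});
// t = 0s  : fetch succeeds           -> entry is fresh
// t = 15s : get() returns instantly  -> stale hit, background refetch is triggered
// t = 80s : entry untouched since 15s -> GC may have evicted it, so the next get() is a miss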
Q: When should I use waitForFresh?
A: Use waitForFresh: true when your application absolutely needs the most up-to-date data before proceeding, and cannot tolerate serving stale or old data. This will block execution until the fetchFunction successfully resolves. For most UI display purposes where latency is critical and a slightly outdated display is acceptable, waitForFresh: false (the default SWR behavior) is usually preferred, as it provides an immediate response.
Q: Can I use Cache in a web worker?
A: Yes, Cache is designed to be environment-agnostic. Its persistence mechanism is pluggable, so you can implement a SimplePersistence that works within a web worker (e.g., using IndexedDB directly or communicating with the main thread via postMessage).
Q: Is Cache thread-safe (or safe with concurrent access)?
A: JavaScript is single-threaded. Cache manages its internal state with Maps and Promises. For concurrent get requests to the same key, it ensures only one fetchFunction runs via the this.fetching map, preventing redundant fetches. Therefore, it is safe for concurrent access within a single JavaScript runtime context. For multiple JavaScript runtimes (e.g., different browser tabs or Node.js processes), the persistence layer's subscribe mechanism handles synchronization.
Changelog/Roadmap
For a detailed history of changes, new features, and bug fixes, please refer to the CHANGELOG.md file in the repository root. Our future plans are outlined in the ROADMAP.md (if available).
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Inspired by modern data fetching and caching libraries like React Query and SWR.
- Uses the
uuidlibrary for generating unique cache instance IDs. - Event system powered by
@asaidimu/events.
