cache-sync v1.4.2
cache-sync core package
Cache-Sync
A powerful, type-safe caching library for TypeScript/JavaScript applications that provides intelligent cache management with automatic refresh, layered caching, and flexible storage backends.
Features
- 🚀 Multiple Storage Backends: In-memory LRU cache and custom store support
- 🔄 Intelligent Refresh: Automatic background cache refresh based on TTL thresholds
- 📚 Layered Caching: Multi-tier cache architecture for optimal performance
- 🛡️ Type Safety: Full TypeScript support with generic type inference
- 🎯 Promise Coalescing: Prevents duplicate concurrent requests for the same key
- 📊 Event-Driven: Listen to cache events and errors for monitoring
- ⚡ Performance Optimized: Built-in LRU eviction and configurable TTL
- 🔧 Flexible Configuration: Customizable caching conditions and clone behavior
Installation
npm install cache-sync
# or
yarn add cache-sync
# or
pnpm add cache-sync
Quick Start
Basic Usage
import { cacheProviderFactory } from 'cache-sync';
// Create an in-memory cache
const cache = await cacheProviderFactory('inMemory', {
maxSize: 1000,
ttl: 60000, // 1 minute TTL
});
// Cache a value
await cache.set('user:123', { id: 123, name: 'John Doe' });
// Retrieve a value
const user = await cache.get<User>('user:123');
// Cache with automatic refresh
const userData = await cache.cacheWithRefresh(
'user:456',
async () => {
// This function will be called to fetch fresh data
return await fetchUserFromAPI(456);
},
{
ttl: 300000, // 5 minutes
refreshThreshold: 60000, // Refresh when 1 minute remaining
deferAsync: false, // Wait for cache set to complete (default)
}
);
Layered Cache
import { cacheProviderFactory, useLayeredCache } from 'cache-sync';
// Create multiple cache layers
const l1Cache = await cacheProviderFactory('inMemory', { maxSize: 100 });
const l2Cache = await cacheProviderFactory('inMemory', { maxSize: 1000 });
// Combine them into a layered cache
const layeredCache = useLayeredCache([l1Cache, l2Cache]);
// The layered cache will check L1 first, then L2, then fetch from source
const result = await layeredCache.cacheWithRefresh(
'expensive-operation',
async () => await performExpensiveOperation(),
{
deferAsync: true, // Set cache asynchronously for better performance
}
);
API Reference
Cache Provider Factory
cacheProviderFactory(provider, options?)
Creates a cache instance with the specified provider and options.
Parameters:
- provider: 'inMemory' | Custom store function | Store instance
- options: Configuration options for the cache provider
Cache Methods
set(key, value, ttl?)
Store a value in the cache with an optional TTL.
get<T>(key)
Retrieve a value from the cache with type safety.
delete(key)
Remove a specific key from the cache.
clear()
Clear all entries from the cache.
cacheWithRefresh<T>(key, fetchFn, options?)
Intelligent caching with automatic background refresh.
Parameters:
- key: Cache key
- fetchFn: Function to fetch fresh data
- options: Refresh configuration
  - ttl: Time to live for the cached value
  - refreshThreshold: When to trigger background refresh (milliseconds before expiry)
  - deferAsync: If true, cache set operations execute asynchronously (fire-and-forget); if false (default), they complete before returning
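The interaction between ttl and refreshThreshold can be sketched as a small helper (illustrative only, not part of the library's API): a background refresh fires once an entry's remaining lifetime drops to the threshold or below, while a fully expired entry falls through to a normal fetch.

```typescript
// Illustrative helper (not the library's API): decide whether a cached
// entry is close enough to expiry to warrant a background refresh.
function shouldRefresh(
  cachedAtMs: number,
  nowMs: number,
  ttlMs: number,
  refreshThresholdMs: number,
): boolean {
  const remaining = cachedAtMs + ttlMs - nowMs;
  // Still alive, but inside the refresh window
  return remaining > 0 && remaining <= refreshThresholdMs;
}

// With ttl = 300000 and refreshThreshold = 60000 (the Quick Start values):
shouldRefresh(0, 100_000, 300_000, 60_000); // 200s remaining → no refresh yet
shouldRefresh(0, 250_000, 300_000, 60_000); // 50s remaining → background refresh
shouldRefresh(0, 400_000, 300_000, 60_000); // expired → normal fetch, not a background refresh
```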
Configuration Options
In-Memory Cache Options
interface InMemoryCacheOptions {
maxSize?: number; // Maximum number of entries (default: 1000)
ttl?: number; // Default TTL in milliseconds
refreshThreshold?: number; // Background refresh threshold
cloneOnWrite?: boolean; // Clone objects when caching (default: true)
cacheConditionCheck?: (value: unknown) => boolean; // Custom cache condition
}
Advanced Usage
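The cacheConditionCheck option above lets you skip caching unwanted values, such as empty lookup results. Here is a self-contained sketch of the gating technique (illustrative, not the library's internals — the store and helper names are hypothetical):

```typescript
// Illustrative sketch: how a cacheConditionCheck predicate can gate
// writes — only values that pass the check are stored.
type ConditionCheck = (value: unknown) => boolean;

function maybeCache<T>(
  store: Map<string, T>,
  key: string,
  value: T,
  check: ConditionCheck = () => true,
): boolean {
  if (!check(value)) return false; // rejected values are never cached
  store.set(key, value);
  return true;
}

const store = new Map<string, unknown>();
// Skip caching empty arrays, e.g. lookups that returned no rows
const nonEmpty: ConditionCheck = (v) => !Array.isArray(v) || v.length > 0;
maybeCache(store, 'users', [], nonEmpty);      // not cached
maybeCache(store, 'admins', [1, 2], nonEmpty); // cached
```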
Custom Cache Store
import { GenericCacheStore } from 'cache-sync';
class RedisStore implements GenericCacheStore {
// Implement all required methods
async get<T>(key: string): Promise<T | undefined> {
// Redis implementation
}
async set<T>(key: string, value: T, ttl?: number): Promise<void> {
// Redis implementation
}
// ... other methods
}
const cache = await cacheProviderFactory(() => new RedisStore(), {
refreshThreshold: 30000,
});
Error Handling
import { CacheSyncEvents } from 'cache-sync';
cache.onError(CacheSyncEvents.ON_EVENT_ERROR, (error) => {
console.error('Cache error:', error);
});
cache.onError(CacheSyncEvents.ON_REFRESH_ERROR, (error) => {
console.error('Background refresh failed:', error);
});
Performance Monitoring
// Monitor cache performance
const startTime = Date.now();
const result = await cache.cacheWithRefresh('key', fetchFunction);
const duration = Date.now() - startTime;
console.log(`Cache operation took ${duration}ms`);
Deferred Cache Operations
The deferAsync option allows you to control whether cache set operations should block or execute asynchronously:
// Synchronous mode (default) - waits for cache set to complete
const userData = await cache.cacheWithRefresh(
'critical-data',
async () => await fetchCriticalData(),
{
deferAsync: false, // Ensures data is cached before returning
}
);
// Asynchronous mode - fire-and-forget for better performance
const metrics = await cache.cacheWithRefresh(
'analytics-metrics',
async () => await fetchMetrics(),
{
deferAsync: true, // Returns immediately, cache set happens in background
}
);
Use Cases:
- deferAsync: false (default): Critical data where you need to ensure it's cached before proceeding
- deferAsync: true: Non-critical data where lower latency is preferred over immediate cache consistency
Best Practices
- Set appropriate TTL values: Balance between data freshness and performance
- Use refresh thresholds: Enable background refresh for frequently accessed data
- Monitor cache hit rates: Track cache effectiveness for optimization
- Handle errors gracefully: Implement proper error handling for cache operations
- Size your cache appropriately: Set maxSize based on your memory constraints
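When maxSize is reached, the oldest untouched entries are evicted first. A minimal sketch of the LRU technique (illustrative only, not the library's implementation) shows the trade-off: a small maxSize bounds memory but evicts sooner.

```typescript
// Minimal LRU sketch (not library internals). A Map preserves insertion
// order, so re-inserting a key on access keeps hot keys at the tail;
// the first key is always the least recently used.
class TinyLRU<V> {
  private entries = new Map<string, V>();
  constructor(private maxSize: number) {}

  get(key: string): V | undefined {
    const value = this.entries.get(key);
    if (value !== undefined) {
      this.entries.delete(key); // re-insert to mark as recently used
      this.entries.set(key, value);
    }
    return value;
  }

  set(key: string, value: V): void {
    this.entries.delete(key);
    this.entries.set(key, value);
    if (this.entries.size > this.maxSize) {
      // Evict the least recently used entry (first in insertion order)
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }
}

const lru = new TinyLRU<number>(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');    // touch 'a', so 'b' is now least recently used
lru.set('c', 3); // exceeds maxSize: evicts 'b'
```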
Performance Considerations
- LRU Eviction: Automatically removes least recently used items when cache is full
- Promise Coalescing: Prevents duplicate requests for the same key
- Background Refresh: Updates cache without blocking read operations
- Memory Management: Optional object cloning (cloneOnWrite) to prevent callers from mutating shared cached objects
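The promise-coalescing behavior listed above can be illustrated with a self-contained sketch of the technique itself (not the library's internals): concurrent callers for the same key share a single in-flight promise, so the underlying fetch runs only once.

```typescript
// Illustrative sketch of promise coalescing. While a fetch for a key is
// in flight, later callers receive the same promise instead of starting
// a duplicate fetch; the entry is cleared once the promise settles.
const inflight = new Map<string, Promise<unknown>>();

function coalesce<T>(key: string, fetchFn: () => Promise<T>): Promise<T> {
  const existing = inflight.get(key);
  if (existing) return existing as Promise<T>; // reuse the in-flight promise
  const p = fetchFn().finally(() => inflight.delete(key)); // clear on settle
  inflight.set(key, p);
  return p;
}

// Two concurrent callers for the same key share one fetch:
let calls = 0;
const fetchOnce = () => {
  calls += 1;
  return Promise.resolve('data');
};
const p1 = coalesce('key', fetchOnce);
const p2 = coalesce('key', fetchOnce);
```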
Requirements
- Node.js >= 20.17.0
- TypeScript >= 4.5 (for type safety)
License
MIT © Bruno Moraes
Contributing
Contributions are welcome! Please read our contributing guidelines and submit pull requests for any improvements.
Support
If you encounter any issues or have questions, please file an issue on our GitHub repository.
