
@bernierllc/cache-manager (v1.0.4)

Multi-tier caching with TTL support, cache invalidation, and multiple storage backends.

Installation

npm install @bernierllc/cache-manager

For Redis support:

npm install @bernierllc/cache-manager redis

Quick Start

import { CacheManager } from '@bernierllc/cache-manager';

// Create cache with default memory backend
const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000,
  defaultTtl: 60 * 1000 // 1 minute
});

// Set cache value
await cache.set('user:123', { name: 'John', email: '[email protected]' });

// Get cache value
const user = await cache.get('user:123');
console.log(user); // { name: 'John', email: '[email protected]' }

// Get or set pattern
const userData = await cache.getOrSet('user:456', async () => {
  return await fetchUserFromDatabase('456');
}, 5 * 60 * 1000); // Cache for 5 minutes

Core Features

  • Multiple Backends: Memory, Redis, database, multi-tier
  • Cache Strategies: LRU, LFU, TTL-based eviction policies
  • Intelligent Invalidation: Tag-based, pattern-based cache invalidation
  • Serialization: JSON, binary, custom serializers
  • Compression: Optional compression for large values
  • Stats & Monitoring: Hit rates, memory usage, performance metrics
  • Distributed Support: Redis-backed distributed caching

API Reference

CacheManager

Constructor

new CacheManager(options: CacheOptions)

Options:

  • backend?: CacheBackend | CacheBackend[] - Storage backend(s)
  • strategy?: 'lru' | 'lfu' | 'ttl' - Eviction strategy (default: 'lru')
  • maxSize?: number - Maximum cache size (default: 1000)
  • defaultTtl?: number - Default TTL in milliseconds
  • keyPrefix?: string - Prefix for all cache keys
  • onEviction?: (key, value) => void - Eviction callback
  • onExpiration?: (key, value) => void - Expiration callback

Core Methods

// Basic operations
set(key: string, value: any, ttl?: number, tags?: string[]): Promise<void>
get<T>(key: string): Promise<T | null>
delete(key: string): Promise<boolean>
clear(): Promise<void>
has(key: string): Promise<boolean>

// Batch operations
mget<T>(keys: string[]): Promise<(T | null)[]>
mset<T>(entries: Record<string, T>, ttl?: number): Promise<void>
getOrSet<T>(key: string, factory: () => Promise<T>, ttl?: number): Promise<T>

// Invalidation
invalidateByTag(tag: string): Promise<void>
invalidateByPattern(pattern: string): Promise<void>

// Utilities
keys(pattern?: string): Promise<string[]>
size(): Promise<number>
getStats(): Promise<CacheStats>
cleanup(): Promise<number>

Usage Examples

Basic Memory Caching

import { CacheManager, MemoryCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new MemoryCacheBackend({
    maxSize: 1000,
    strategy: 'lru'
  }),
  defaultTtl: 60 * 60 * 1000 // 1 hour
});

// Cache user data
await cache.set('user:123', { 
  name: 'John Doe', 
  email: '[email protected]' 
});

const user = await cache.get('user:123');
console.log(user?.name); // 'John Doe' (get() may return null, hence the optional chaining)

Redis-backed Distributed Cache

import { CacheManager, RedisCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new RedisCacheBackend({
    host: 'localhost',
    port: 6379,
    keyPrefix: 'myapp:'
  }),
  defaultTtl: 15 * 60 * 1000 // 15 minutes
});

// Works across multiple application instances
await cache.set('global:config', configObject);

Multi-tier Caching

import { 
  CacheManager, 
  MemoryCacheBackend, 
  RedisCacheBackend 
} from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: [
    new MemoryCacheBackend({ maxSize: 100 }),    // L1 cache - fast, small
    new RedisCacheBackend({ host: 'localhost' })  // L2 cache - shared, persistent
  ]
});

// Automatically checks L1, then L2, updates L1 on L2 hits
const data = await cache.get('expensive:computation');

Tag-based Invalidation

const cache = new CacheManager({
  backend: new RedisCacheBackend({ host: 'localhost' })
});

// Cache with tags
await cache.set('post:1', postData, 60 * 60 * 1000, ['user:123', 'category:tech']);
await cache.set('post:2', postData2, 60 * 60 * 1000, ['user:123', 'category:news']);

// Invalidate all posts by user
await cache.invalidateByTag('user:123');

// Invalidate all tech posts  
await cache.invalidateByTag('category:tech');

Pattern-based Invalidation

// Cache user-specific data
await cache.set('user:123:profile', profileData);
await cache.set('user:123:settings', settingsData);
await cache.set('user:456:profile', otherProfileData);

// Invalidate all data for user 123
await cache.invalidateByPattern('user:123:*');

Cache Warming and Preloading

// Warm cache with frequently accessed data
async function warmCache() {
  const popularUsers = await getPopularUsers();
  
  for (const user of popularUsers) {
    await cache.set(`user:${user.id}`, user, 60 * 60 * 1000);
  }
}

// Background cache refresh
setInterval(async () => {
  const keys = await cache.keys('user:*');
  
  for (const key of keys) {
    const userId = key.split(':')[1];
    const freshData = await fetchUserFromDatabase(userId);
    await cache.set(key, freshData, 60 * 60 * 1000);
  }
}, 30 * 60 * 1000); // Refresh every 30 minutes

Performance Monitoring

// Monitor cache performance
setInterval(async () => {
  const stats = await cache.getStats();
  
  console.log(`Cache Performance:
    Hit Rate: ${(stats.hitRate * 100).toFixed(1)}%
    Size: ${stats.size} entries
    Memory: ${Math.round(stats.memoryUsage / 1024)}KB
    Evictions: ${stats.evictions}
  `);
}, 60000);

// Automatic cleanup
cache.startCleanupInterval(5 * 60 * 1000); // Clean every 5 minutes

Configuration

Cache Backends

MemoryCacheBackend

new MemoryCacheBackend({
  maxSize: 1000,              // Maximum number of entries
  strategy: 'lru',            // Eviction strategy
  onEviction: (key, value) => console.log('Evicted:', key),
  onExpiration: (key, value) => console.log('Expired:', key)
})

RedisCacheBackend

new RedisCacheBackend({
  host: 'localhost',
  port: 6379,
  password: 'secret',         // Optional
  db: 0,                      // Database number
  keyPrefix: 'cache:',        // Key prefix
  connectionString: 'redis://localhost:6379' // Alternative to host/port
})

MultiTierCacheBackend

new MultiTierCacheBackend({
  backends: [
    new MemoryCacheBackend({ maxSize: 100 }),
    new RedisCacheBackend({ host: 'localhost' })
  ]
})

Cache Strategies

  • LRU (Least Recently Used): Evicts the least recently accessed entries
  • LFU (Least Frequently Used): Evicts the least frequently accessed entries
  • TTL (Time To Live): Evicts entries based on expiration time

Serializers

import { CacheManager, JSONSerializer, BinarySerializer } from '@bernierllc/cache-manager';

// JSON serialization (default)
const jsonCache = new CacheManager({
  serializer: new JSONSerializer()
});

// Binary serialization for better performance
const binaryCache = new CacheManager({
  serializer: new BinarySerializer()
});

Error Handling

try {
  await cache.set('key', 'value');
  const value = await cache.get('key');
} catch (error) {
  console.error('Cache operation failed:', error);
  // Fallback to original data source
}
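The "fallback to original data source" comment can be made concrete with a small wrapper. This sketch is generic over anything matching the documented `get`/`set` shapes, and `loadFromSource` stands in for a hypothetical loader:

```typescript
// Minimal cache shape matching the documented get/set signatures.
interface CacheLike<T> {
  get(key: string): Promise<T | null>;
  set(key: string, value: T): Promise<void>;
}

// Treat the cache as best-effort: a failed read falls through to the source,
// and a failed write never breaks the request path.
async function getWithFallback<T>(
  cache: CacheLike<T>,
  loadFromSource: (key: string) => Promise<T>,
  key: string
): Promise<T> {
  try {
    const cached = await cache.get(key);
    if (cached !== null) return cached;
  } catch (error) {
    console.error('Cache read failed, falling back to source:', error);
  }
  const fresh = await loadFromSource(key);
  try {
    await cache.set(key, fresh);
  } catch {
    // Ignore write failures; the caller still gets fresh data.
  }
  return fresh;
}
```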

Best Practices

  1. Choose appropriate TTL: Set TTL based on data freshness requirements
  2. Use appropriate cache size: Balance memory usage with hit rates
  3. Implement cache warming: Preload frequently accessed data
  4. Monitor performance: Track hit rates and adjust configuration
  5. Handle failures gracefully: Always have fallback mechanisms
  6. Use tags wisely: Group related cache entries for efficient invalidation

Performance

The cache manager is designed for high-performance scenarios:

  • Memory backend: >50,000 operations/second
  • Redis backend: >10,000 operations/second
  • Multi-tier: Combines speed of memory with persistence of Redis
  • Efficient serialization: Minimal overhead for data conversion
  • Smart eviction: Algorithms optimized for real-world usage patterns

Integration Documentation

Logger Integration

The cache manager supports optional logger integration using @bernierllc/logger:

import { CacheManager } from '@bernierllc/cache-manager';
import { detectLogger } from '@bernierllc/logger';

const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000
});

// Auto-detect logger if available
const logger = await detectLogger();
if (logger) {
  // Enhanced logging for cache operations
  cache.on('hit', (key) => {
    logger.debug('Cache hit', { key });
  });
  
  cache.on('miss', (key) => {
    logger.debug('Cache miss', { key });
  });
  
  cache.on('evicted', (key, reason) => {
    logger.info('Cache eviction', { key, reason });
  });
}

NeverHub Integration

The cache manager integrates with NeverHub when available for enhanced service discovery and monitoring:

import { CacheManager } from '@bernierllc/cache-manager';
import { detectNeverHub } from '@bernierllc/neverhub-adapter';

async function initializeCacheManager() {
  const cache = new CacheManager({
    strategy: 'lru',
    maxSize: 1000
  });

  // Auto-detect NeverHub
  const neverhub = await detectNeverHub();
  if (neverhub) {
    // Register cache manager as a service
    await neverhub.register({
      type: 'cache-manager',
      name: '@bernierllc/cache-manager',
      version: '1.0.0',
      capabilities: [
        { type: 'cache', name: 'memory', version: '1.0.0' },
        { type: 'cache', name: 'redis', version: '1.0.0' }
      ]
    });

    // Publish cache events
    cache.on('hit', async (key) => {
      await neverhub.publishEvent({
        type: 'cache.hit',
        data: { key, timestamp: Date.now() }
      });
    });

    // Subscribe to cache invalidation events
    await neverhub.subscribe('cache.invalidate', async (event) => {
      if (event.data.pattern) {
        await cache.invalidateByPattern(event.data.pattern);
      } else if (event.data.key) {
        await cache.delete(event.data.key);
      }
    });
  }

  return cache;
}

Graceful Degradation

The cache manager implements graceful degradation patterns:

  • Works without external services: Core functionality operates independently
  • Logger integration: Enhanced monitoring when logger service is available
  • NeverHub integration: Service discovery and events when NeverHub is present
  • Backend flexibility: Falls back to memory storage if Redis is unavailable
  • Error resilience: Cache failures don't break application functionality

Dependencies

  • Required: None (core functionality)
  • Optional: redis for Redis backend support
  • Optional: lz4 for compression support
  • Optional: @bernierllc/logger for enhanced logging capabilities
  • Optional: @bernierllc/neverhub-adapter for service discovery integration
  • Internal: @bernierllc/connection-parser for connection string parsing

License

Copyright (c) 2025 Bernier LLC. All rights reserved.