
cache-sync

v1.4.2

cache-sync core package

Cache-Sync

A powerful, type-safe caching library for TypeScript/JavaScript applications that provides intelligent cache management with automatic refresh, layered caching, and flexible storage backends.

Features

  • 🚀 Multiple Storage Backends: In-memory LRU cache and custom store support
  • 🔄 Intelligent Refresh: Automatic background cache refresh based on TTL thresholds
  • 📚 Layered Caching: Multi-tier cache architecture for optimal performance
  • 🛡️ Type Safety: Full TypeScript support with generic type inference
  • 🎯 Promise Coalescing: Prevents duplicate concurrent requests for the same key
  • 📊 Event-Driven: Listen to cache events and errors for monitoring
  • ⚡ Performance Optimized: Built-in LRU eviction and configurable TTL
  • 🔧 Flexible Configuration: Customizable caching conditions and clone behavior

Installation

npm install cache-sync
# or
yarn add cache-sync
# or
pnpm add cache-sync

Quick Start

Basic Usage

import { cacheProviderFactory } from 'cache-sync';

// Create an in-memory cache
const cache = await cacheProviderFactory('inMemory', {
  maxSize: 1000,
  ttl: 60000, // 1 minute TTL
});

// Cache a value
await cache.set('user:123', { id: 123, name: 'John Doe' });

// Retrieve a value
const user = await cache.get<User>('user:123');

// Cache with automatic refresh
const userData = await cache.cacheWithRefresh(
  'user:456',
  async () => {
    // This function will be called to fetch fresh data
    return await fetchUserFromAPI(456);
  },
  {
    ttl: 300000, // 5 minutes
    refreshThreshold: 60000, // Refresh when 1 minute remaining
    deferAsync: false, // Wait for cache set to complete (default)
  }
);

Layered Cache

import { cacheProviderFactory, useLayeredCache } from 'cache-sync';

// Create multiple cache layers
const l1Cache = await cacheProviderFactory('inMemory', { maxSize: 100 });
const l2Cache = await cacheProviderFactory('inMemory', { maxSize: 1000 });

// Combine them into a layered cache
const layeredCache = useLayeredCache([l1Cache, l2Cache]);

// The layered cache will check L1 first, then L2, then fetch from source
const result = await layeredCache.cacheWithRefresh(
  'expensive-operation',
  async () => await performExpensiveOperation(),
  {
    deferAsync: true, // Set cache asynchronously for better performance
  }
);
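The read path described in the comment above can be sketched as follows. This is an assumption about the layered behavior, using plain `Map`s as stand-ins for the cache layers, and `layeredGet` is an illustrative name rather than the library's API: check each layer in order; on a hit, backfill the faster layers that missed; on a full miss, call the source and write through every layer.

```typescript
// Sketch of a layered read-through lookup (assumed semantics).
type Layer = Map<string, unknown>;

async function layeredGet<T>(
  layers: Layer[],
  key: string,
  fetchFn: () => Promise<T>,
): Promise<T> {
  for (let i = 0; i < layers.length; i++) {
    if (layers[i].has(key)) {
      const value = layers[i].get(key) as T;
      // Backfill the faster layers that missed this key
      for (let j = 0; j < i; j++) layers[j].set(key, value);
      return value;
    }
  }
  const fresh = await fetchFn(); // every layer missed: hit the source
  for (const layer of layers) layer.set(key, fresh);
  return fresh;
}
```

The backfill step is what makes repeated reads progressively cheaper: after one L2 hit, the key is promoted into L1.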

API Reference

Cache Provider Factory

cacheProviderFactory(provider, options?)

Creates a cache instance with the specified provider and options.

Parameters:

  • provider: 'inMemory' | Custom store function | Store instance
  • options: Configuration options for the cache provider

Cache Methods

set(key, value, ttl?)

Store a value in the cache with an optional TTL.

get<T>(key)

Retrieve a value from the cache with type safety.

delete(key)

Remove a specific key from the cache.

clear()

Clear all entries from the cache.

cacheWithRefresh<T>(key, fetchFn, options?)

Intelligent caching with automatic background refresh.

Parameters:

  • key: Cache key
  • fetchFn: Function to fetch fresh data
  • options: Refresh configuration
    • ttl: Time to live for the cached value
    • refreshThreshold: When to trigger background refresh (milliseconds before expiry)
    • deferAsync: If true, cache set operations execute asynchronously (fire-and-forget); if false (default), they complete before returning
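The interaction between ttl and refreshThreshold can be reduced to a small timing check. This is a sketch of the assumed semantics (refresh in the background once the remaining lifetime drops below the threshold); `shouldRefresh` is an illustrative helper, not a library export:

```typescript
// Assumed refresh decision: a value cached at `cachedAt` with lifetime
// `ttl` is refreshed in the background once its remaining lifetime is
// positive but at or below `refreshThreshold`.
function shouldRefresh(
  cachedAt: number,         // ms timestamp when the value was cached
  now: number,              // current ms timestamp
  ttl: number,              // total lifetime in ms
  refreshThreshold: number, // refresh when this much lifetime remains
): boolean {
  const remaining = cachedAt + ttl - now;
  return remaining > 0 && remaining <= refreshThreshold;
}
```

With ttl: 300000 and refreshThreshold: 60000, a value cached at t = 0 is served as-is until t = 240000, refreshed in the background between t = 240000 and t = 300000, and refetched synchronously after expiry.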

Configuration Options

In-Memory Cache Options

interface InMemoryCacheOptions {
  maxSize?: number; // Maximum number of entries (default: 1000)
  ttl?: number; // Default TTL in milliseconds
  refreshThreshold?: number; // Background refresh threshold
  cloneOnWrite?: boolean; // Clone objects when caching (default: true)
  cacheConditionCheck?: (value: unknown) => boolean; // Custom cache condition
}

Advanced Usage

Custom Cache Store

import { GenericCacheStore } from 'cache-sync';

class RedisStore implements GenericCacheStore {
  // Implement all required methods
  async get<T>(key: string): Promise<T | undefined> {
    // Redis implementation
  }

  async set<T>(key: string, value: T, ttl?: number): Promise<void> {
    // Redis implementation
  }

  // ... other methods
}

const cache = await cacheProviderFactory(() => new RedisStore(), {
  refreshThreshold: 30000,
});

Error Handling

import { CacheSyncEvents } from 'cache-sync';

cache.onError(CacheSyncEvents.ON_EVENT_ERROR, (error) => {
  console.error('Cache error:', error);
});

cache.onError(CacheSyncEvents.ON_REFRESH_ERROR, (error) => {
  console.error('Background refresh failed:', error);
});

Performance Monitoring

// Monitor cache performance
const startTime = Date.now();
const result = await cache.cacheWithRefresh('key', fetchFunction);
const duration = Date.now() - startTime;

console.log(`Cache operation took ${duration}ms`);

Deferred Cache Operations

The deferAsync option allows you to control whether cache set operations should block or execute asynchronously:

// Synchronous mode (default) - waits for cache set to complete
const userData = await cache.cacheWithRefresh(
  'critical-data',
  async () => await fetchCriticalData(),
  {
    deferAsync: false, // Ensures data is cached before returning
  }
);

// Asynchronous mode - fire-and-forget for better performance
const metrics = await cache.cacheWithRefresh(
  'analytics-metrics',
  async () => await fetchMetrics(),
  {
    deferAsync: true, // Returns immediately, cache set happens in background
  }
);

Use Cases:

  • deferAsync: false (default): Critical data where you need to ensure it's cached before proceeding
  • deferAsync: true: Non-critical data where lower latency is preferred over immediate cache consistency
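The trade-off between the two modes comes down to whether the set promise is awaited before returning. A hypothetical sketch (the `storeAndReturn` name and signature are illustrative, not the library's internals):

```typescript
// Assumed deferAsync behavior: synchronous mode awaits the cache write;
// deferred mode fires it and returns the value immediately.
async function storeAndReturn<T>(
  value: T,
  cacheSet: (v: T) => Promise<void>,
  deferAsync: boolean,
): Promise<T> {
  if (deferAsync) {
    // Fire-and-forget: swallow set failures, the caller already has the value
    void cacheSet(value).catch(() => {});
  } else {
    await cacheSet(value); // block until the value is durably cached
  }
  return value;
}
```

The catch handler in the deferred branch matters: an unhandled rejection from a background set would otherwise crash a Node.js process.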

Best Practices

  1. Set appropriate TTL values: Balance between data freshness and performance
  2. Use refresh thresholds: Enable background refresh for frequently accessed data
  3. Monitor cache hit rates: Track cache effectiveness for optimization
  4. Handle errors gracefully: Implement proper error handling for cache operations
  5. Size your cache appropriately: Set maxSize based on your memory constraints

Performance Considerations

  • LRU Eviction: Automatically removes least recently used items when cache is full
  • Promise Coalescing: Prevents duplicate requests for the same key
  • Background Refresh: Updates cache without blocking read operations
  • Memory Management: Optional object cloning (cloneOnWrite) keeps cached values isolated from caller-side mutations
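The LRU eviction mentioned above can be sketched in a few lines using the insertion-order guarantee of JavaScript's Map (this LruSketch class is a toy illustration of the policy, not the library's implementation): a get re-inserts the key to mark it most recent, and a set that overflows maxSize evicts the first (least recently used) key.

```typescript
// Toy LRU cache: Map iterates keys in insertion order, so re-inserting on
// access keeps the least recently used key at the front for eviction.
class LruSketch<T> {
  private entries = new Map<string, T>();
  constructor(private maxSize: number) {}

  get(key: string): T | undefined {
    if (!this.entries.has(key)) return undefined;
    const value = this.entries.get(key)!;
    this.entries.delete(key); // re-insert to refresh recency
    this.entries.set(key, value);
    return value;
  }

  set(key: string, value: T): void {
    this.entries.delete(key); // updating a key also refreshes its recency
    this.entries.set(key, value);
    if (this.entries.size > this.maxSize) {
      // First key in iteration order is the least recently used
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }

  has(key: string): boolean {
    return this.entries.has(key);
  }
}
```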

Requirements

  • Node.js >= 20.17.0
  • TypeScript >= 4.5 (for type safety)

License

MIT © Bruno Moraes

Contributing

Contributions are welcome! Please read our contributing guidelines and submit pull requests for any improvements.

Support

If you encounter any issues or have questions, please file an issue on our GitHub repository.