
cluster-shared-cache

v1.0.4

A shared in-memory cache for Node.js cluster applications

API Reference

Constructor

const cache = new ClusterSharedCache(options);

Options:

  • maxSize (number): Maximum number of items in cache (default: 1000)
  • ttl (number): Default TTL in milliseconds (default: 300000 = 5 minutes)
  • checkPeriod (number): Cleanup interval in milliseconds (default: 60000 = 1 minute)
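
As a rough sketch, the defaults above could be merged with user-supplied options like this (an illustration only, not the package's actual source; `resolveOptions` is a hypothetical helper):

```javascript
// Hypothetical sketch of how the documented defaults might be applied;
// not the package's actual implementation.
const DEFAULT_OPTIONS = {
  maxSize: 1000,      // maximum number of cached items
  ttl: 300000,        // default TTL: 5 minutes
  checkPeriod: 60000, // cleanup interval: 1 minute
};

function resolveOptions(options = {}) {
  // User-supplied keys override the defaults.
  return { ...DEFAULT_OPTIONS, ...options };
}

// Example: override only maxSize, keep the other defaults.
const resolved = resolveOptions({ maxSize: 5000 });
console.log(resolved); // { maxSize: 5000, ttl: 300000, checkPeriod: 60000 }
```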

Methods

set(key, value, ttl?)

Store a value in the cache.

await cache.set('user:123', userData, 300000); // 5 minutes TTL
await cache.set('config', configData); // Uses default TTL

get(key)

Retrieve a value from the cache.

const user = await cache.get('user:123');
if (user) {
  console.log('Cache hit!', user);
} else {
  console.log('Cache miss');
}

delete(key)

Remove a specific key from the cache.

await cache.delete('user:123');

clear()

Clear all items from the cache.

await cache.clear();

has(key)

Check if a key exists in the local cache.

if (cache.has('user:123')) {
  console.log('Key exists locally');
}

size()

Get the number of items in the local cache.

console.log('Cache size:', cache.size());

keys(), values(), entries()

Get arrays of keys, values, or entries from the local cache.

console.log('All keys:', cache.keys());
console.log('All values:', cache.values());
console.log('All entries:', cache.entries());

How It Works

  1. Master Process: Maintains the authoritative cache state
  2. Worker Processes: Keep local copies and sync with master via IPC
  3. Cache Operations:
    • SET: Updates master cache and broadcasts to all workers
    • GET: Checks local cache first, then queries master if needed
    • DELETE: Removes from master and all workers
  4. TTL Management: Both master and workers handle expiration independently
  5. Memory Management: LRU-style eviction when maxSize is reached
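
Steps 4 and 5 — TTL expiration and LRU-style eviction — can be sketched with a plain Map, which preserves insertion order (this illustrates the general technique, not the package's internals):

```javascript
// Sketch of TTL expiration plus LRU-style eviction on a plain Map.
// A Map iterates in insertion order, so if we re-insert a key on every
// access, the first key is always the least recently used.
const MAX_SIZE = 3;
const store = new Map(); // key -> { value, expiresAt }

function set(key, value, ttl) {
  store.delete(key); // re-insert so the key moves to the "most recent" end
  if (store.size >= MAX_SIZE) {
    // Evict the least recently used entry (first in insertion order).
    store.delete(store.keys().next().value);
  }
  store.set(key, { value, expiresAt: Date.now() + ttl });
}

function get(key) {
  const entry = store.get(key);
  if (!entry) return undefined;
  if (Date.now() > entry.expiresAt) {
    store.delete(key); // lazily expire on read
    return undefined;
  }
  // Refresh recency: move the key to the "most recent" end.
  store.delete(key);
  store.set(key, entry);
  return entry.value;
}
```

With MAX_SIZE of 3, setting a, b, c, then reading a, then setting d evicts b (the least recently used) while a survives.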

Performance Considerations

  • Local Cache First: GET operations check the local worker cache first for maximum speed
  • Async Operations: All cache operations are asynchronous to prevent blocking
  • Efficient IPC: Minimal message passing between processes
  • Memory Limits: Configurable max size prevents memory leaks
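
The local-first read path described above can be sketched roughly as follows (the two Maps stand in for the worker's local copy and the master's authoritative state; in the real package the fallback would be an IPC round-trip, not a function call):

```javascript
// Illustrative sketch of a local-first GET: check the worker's local
// copy before falling back to the master. The Maps are stand-ins;
// the real package syncs over IPC instead.
const localCache = new Map();  // per-worker copy
const masterCache = new Map(); // authoritative state (would live in the master)

async function queryMaster(key) {
  // In the real package this would be an IPC message to the master process.
  return masterCache.has(key) ? masterCache.get(key) : undefined;
}

async function get(key) {
  if (localCache.has(key)) {
    return localCache.get(key); // fast path: no IPC needed
  }
  const value = await queryMaster(key); // slow path: ask the master
  if (value !== undefined) {
    localCache.set(key, value); // populate the local copy for next time
  }
  return value;
}
```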

Best Practices

  1. Initialize Early: Set up the cache before forking workers
  2. Reasonable TTL: Avoid very short TTL values, which cause constant cache misses and refetches
  3. Monitor Memory: Keep track of cache size in production
  4. Error Handling: Always handle cache misses gracefully
  5. Cleanup: Call cache.destroy() when shutting down
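
Practice 4 can be sketched as a small wrapper that falls back to the real data source on a miss (`getOrLoad` is an illustrative helper, not part of this package's API, and a plain Map stands in for any cache with get/set):

```javascript
// Illustrative miss-handling wrapper; the Map stands in for the cache.
const cache = new Map();

async function getOrLoad(key, loader) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached; // cache hit: skip the expensive load
  }
  // Cache miss: fall back to the authoritative source, then populate.
  const value = await loader(key);
  cache.set(key, value);
  return value;
}

// Usage: the loader runs only on a miss.
async function fetchUser(id) {
  return { id, name: 'example' }; // stand-in for a DB or API call
}

getOrLoad('user:123', () => fetchUser(123)).then((user) => {
  console.log(user.name);
});
```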

Troubleshooting

Cache Not Syncing

Ensure the cache is initialized in the master process before forking workers.

High Memory Usage

Reduce maxSize or lower TTL values to limit memory consumption.

Slow Performance

Check whether the TTL is too aggressive (forcing frequent reloads) or the cache is being overused.