
crisp-cache

v1.5.4

A crispy fresh cache that will use updated data where it can, but can use a stale entry if need be - useful for high throughput applications that want to avoid cache-slams and blocking.

crisp-cache is now v1.x, tested, and stable.


This cache is for high-throughput applications where cache data may become stale before being invalidated. It adds a state to each cache entry: Valid, Stale, or Expired. This allows the program to ask for a value before the data is evicted from the cache. If the data is stale, the cache will return the stale data, but asynchronously re-fetch the data to ensure it stays available. A locking mechanism is also provided so that when the cache misses, data will only be retrieved once.

This project sponsored in part by:

AerisWeather - Empowering the next generation, aerisweather.com

Example

var CrispCache = require('crisp-cache');
var data = {
    hello: "world",
    foo: "bar",
    arr: [1, 2, 3],
    hash: {key: "value", nested: [4, 5, 6]}
};
function fetcher(key, callback) {
    return callback(null, data[key]);
}

var crispCacheBasic = new CrispCache({
    fetcher: fetcher,
    defaultStaleTtl: 300,
    defaultExpiresTtl: 500,
    staleCheckInterval: 100
});
crispCacheBasic.set('new', 'A new value, not from fetcher', function (err, success) {
    if (success) {
        console.log("Set 'new' to our provided string.");
    }
});

crispCacheBasic.get('foo', function (err, value) {
    if (!err) {
        console.log("Got 'foo', is: " + value);
    }
});
//Wait any amount of time

crispCacheBasic.get('foo', {skipFetch: true}, function (err, value) {
    //We won't have to re-fetch when we call `get`, since the cache is keeping it up to date for us.
    if (!err) {
        console.log("Got 'foo', is: " + value);
    }
});

Usage

new CrispCache({options})

Crisp Cache is instantiated because it holds config for many of its methods.

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| fetcher | (callable)* | null | A method to call when we need to update a cache entry; should have signature: function(key, callback(err, value, options))[1] |
| defaultStaleTtl | (integer, ms) | 300000 | How long the cache entry is valid before becoming stale. |
| staleTtlVariance | (integer, ms) | 0 | How many ms to vary the staleTtl (+/-, to prevent cache slams) |
| staleCheckInterval | (integer, ms) | 0 | If >0, how often to check for stale keys and re-fetch |
| defaultExpiresTtl | (integer, ms) | 0 | If >0, cache entries that are older than this time will be deleted |
| expiresTtlVariance | (integer, ms) | 0 | How many ms to vary the expiresTtl (+/-, to prevent cache slams) |
| evictCheckInterval | (integer, ms) | 0 | If >0, will check for expired cache entries and delete them from the cache |
| ttlVariance | (integer, ms) | 0 | (Alias for the other variance options) How many ms to vary the staleTtl and expiresTtl (+/-, to prevent cache slams) |
| maxSize | (integer) | null | Adds a max size for the cache; when entries are added, a size is needed. When the cache gets too big, LRU purging occurs.[2] |
| emitEvents | (boolean) | true | Enable event emission; see the 'Events' section |
| events | (Object) | {} | A list of callbacks for events, keyed by the event name. Ex. { fetch: function(fetchInfo) { console.log(fetchInfo.key); } } will log each key that is fetched from the original data source. |

Notes:

[1] The fetcher callback's options are the same as for set() below. This allows individual keys to have different settings.

[2] maxSize is most effective when combined with the size option when individual keys are set. See the below methods for more information.

get(key, [options], callback)

This will try to get key (a string) from the cache. By default, if the key doesn't exist, the cache will call the configured fetcher to get the value. A lock is also set on the key while the value is retrieved. When the value is retrieved, it is saved in the cache and used to call callback. Other pending requests to get this key are also resolved.

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| skipFetch | (boolean) | false | If true, will not try to fetch the value if it doesn't exist in the cache. |
| forceFetch | (boolean) | false | If true, will always re-fetch from the configured fetcher and not use the cache. |

set(key, value, [options], callback)

Set a value in the cache. Will call callback (an error-first callback) with true/false for success when done.

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| staleTtl | (integer, ms) | crispCache.defaultStaleTtl | How long the cache entry is valid before becoming stale. |
| expiresTtl | (integer, ms) | crispCache.defaultExpiresTtl | If >0, cache entries that are older than this time will be deleted |
| size | (integer) | 1 | Required when maxSize is set on the cache; specifies the size of this cache entry. |

del(key, [callback])

Removes the provided key (a string) from the cache; will call callback (an error-first callback) when the delete is done.

getUsage([options])

Returns some basic usage when using maxSize/LRU capabilities.

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| keysLimit | (integer) | 0 | Limit the returned keys array to this value. None by default (fastest) |

Returns: An object describing the current cache state, including a sorted keys array. The keys are sorted by size when LRU is enabled; otherwise they are in alphabetical order.

{
	size (integer),
	maxSize (integer),
	hitRatio (integer),
	getSetRatio (integer),
	get: {
		count (integer),
		hit (integer),
		miss (integer),
		stale (integer)
	},
	set: {
		count (integer)
	},
	count (integer), // The total number of keys in the cache (even expired ones)
	keys: [
		{
			key (string),
			size (integer)
		},
		...
	]
}

resetUsage()

Reset usage stats back to zero. Subsequent calls to getUsage() will only reflect activity since the last time resetUsage() was called. Stats like size, maxSize, count, and keys aren't reset since those are derived from options or cached data.

CrispCache.wrap(originalFn, [options])

Wraps an asynchronous function in a CrispCache cache. This allows you to easily create cached versions of functions, which implement the same interface as the original functions.

For example:

var cachedReadFile = Cache.wrap(fs.readFile, {
  // Create a cache key from the original function arguments
  createKey: function(filePath, encoding) {
	  return [filePath, encoding].join('__');
  }),
  // Convert your cache key back to an array of arguments
  // to pass to the original function
  parseKey: function(key) {
	  return key.split('__');
  },
  // Update cache entry options, based on the cached value, 
  // and the original function arguments.
  // Accepts all of the same options as CrispCache#set()
  getOptions: function(data, args) {
	  return {
		  size: data.length,
		  expiresTtl: new RegExp('/tmp').test(args[0]) ? 0 : 1000 * 60
	  };
  },
  // Accepts all of the same options as `new CrispCache`
  defaultExpiresTtl: 1000 * 60,
  maxSize: 1024 * 1024 * 5
});

// cachedReadFile has the same signature as `fs.readFile`
cachedReadFile('/path/to/file', 'utf8', function(err, contents) {
  // contents are now cached

  // Calling the cached function again will return the cached value
  cachedReadFile('/path/to/file', 'utf8', /*... */)
});

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| createKey | (Function) | If omitted, a static key will be used for all calls to the cached function. | Create a unique cache key using the function arguments. |
| parseKey | (Function) | Not required if createKey is omitted (in which case, the original function will receive no arguments besides the callback). | Convert a cache key into an array of function arguments. This should be the inverse of createKey (parseKey(createKey(key)) === key). |
| events | (Object) | null | A list of callbacks for events, keyed by the event name. Ex. { fetch: function(fetchInfo) { console.log(fetchInfo.key); } } will log each key that is fetched from the original data source. |
| ... | | | All options accepted by the CrispCache constructor are also accepted by CrispCache.wrap. See the new CrispCache() documentation. |

Note: The underlying cache instance is exposed via CrispCache.wrap()._cache. Be careful with this, as the keys are computed with the provided createKey function.

Advanced Usage

Events

Events are emitted by Crisp Cache when the emitEvents creation option is enabled (true by default). The following methods emit events:

get

| Event Name | Fired When | Arguments |
| ---------- | ---------- | --------- |
| hit | The cache is hit | { key, entry } key being the requested key, entry the found cache entry (entry.value may be helpful) |
| miss | There is a cache miss | { key } key being the requested key |

fetch

When the fetcher (the function provided to keep the cache up to date, configured at creation) is called internally, Crisp Cache will emit the following:

| Event Name | Fired When | Arguments |
| ---------- | ---------- | --------- |
| fetch | Right before fetch() is called | { key } key being the requested key |
| fetchDone | Once fetch returns with a value | { key, value, options } key being the requested key, value the value returned from fetch(), and options the caching options returned. |

del

| Event Name | Fired When | Arguments |
| ---------- | ---------- | --------- |
| delete | An entry is deleted from the cache | { key, entry } key being the requested key, entry the found cache entry (entry.value may be helpful) |

staleCheck

When the stale check is called (on the configured interval) the following events will be emitted:

| Event Name | Fired When | Arguments |
| ---------- | ---------- | --------- |
| staleCheck | Right before the stale check loop is called | none |
| staleCheckDone | After the stale check is complete | [ key0, key1, ... ] an array of keys that were sent to the fetcher to be re-fetched. |

evictCheck

When the evict check is called (on the configured interval) the following events will be emitted:

| Event Name | Fired When | Arguments |
| ---------- | ---------- | --------- |
| evictCheck | Right before the evict check loop is called | none |
| evictCheckDone | After the evict check is complete | { key: cacheObj, key2: cacheObj, ... } a cache-like object of keys and cache entries that were evicted from the cache. |

Dynamic TTLs

TTLs can be set on a per-item basis in the fetcher callable provided to Crisp Cache.

Let's say we want to create a cache for data we know expires every minute (60,000 ms). Our data source will provide how long ago each record was created. We can dynamically set our TTL so we are never serving bad data.

var CrispCache = require('crisp-cache');

var MAX_AGE = 60000;
var data = {
    a: {
        name: "Aaron",
        createdAgo: 12000
    },
    b: {
        name: "Betsy",
        createdAgo: 24000
    },
    c: {
        name: "Charlie",
        createdAgo: 35000
    }
};
function fetcher(key, callback) {
    var record = data[key];
    if (record) {
        var timeLeft = MAX_AGE - record.createdAgo;
        return callback(null, record, {expiresTtl: timeLeft});
    }
    else {
        return callback(new Error("Record with key: " + key + " wasn't found"));
    }
}

var crispCacheBasic = new CrispCache({
    fetcher: fetcher
});

crispCacheBasic.get('a', function (err, value) {
    //CrispCache will keep "a" in the cache for 48 seconds (60 - 12)
});

What about stale times?

The previous example is great, but can we be smarter about how we fetch data?

If we want a high-throughput application, we can ensure users of the cache get fast results by using a stale TTL in concert with an expires TTL.

[Same MAX_AGE and data as in the above example]

function fetcher(key, callback) {
    var record = data[key];
    if (record) {
        var staleTime = MAX_AGE - record.createdAgo;
        var expiresTime = staleTime + 10000;
        return callback(null, record, { staleTtl: staleTime, expiresTtl: expiresTime });
    }
    else {
        return callback(new Error("Record with key: " + key + " wasn't found"));
    }
}

var crispCacheBasic = new CrispCache({
    fetcher: fetcher,
    staleCheckInterval: 5000 //Check for stale records every 5 seconds
});

crispCacheBasic.get('a', function (err, value) {
    // CrispCache will keep "a" in the cache for 58 seconds max (60 - 12 + our 10 second buffer)
    // NOTE: CrispCache will automatically look for stale records and try to update them in the background.
    // Users will get near immediate response times when looking key 'a', users looking for 'a' around 48 seconds after
    //    it was cached may still see the original value for 'a', but CrispCache is in the background asking for an 
    //    update to the stale data. When new data is available, users requesting 'a' will get the new record instead.
});

maxSize and LRU

If a maxSize option is provided, a Least Recently Used (LRU) module is loaded to handle evicting cache entries that haven't been touched in a while. This helps us maintain a maxSize for the cache.

We can create and use a new cache using maxSize:

var crispCacheBasic = new CrispCache({
    fetcher: fetcher,
    maxSize: 10
});

// Call the following in series (taking small liberties with the callbacks)
crispCacheBasic.set("testA", "The Value A", {size: 2}, callback);
crispCacheBasic.set("testB", "The Value B", {size: 8}, callback);
crispCacheBasic.set("testC", "The Value C", {size: 5}, callback);

The cache will then contain just the testC entry. The testA entry was added first, then testB. Both are held in cache because their combined size meets the maxSize of 10 without exceeding it. When testC is added, however, the cache finds that testA is the oldest and removes it. Seeing that the cache is still too large (testC's 5 + testB's 8 > our maxSize of 10), it removes testB too, leaving just testC in the cache.

Error Handling

CrispCache handles errors returned by the fetcher differently depending on the state of your cache. The intent of this behavior is to smooth out hiccups in flaky asynchronous services by using a valid cached value whenever possible.

While your cache is empty, fetcher errors will be propagated:

var cache = new CrispCache({
  fetcher: function(key, cb) {
    cb(new Error());
  }
});

cache.get('key', function(err, val) {
  // err!
});

While your cache is valid or stale, fetcher errors will be ignored, and the last available value will be used:

var i = 0;
var cache = new CrispCache({
  fetcher: function(key, cb) {
	  // Return a value, on the first request
	  if (i === 0) {
		i++;
		return cb(null, 'first value');
	  }
	  // Return errors, after the first request
	  cb(new Error());
  },
  defaultStaleTtl: 1000 * 60,
  defaultExpiresTtl: 1000 * 60 * 5,
});

cache.get('key', function(err, val) {
	// val === 'first value';
});
 
//...anytime within the next 5 minutes...
cache.get('key', function(err, val) {
	// val === 'first value'
});

While your cache is expired, fetcher errors will be propagated:

var i = 0;
var cache = new CrispCache({
  fetcher: function(key, cb) {
	  // Return a value, on the first request
	  if (i === 0) {
		i++;
		return cb(null, 'first value');
	  }
	  // Return errors, after the first request
	  cb(new Error());
  },
  defaultStaleTtl: 1000 * 60,
  defaultExpiresTtl: 1000 * 60 * 5,
});

cache.get('key', function(err, val) {
	// val === 'first value';
});
 
// ...5 minutes later...
cache.get('key', function(err, val) {
	// err!
});

Caching errors

If you want all errors to be propagated, you could wrap CrispCache to cache errors like so:

function asyncFn(key, cb) {
	// ...
}

var cache = new CrispCache({
	fetcher: function(key, cb) {
		asyncFn(key, function(err, val) {
			// Cache the error, as though it were a value,
			// but only for 30 seconds (the fetcher callback's
			// third argument accepts the same options as set())
			if (err) {
				return cb(null, err, {expiresTtl: 1000 * 30});
			}
			// Cache regular values for 5 minutes
			cb(null, val, {expiresTtl: 1000 * 60 * 5});
		});
	}
});

function cachedAsyncFn(key, cb) {
	cache.get(key, function(err, val) {
		// Return error-type values as errors
		if (val instanceof Error) {
			return cb(val);
		}

		cb(err, val);
	});
}

Roadmap

  • Add different caching backends (memory is the only one supported now)