shared-http-cache
Node.js utility for fetching multiple HTTP resources with browser-like cache management.
Overview
Shared HTTP Cache Semantics
This implementation models a shared HTTP cache that follows the semantics defined in RFC 9111, with explicitly documented assumptions and controlled decisions. It uses a content-addressed cache backed by cacache, which provides lockless, high-concurrency access and stores the associated HTTP response headers in the cache metadata alongside the downloaded content. Overall, the design aims to provide browser-like caching behavior in a Node.js environment.
Scope and assumptions
The cache is shared (not private) and applies shared-cache rules.
- Request methods other than GET are not stored.
- Private responses (request Authorization header, response Cache-Control="private" or Set-Cookie headers) are not stored.
- Variant responses (response header Vary="*") are not stored.
- Partial content (response Content-Range header) is not stored.
- Time calculations rely exclusively on locally recorded timestamps, not on the server-provided Date header.
- No heuristic freshness is used.
- Storage and eviction are deterministic; no background or implicit cleanup is assumed.
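The storability rules above can be sketched as a single predicate. This is a minimal illustration using Fetch API Request/Response objects; the helper name isStorable is not part of the library.

```js
// Minimal sketch of the scope rules; not the library's internal implementation.
function isStorable(request, response) {
  if (request.method !== 'GET') return false;                            // only GET responses are stored
  if (request.headers.has('authorization')) return false;                // authorized (private) requests
  const cacheControl = (response.headers.get('cache-control') || '').toLowerCase();
  if (cacheControl.includes('private')) return false;                    // Cache-Control="private"
  if (response.headers.has('set-cookie')) return false;                  // Set-Cookie responses
  if ((response.headers.get('vary') || '').trim() === '*') return false; // Vary="*"
  if (response.headers.has('content-range')) return false;               // partial content
  return true;
}
```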
Request initialization and cache lookup
Each request begins by determining whether a cached response exists.
If no cached entry exists:
- If the request Cache-Control header has the only-if-cached directive, the cache returns a 504 HTTP status.
- Otherwise, the request is sent to the origin server.
Cache-Control exclusions
Two Cache-Control directives may short-circuit normal cache usage:
- no-cache (request or response): cached data cannot be used without revalidation.
- no-store (request or response): the response must not be stored. When no-store applies, the response is served directly and bypasses storage entirely.
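For illustration, a naive directive parser is sketched below; parseCacheControl is a hypothetical helper, not an export of this package.

```js
// Hypothetical helper: turn a Cache-Control header value into a directive map.
function parseCacheControl(value = '') {
  return value.split(',').reduce((directives, part) => {
    const [name, val] = part.trim().split('=');
    if (name) directives[name.toLowerCase()] = val === undefined ? true : val.replace(/"/g, '');
    return directives;
  }, {});
}

const directives = parseCacheControl('no-cache, max-stale=3600');
console.log(directives['no-cache']);  // true  -> cached data requires revalidation
console.log(directives['max-stale']); // '3600'
```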
Freshness evaluation
If a cached response exists and is not excluded, strict freshness is evaluated first.
Freshness lifetime is computed from response headers:
- the Cache-Control header's s-maxage directive first, then max-age, if present
- otherwise the Expires header, if present
Current age is derived from local metadata:
currentAge = now − storedTime + incomingAge
The incomingAge is taken from the stored response Age header, if present.
Remaining freshness:
remainingFreshness = freshnessLifetime − currentAge
If the request Cache-Control header includes the min-fresh directive, its value is deducted from remainingFreshness:
remainingFreshness = remainingFreshness − minimumFreshness
If remainingFreshness ≥ 0, the response is served as fresh.
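A sketch of this computation, assuming the directives have already been parsed into plain objects and storedTime is the locally recorded timestamp in milliseconds (all names are illustrative):

```js
// Illustrative freshness evaluation based on locally recorded timestamps (seconds).
function isFresh({ responseDirectives, requestDirectives, expires, storedTime, incomingAge = 0 }) {
  // Freshness lifetime: s-maxage first, then max-age, otherwise the Expires header.
  let freshnessLifetime = 0;
  if (responseDirectives['s-maxage'] !== undefined) freshnessLifetime = Number(responseDirectives['s-maxage']);
  else if (responseDirectives['max-age'] !== undefined) freshnessLifetime = Number(responseDirectives['max-age']);
  else if (expires) freshnessLifetime = (Date.parse(expires) - storedTime) / 1000;

  // currentAge = now − storedTime + incomingAge
  const currentAge = (Date.now() - storedTime) / 1000 + incomingAge;

  // remainingFreshness = freshnessLifetime − currentAge, reduced by min-fresh when requested.
  let remainingFreshness = freshnessLifetime - currentAge;
  if (requestDirectives['min-fresh'] !== undefined) remainingFreshness -= Number(requestDirectives['min-fresh']);

  return remainingFreshness >= 0;
}
```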
Stale handling
If the response is stale, request Cache-Control header's max-stale directive is evaluated, if present.
If max-stale is present but its value is unspecified, any staleness is accepted; otherwise the response is acceptable if:
currentAge ≤ freshnessLifetime + maximumStaleness
If staleness exceeds the acceptable max-stale, the cache proceeds toward revalidation or origin fetch.
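The max-stale decision can be sketched as follows, where maxStale is true when the directive is present without a value (as in the parser sketch above) and undefined when absent:

```js
// Illustrative: decide whether a stale response is still acceptable under max-stale.
function isAcceptablyStale({ currentAge, freshnessLifetime, maxStale }) {
  if (maxStale === undefined) return false; // no max-stale directive: proceed toward revalidation
  if (maxStale === true) return true;       // max-stale without a value: accept any staleness
  return currentAge <= freshnessLifetime + Number(maxStale);
}
```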
Revalidation constraints
Even when the request Cache-Control header's max-stale directive allows use of stale data:
- The response Cache-Control header's must-revalidate or proxy-revalidate directives forbid serving stale content. In that case, the cache must revalidate or fetch from the origin.
- If the request Cache-Control header's only-if-cached directive also applies, the cache returns a 504 HTTP status instead of revalidating.
- If no revalidation constraint applies, stale content may be served.
On revalidation, if the cached content includes an ETag or Last-Modified header, an If-None-Match or If-Modified-Since header, respectively, is automatically added to the request. Revalidated entries are explicitly replaced during each successful fetch to avoid unbounded growth in the index.
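Conceptually, the conditional headers are derived from the stored response headers along these lines (a sketch, not the library's internal code):

```js
// Illustrative: build revalidation headers from the cached response headers.
function conditionalHeaders(storedHeaders) {
  const headers = {};
  if (storedHeaders['etag']) headers['If-None-Match'] = storedHeaders['etag'];
  if (storedHeaders['last-modified']) headers['If-Modified-Since'] = storedHeaders['last-modified'];
  return headers;
}
// e.g. { 'If-None-Match': '"abc123"' } when the cached response carried an ETag.
```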
Subresource integrity
The subresource integrity specification is implemented for both fetch and storage:
- if a request does not provide an integrity hash, then cacache will compute it using its default algorithm (sha512), and subsequent requests may use that hash to retrieve the resource directly from cache with the store.get.byDigest function, in addition to the regular lookup by URL with store.get.
- if a request does provide an integrity hash:
  - if the related resource is not stored, then node:fetch will use it to verify the incoming response;
  - if the related resource is stored and fresh, or is stale but revalidated with the origin server, then cacache will use the hash to:
    - get the resource directly from cache if the stored integrity hash matches the one provided, or
    - recompute the path and rebase the resource on the new path if the provided integrity hash is different.

In other words, multiple integrity hashes may validate a resource, but only the last provided hash is responsible for its storage path, as cacache can work with a single algorithm at a time. This situation may only be encountered when a resource was initially stored without an integrity hash provided in the request, or when a different integrity hash is provided in the request for the same resource. An exception to the rebase is the case when a different integrity hash is provided in the request along with the Cache-Control header's max-stale directive.
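As a sketch of how the stored integrity can be reused through the exposed store (see Storage management below; the URL is an illustrative assumption):

```js
const SharedHttpCache = require('shared-http-cache');
const sharedHttpCache = new SharedHttpCache({ cacheDir: '.cache' });

// get.info resolves to the index entry, or to nothing when the key is not stored.
const info = await sharedHttpCache.store.get.info(sharedHttpCache.cacheDir, 'https://example.com/file.bin');
if (info) {
  // info.integrity holds the stored hash (sha512 by default); reading by digest
  // skips the index lookup that store.get performs.
  const data = await sharedHttpCache.store.get.byDigest(sharedHttpCache.cacheDir, info.integrity);
  console.log(data.length);
}
```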
Origin request outcomes
When a request is sent to the origin:
- 2xx: the response is stored (unless restricted) and served.
- 304 Not Modified: cached metadata is updated; the response is served as fresh.
- 410 Gone: the cached entry is removed.
- Other responses: treated as errors and returned directly.
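A sketch of this dispatch (the handler names are hypothetical):

```js
// Illustrative: dispatch on the origin response status.
function handleOriginResponse(response, { storeAndServe, refreshMetadata, removeEntry }) {
  if (response.status >= 200 && response.status < 300) return storeAndServe(response); // store (unless restricted) and serve
  if (response.status === 304) return refreshMetadata(response);                       // serve cached content as fresh
  if (response.status === 410) return removeEntry();                                   // drop the cached entry
  throw new Error(`Unexpected response status: ${response.status}`);                   // other responses are errors
}
```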
Cleanup behavior
The cache does not:
- apply heuristic freshness
- perform automatic eviction based on staleness
However:
- a 410 Gone response explicitly removes the cached entry.
- additional cleanup mechanisms are available to the user via the underlying storage system.
State diagram
The accompanying state diagram represents the full decision flow:

Legend
- no-cache may appear on request or response and always requires revalidation.
- no-store may appear on request or response; see scope and assumptions for storage limitations.
- Freshness evaluation excludes max-stale, which is evaluated only after strict freshness fails.
- 410 Gone cleanup is an explicit design choice to keep the cache coherent; no heuristic eviction is used.
Install
npm i shared-http-cache
Usage
Init
new SharedHttpCache(options?) -> SharedHttpCache
const SharedHttpCache = require('shared-http-cache');
const sharedHttpCache = new SharedHttpCache();
Init options
new SharedHttpCache({ cacheDir?: string, requestTimeoutMs?: number, awaitStorage?: boolean, deferGarbageCollection?: boolean }) -> SharedHttpCache
- cacheDir: cache storage directory (default: .cache).
- requestTimeoutMs: amount of time in milliseconds after which a request is timed out (default: 5000).
- awaitStorage: await cache writes before continuing (default: false).
- deferGarbageCollection: defer garbage collection to a later action (default: true). If false, the stored content index file is replaced with a clean new one, impacting performance.
const sharedHttpCache = new SharedHttpCache({ cacheDir: '/tmp/http-cache', awaitStorage: true, requestTimeoutMs: 1000 });
Fetch
fetch is the only method available. On success, fetch resolves to the same instance, enabling chained workflows.
sharedHttpCache.fetch(requests) -> Promise<this | Error[]>
Syntax:
fetch([{ url: string, integrity?: string, options?: RequestInit, callback?: function }]) -> Promise<this | Error[]>
Simple fetch call
await sharedHttpCache.fetch([
{
url: 'https://example.com/data.txt',
callback: ({ buffer }) => console.log(buffer.toString()),
},
]);
Fetch with callback and error handling
Errors encountered during fetches are collected, and the returned promise either resolves with the instance itself for successful fetches or rejects with a list of errors for failed requests.
The response is converted into a Buffer served to callback, then stored in the cache along with the response headers.
callback({ buffer: Buffer, headers: Headers, fromCache: boolean, index: number }) -> void
The callback provided for each request is executed before storing new content, allowing implementers to inspect, transform or validate the data before it is cached. Errors thrown by the callback are also caught and included in the errors delivered by the rejected promise.
await sharedHttpCache
.fetch([
{
url: 'https://example.com/data.txt',
callback: ({ buffer, headers, fromCache, index }) => {
console.log(buffer.toString());
console.log(headers);
console.log(index, fromCache);
},
},
])
.catch((errors) => errors.forEach((entry) => console.error(entry.index, entry.url, entry.error.message)));
Fetch multiple files
const urls = ['https://example.com/file1', 'https://example.com/file2'];
const parser = ({ url, buffer, headers, fromCache, index }) => {
console.log(index, fromCache, url);
console.log(headers);
console.log(buffer.toString());
};
const requests = urls.map((url) => ({ url, callback: (response) => parser({ ...response, url }) }));
sharedHttpCache.fetch(requests).catch((errors) => errors.forEach((entry) => console.error(entry.index, entry.url, entry.error.message)));
Fetch with integrity
await sharedHttpCache.fetch([
{
url: 'https://example.com/file.bin',
integrity: 'sha256-abcdef...',
callback: ({ buffer }) => console.log(buffer.length),
},
]);
Fetch options
fetch.options -> RequestInit
fetch.options are passed directly to node:fetch.
They follow standard RequestInit semantics (method, credentials, headers, mode, cache-mode, etc.).
Fetch with Accept: application/json
await sharedHttpCache.fetch([
{
url: 'https://api.example.com/list',
options: { headers: { Accept: 'application/json' } },
callback: ({ buffer }) => console.log(buffer.toString()),
},
]);
Fetch with Cache-Control: no-cache
await sharedHttpCache.fetch([
{
url: 'https://example.com/data',
options: { headers: { 'Cache-Control': 'no-cache' } },
callback: ({ fromCache }) => console.log(fromCache),
},
]);
Fetch with Cache-Control: max-stale
await sharedHttpCache.fetch([
{
url: 'https://example.com/data',
options: { headers: { 'Cache-Control': 'max-stale=3600' } },
callback: ({ fromCache }) => console.log(fromCache),
},
]);
Fetch with HEAD method
await sharedHttpCache.fetch([
{
url: 'https://example.com/resource',
options: { method: 'HEAD' },
callback: ({ headers }) => console.log(headers),
},
]);
Storage management
The underlying cache store (cacache) is exposed directly.
sharedHttpCache.store -> cacache
Listing (example with promise)
sharedHttpCache
.fetch(requests)
.then((sharedHttpCache) => sharedHttpCache.store.ls(sharedHttpCache.cacheDir))
.then(console.log)
.catch((errors) => console.error('Errors:', errors));
Compacting (example with await)
sharedHttpCache.store.verify(cacheDir) -> Promise<Object>
// deadbeef collected, because of invalid checksum.
const stats = await sharedHttpCache.store.verify(sharedHttpCache.cacheDir);
console.log('cache is much nicer now! stats:', stats);
Basic cleanup strategy
const SharedHttpCache = require('shared-http-cache');
// only-if-cached also means ... and is not stale!
(async () => {
const cache = new SharedHttpCache({ cacheDir: '.cache', awaitStorage: true });
const entries = await cache.store.ls(cache.cacheDir);
const requests = Object.keys(entries).map((url) => ({ url, options: { headers: { 'cache-control': 'only-if-cached' } } }));
await cache.fetch(requests).catch(async (errors) => {
for (const { url } of errors) {
const file = url && await cache.store.get.info(cache.cacheDir, url);
if (file) {
await cache.store.rm.entry(cache.cacheDir, url, { removeFully: true });
await cache.store.rm.content(cache.cacheDir, file.integrity);
}
}
});
})();
Note:
- This is a fully RFC 9111-compliant strategy that cleans up all the resources that can be determined as expired based on the stored response headers. For a more flexible approach, the max-stale=$acceptedStaleness directive can be used in conjunction with only-if-cached. Cleanup strategies that rely on empirical calculations, such as least recently used, are NOT RECOMMENDED.
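For example, the more flexible variant only changes how the requests are built in the strategy above; the 86400-second tolerance is an illustrative choice:

```js
// Accept up to one day of staleness before an entry is treated as expired.
const requests = Object.keys(entries).map((url) => ({
  url,
  options: { headers: { 'cache-control': 'only-if-cached, max-stale=86400' } },
}));
```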
Other available operations
- sharedHttpCache.store.put(...)
- sharedHttpCache.store.get(...)
- sharedHttpCache.store.get.info(...)
- sharedHttpCache.store.rm.entry(...)
- sharedHttpCache.store.rm.content(...)
See full list of cacache options.
Bottom line
- max-stale is intended to be used: many servers enforce max-age=0, but clients know how much staleness they can tolerate. Using max-stale (recommended up to 24 h) can significantly reduce network requests.
- providing integrity on requests enables fast loads by allowing cached content to be read directly from the store.
- SharedHttpCache init with awaitStorage: true is important when fetch is continued with store actions.
- private or sensitive content is served, but not stored.
- cache cleanup and eviction are deliberately left to the consumer; a well-chosen cleanup strategy is essential for maintaining good performance.
