
shared-http-cache

v1.0.12

Published

Node.js utility for fetching multiple HTTP resources with browser-like cache management.





Overview

Shared HTTP Cache Semantics

This implementation models a shared HTTP cache that follows the semantics defined in RFC 9111, with explicitly documented assumptions and controlled decisions. It uses a content-addressed cache backed by cacache, which provides lockless, high-concurrency cache access and stores the associated HTTP response headers alongside the downloaded content in the cache metadata. Overall, the design aims to provide browser-like caching behavior in a Node.js environment.

Scope and assumptions

The cache is shared (not private) and applies shared-cache rules.

  • Request methods other than GET are not stored.
  • Private responses (request Authorization header, response Cache-Control: private directive, or Set-Cookie header) are not stored.
  • Variant responses (response Vary: * header) are not stored.
  • Partial content (response Content-Range header) is not stored.
  • Time calculations rely exclusively on locally recorded timestamps, not on server-provided Date.
  • No heuristic freshness is used.
  • Storage and eviction are deterministic; no background or implicit cleanup is assumed.

Request initialization and cache lookup

Each request begins by determining whether a cached response exists.

If no cached entry exists:

  • If the request Cache-Control header includes the only-if-cached directive, the cache returns an HTTP 504 status.
  • Otherwise, the request is sent to the origin server.
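The cache-miss decision above can be sketched as follows; decideOnMiss is a hypothetical helper with simplified directive parsing, not part of the library's API:

```javascript
// Hypothetical sketch of the cache-miss decision described above.
// The { status: 504 } result shape and the 'fetch-origin' action name are illustrative.
function decideOnMiss(requestCacheControl) {
    const directives = (requestCacheControl || '')
        .toLowerCase()
        .split(',')
        .map((d) => d.trim());
    // only-if-cached with no cached entry short-circuits to an HTTP 504
    if (directives.includes('only-if-cached')) return { status: 504 };
    return { action: 'fetch-origin' };
}
```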

Cache-Control exclusions

Two Cache-Control directives may short-circuit normal cache usage:

  • no-cache (request or response): cached data cannot be used without revalidation.
  • no-store (request or response): the response must not be stored. When no-store applies, the response is served directly and bypasses storage entirely.
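A minimal sketch of this short-circuit check, assuming simplified directive parsing (no quoted values); exclusionFor is illustrative, not the library's API:

```javascript
// Hypothetical helper: detect the two short-circuit directives on either side.
// no-store wins when both appear, since it bypasses storage entirely.
function exclusionFor(requestCC = '', responseCC = '') {
    const directives = `${requestCC},${responseCC}`
        .toLowerCase()
        .split(',')
        .map((d) => d.trim().split('=')[0]);
    if (directives.includes('no-store')) return 'no-store'; // serve, never store
    if (directives.includes('no-cache')) return 'no-cache'; // revalidate before use
    return null; // normal cache processing
}
```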

Freshness evaluation

If a cached response exists and is not excluded, strict freshness is evaluated first.

Freshness lifetime is computed from response headers:

  • Cache-Control header's s-maxage directive first, then max-age, if present
  • otherwise Expires header, if present

Current age is derived from local metadata:

currentAge = now − storedTime + incomingAge

The incomingAge is taken from the stored response Age header, if present.

Remaining freshness:

remainingFreshness = freshnessLifetime − currentAge

If request Cache-Control header includes min-fresh directive, its value is deducted from the remainingFreshness:

remainingFreshness = remainingFreshness − minimumFreshness

If remainingFreshness ≥ 0, the response is served as fresh.
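The formulas above can be sketched in code; header parsing here is simplified and the function names are illustrative, not part of the library's API:

```javascript
// Freshness lifetime: s-maxage first, then max-age (converted to milliseconds).
// No heuristic freshness: absent directives yield a lifetime of 0.
function freshnessLifetime(responseHeaders) {
    const cc = (responseHeaders['cache-control'] || '').toLowerCase();
    const sMaxage = cc.match(/s-maxage=(\d+)/);
    if (sMaxage) return Number(sMaxage[1]) * 1000;
    const maxAge = cc.match(/max-age=(\d+)/);
    if (maxAge) return Number(maxAge[1]) * 1000;
    return 0; // an Expires fallback would go here
}

// remainingFreshness = freshnessLifetime - currentAge - min-fresh
// currentAge relies on the locally recorded storedTime, not the server Date.
function remainingFreshness(responseHeaders, storedTime, now, minFreshSec = 0) {
    const incomingAge = Number(responseHeaders['age'] || 0) * 1000;
    const currentAge = now - storedTime + incomingAge;
    return freshnessLifetime(responseHeaders) - currentAge - minFreshSec * 1000;
}
```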

Stale handling

If the response is stale, request Cache-Control header's max-stale directive is evaluated, if present.

If max-stale is present without a value, any staleness is accepted; otherwise the response is acceptable if:

currentAge ≤ freshnessLifetime + maximumStaleness

If staleness exceeds the acceptable max-stale, the cache proceeds toward revalidation or origin fetch.
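A sketch of the max-stale acceptance rule, with illustrative names; here undefined means the directive is absent and null means it is present without a value:

```javascript
// Hypothetical check for the max-stale rule above (all values in milliseconds).
function staleAcceptable(currentAge, freshnessLifetime, maxStale) {
    if (maxStale === undefined) return false; // directive absent: stale not acceptable
    if (maxStale === null) return true;       // bare max-stale: any staleness accepted
    return currentAge <= freshnessLifetime + maxStale;
}
```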

Revalidation constraints

Even when the request Cache-Control header's max-stale directive would allow use of stale data:

  • response Cache-Control header's must-revalidate or proxy-revalidate directives forbid serving stale.
  • In that case, the cache must revalidate or fetch from the origin.
  • If request Cache-Control header's only-if-cached directive also applies, the cache returns a 504 HTTP status instead of revalidating.
  • If no revalidation constraint applies, stale content may be served.

On revalidation, if the cached content includes an ETag or Last-Modified header, an If-None-Match or If-Modified-Since header, respectively, is automatically added to the request. Revalidated entries are explicitly replaced during each successful fetch to avoid unbounded growth in the index.
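The conditional-header construction can be sketched as below; revalidationHeaders is a hypothetical helper, not the library's internals:

```javascript
// Build conditional request headers from a cached response's validators.
// Header names on the cached side are assumed to be lowercased.
function revalidationHeaders(cachedHeaders) {
    const headers = {};
    if (cachedHeaders['etag']) headers['If-None-Match'] = cachedHeaders['etag'];
    if (cachedHeaders['last-modified']) headers['If-Modified-Since'] = cachedHeaders['last-modified'];
    return headers;
}
```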

Subresource integrity

The subresource integrity specifications are implemented for both fetch and storage:

  1. If a request does not provide an integrity hash, cacache computes one using its default algorithm (sha512); subsequent requests may use that hash to retrieve the resource directly from the cache via the store.get.byDigest function, in addition to the regular URL lookup via store.get.
  2. If a request does provide an integrity hash:
    • if the related resource is not stored, then node:fetch will use it to verify the incoming response,
    • if the related resource is stored and fresh, or is stale but revalidated with the origin server, then cacache will use the hash to:
      • get the resource directly from cache if the stored integrity hash matches the one provided, or,
      • recompute the path and re-store (rebase) the resource under the new path if the provided integrity hash differs. In other words, multiple integrity hashes may validate a resource, but only the most recently provided hash determines its storage path, since cacache works with a single algorithm at a time. This situation only arises when a resource was initially stored without an integrity hash, or when a different integrity hash is later provided for the same resource. The one exception to rebasing is when a different integrity hash is provided together with the Cache-Control header's max-stale directive.

Origin request outcomes

When a request is sent to the origin:

  • 2xx: response is stored (unless restricted) and served.
  • 304 Not Modified: cached metadata is updated; response is served as fresh.
  • 410 Gone: cached entry is removed.
  • Other responses: treated as errors and returned directly.
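The outcome mapping above can be sketched as a single dispatch; the action names are illustrative only:

```javascript
// Hypothetical mapping from origin status codes to the actions listed above.
function originOutcome(status) {
    if (status >= 200 && status < 300) return 'store-and-serve';
    if (status === 304) return 'refresh-metadata-and-serve-cached';
    if (status === 410) return 'remove-cached-entry';
    return 'error';
}
```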

Cleanup behavior

The cache does not:

  • apply heuristic freshness
  • perform automatic eviction based on staleness

However:

  • a 410 Gone response explicitly removes the cached entry.
  • additional cleanup mechanisms are available to the user via the underlying storage system.

State diagram

The accompanying state diagram represents the full decision flow:

Diagram

Legend

  1. no-cache may appear on request or response and always requires revalidation.
  2. no-store may appear on request or response; see scope and assumptions for storage limitations.
  3. Freshness evaluation excludes max-stale, which is evaluated only after strict freshness fails.
  4. 410 Gone cleanup is an explicit design choice to keep the cache coherent; no heuristic eviction is used.

Install

npm i shared-http-cache

Usage

Init

new SharedHttpCache(options?) -> SharedHttpCache
const SharedHttpCache = require('shared-http-cache');
const sharedHttpCache = new SharedHttpCache();

Init options

new SharedHttpCache({ cacheDir?: string, requestTimeoutMs?: number, awaitStorage?: boolean, deferGarbageCollection?: boolean }) -> SharedHttpCache
  • cacheDir: cache storage directory (default .cache).
  • requestTimeoutMs: amount of time in milliseconds after which a request is timed out (default: 5000).
  • awaitStorage: await cache writes before continuing (default false).
  • deferGarbageCollection: defer garbage collection to a later action (default true). If false, the stored content index file is replaced with a clean new one, which impacts performance.
const sharedHttpCache = new SharedHttpCache({ cacheDir: '/tmp/http-cache', awaitStorage: true, requestTimeoutMs: 1000 });

Fetch

fetch is the only method available. On success, fetch resolves to the same instance, enabling chained workflows.

sharedHttpCache.fetch(requests) -> Promise<this | Error[]>

Syntax:

fetch([{ url: string, integrity?: string, options?: RequestInit, callback?: function }]) -> Promise<this | Error[]>

Simple fetch call

await sharedHttpCache.fetch([
    {
        url: 'https://example.com/data.txt',
        callback: ({ buffer }) => console.log(buffer.toString()),
    },
]);

Fetch with callback and error handling

Errors encountered during fetches are collected; the returned promise resolves with the instance itself when the fetches succeed, or rejects with a list of errors for the failed requests.

The response is converted into a Buffer, passed to the callback, and then stored in the cache along with the response headers.

callback({ buffer: Buffer, headers: Headers, fromCache: boolean, index: number }) -> void

The callback provided for each request is executed before new content is stored, allowing implementers to inspect, transform, or validate the data before it is cached. Errors thrown by the callback are also caught and included in the errors delivered by the promise rejection.

await sharedHttpCache
    .fetch([
        {
            url: 'https://example.com/data.txt',
            callback: ({ buffer, headers, fromCache, index }) => {
                console.log(buffer.toString());
                console.log(headers);
                console.log(index, fromCache);
            },
        },
    ])
    .catch((errors) => errors.forEach((entry) => console.error(entry.index, entry.url, entry.error.message)));

Fetch multiple files

const urls = ['https://example.com/file1', 'https://example.com/file2'];
const parser = ({ url, buffer, headers, fromCache, index }) => {
    console.log(index, fromCache, url);
    console.log(headers);
    console.log(buffer.toString());
};

const requests = urls.map((url) => ({ url, callback: (response) => parser({ ...response, url }) }));

sharedHttpCache.fetch(requests).catch((errors) => errors.forEach((entry) => console.error(entry.index, entry.url, entry.error.message)));

Fetch with integrity

await sharedHttpCache.fetch([
    {
        url: 'https://example.com/file.bin',
        integrity: 'sha256-abcdef...',
        callback: ({ buffer }) => console.log(buffer.length),
    },
]);

Fetch options

fetch.options -> RequestInit

fetch.options are passed directly to node:fetch. They follow standard RequestInit semantics (method, credentials, headers, mode, cache, etc.).

Fetch with Accept: application/json

await sharedHttpCache.fetch([
    {
        url: 'https://api.example.com/list',
        options: { headers: { Accept: 'application/json' } },
        callback: ({ buffer }) => console.log(buffer.toString()),
    },
]);

Fetch with Cache-Control: no-cache

await sharedHttpCache.fetch([
    {
        url: 'https://example.com/data',
        options: { headers: { 'Cache-Control': 'no-cache' } },
        callback: ({ fromCache }) => console.log(fromCache),
    },
]);

Fetch with Cache-Control: max-stale

await sharedHttpCache.fetch([
    {
        url: 'https://example.com/data',
        options: { headers: { 'Cache-Control': 'max-stale=3600' } },
        callback: ({ fromCache }) => console.log(fromCache),
    },
]);

Fetch with HEAD method

await sharedHttpCache.fetch([
    {
        url: 'https://example.com/resource',
        options: { method: 'HEAD' },
        callback: ({ headers }) => console.log(headers),
    },
]);

Storage management

The underlying cache store (cacache) is exposed directly.

sharedHttpCache.store -> cacache

Listing (example with promise)

sharedHttpCache
    .fetch(requests)
    .then((sharedHttpCache) => sharedHttpCache.store.ls(sharedHttpCache.cacheDir))
    .then(console.log)
    .catch((errors) => console.error('Errors:', errors));

Compacting (example with await)

sharedHttpCache.store.verify(cacheDir) -> Promise<Object>
const stats = await sharedHttpCache.store.verify(sharedHttpCache.cacheDir);
// deadbeef collected, because of invalid checksum.
console.log('cache is much nicer now! stats:', stats);

Basic cleanup strategy

const SharedHttpCache = require('shared-http-cache');
// only-if-cached also means ... and is not stale!
(async () => {
    const cache = new SharedHttpCache({ cacheDir: '.cache', awaitStorage: true });
    const entries = await cache.store.ls(cache.cacheDir);
    const requests = Object.keys(entries).map((url) => ({ url, options: { headers: { 'cache-control': 'only-if-cached' } } }));
    await cache.fetch(requests).catch(async (errors) => {
        for (const { url } of errors) {
            const file = url && await cache.store.get.info(cache.cacheDir, url);
            if (file) {
                await cache.store.rm.entry(cache.cacheDir, url, { removeFully: true });
                await cache.store.rm.content(cache.cacheDir, file.integrity);
            }
        }
    });
})();

Note:

  • This is a fully RFC 9111 compliant strategy that cleans up all the resources that can be determined as expired based on the stored response headers. For a more flexible approach, the max-stale=$acceptedStaleness directive can be used in conjunction with only-if-cached. Cleanup strategies that rely on empirical calculations, such as least recently used, are NOT RECOMMENDED.

Other available operations

  • sharedHttpCache.store.put(...)
  • sharedHttpCache.store.get(...)
  • sharedHttpCache.store.get.info(...)
  • sharedHttpCache.store.rm.entry(...)
  • sharedHttpCache.store.rm.content(...)

See full list of cacache options.

Bottom line

  • max-stale is intended to be used: many servers enforce max-age=0, but clients know how much staleness they can tolerate. Using max-stale (recommended up to 24 h) can significantly reduce network requests.
  • providing integrity on requests enables fast loads by allowing cached content to be read directly from store.
  • Initializing SharedHttpCache with awaitStorage: true is important when a fetch is followed by store actions.
  • private or sensitive content is served, but not stored.
  • cache cleanup and eviction are deliberately left to the consumer; a well-chosen cleanup strategy is essential for maintaining good performance.