
@callsy/cache

v1.1.1

Published

Cache utilities.


@callsy/cache

A multi-tier caching library for Node.js and TypeScript. Three cache levels — in-memory, Redis, and DynamoDB — behind a single unified API with automatic fallback population and TTL management.

Features

  • Three cache tiers — Local (in-memory), Hot (Redis), and Cold (DynamoDB) with identical interfaces.
  • Multi-tier fallback chains — link any number of cache tiers into a chain. Misses cascade down; hits promote back up. Like CPU L1/L2/L3 caches.
  • Fallback-driven caching — provide a fallback function and the cache populates itself on a miss. No manual put calls needed.
  • Automatic promotion — when a deeper tier serves a cache hit, the value is automatically promoted into every faster tier above it.
  • Write-through propagation — put, clearItem, and clearAll propagate through the entire chain, keeping all tiers consistent.
  • Two primary accessors — tryGet for nullable results, forceGet for guaranteed non-null results.
  • Automatic TTL — time-to-live support across all tiers, delegated to the native TTL mechanism of each backing store.
  • Hit tracking — built-in per-key hit counters for monitoring cache effectiveness.
  • Graceful degradation — network failures in Hot and Cold caches are logged, not thrown. Your app keeps running.
  • TypeScript-first — fully typed generics, exported interfaces, and declaration files.

Installation

npm install @callsy/cache

Requirements: Node.js >= 18.0.0

Cache tiers — when to use which

The library provides three cache implementations. All share the same API. The difference is where data lives and how long it survives.

| | LocalCache | HotCache | ColdCache |
|---|---|---|---|
| Backing store | In-memory (@isaacs/ttlcache) | Redis | AWS DynamoDB |
| Latency | Sub-millisecond, zero I/O | Low milliseconds (network) | ~10-50ms (network) |
| Persistence | Process lifetime only | Survives process restarts | Permanent until TTL or deletion |
| Shared across processes | No | Yes | Yes |
| Max entries | 10,000 (hardcoded) | Unlimited (Redis memory) | Unlimited (DynamoDB capacity) |
| External dependency | None | Redis server | AWS account + DynamoDB table |

LocalCache — fast, in-process, zero setup

Use LocalCache when you need the fastest possible reads with no network overhead. Data lives in the Node.js process memory and disappears when the process exits.

Best for:

  • High hit-rate lookups in a single long-running server.
  • Caching expensive computations (template rendering, config parsing).
  • Development and testing with no infrastructure requirements.

Not suited for:

  • Serverless functions (each invocation is a new process — the cache is always empty).
  • Multi-server deployments where cache must be shared.
import { LocalCache } from '@callsy/cache'

// Standalone — no fallback.
const cache = new LocalCache()

// Or as the front of a chain (see "Multi-tier fallback chains" below).
const chained = new LocalCache(hotCache)

HotCache — distributed, shared across processes

Use HotCache when multiple processes, servers, or serverless function instances need to share the same cached data. Backed by Redis, it survives individual process restarts and is accessible from any instance that can reach the Redis server.

Best for:

  • Distributed systems, server fleets, or auto-scaling groups.
  • Serverless functions (Lambda, Cloud Functions) where cache must persist between invocations.
  • Session-like data that multiple services need to read.

Not suited for:

  • Data that must survive a Redis restart (Redis is in-memory by default).
  • Very long-term storage (weeks/months) — use ColdCache instead.
import { HotCache } from '@callsy/cache'

// Standalone — no fallback.
const cache = new HotCache('redis://localhost:6379')

// Or with a deeper fallback tier.
const chained = new HotCache('redis://localhost:6379', coldCache)

The constructor connects to Redis immediately. Connection errors are logged to stderr but do not throw — subsequent get/put calls will fail gracefully and return null.

ColdCache — persistent, long-term, serverless

Use ColdCache for data that must survive indefinitely and is accessed infrequently. Backed by DynamoDB, it scales automatically and charges per-request — ideal for archival or low-frequency lookups.

Best for:

  • Long-lived cache entries (days, weeks, months).
  • Infrequently accessed data where Redis costs aren't justified.
  • Compliance or audit scenarios where cached results must be retained.

Not suited for:

  • High-frequency, low-latency lookups (DynamoDB adds network latency per call).
  • Use cases where sub-millisecond response matters — use LocalCache.
import { ColdCache } from '@callsy/cache'

// Standalone — no fallback.
const cache = new ColdCache('my-cache-table', {
  region: 'eu-west-1',
  credentials: {
    accessKeyId: 'AKIA...',
    secretAccessKey: '...'
  }
})

// Or with an even deeper fallback tier.
const chained = new ColdCache('my-cache-table', credentials, evenDeeperCache)

DynamoDB table requirements:

  • Partition key: key (String).
  • TTL attribute: ttl (Number, unix timestamp in seconds). Enable DynamoDB TTL on this attribute for automatic expiration.
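If the table doesn't exist yet, it can be provisioned with the AWS CLI along these lines (the table name and region are examples from this README — adjust to your setup):

```shell
# Create the table with `key` as the partition key, using on-demand billing.
aws dynamodb create-table \
  --table-name my-cache-table \
  --attribute-definitions AttributeName=key,AttributeType=S \
  --key-schema AttributeName=key,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST \
  --region eu-west-1

# Enable DynamoDB TTL on the `ttl` attribute for automatic expiration.
aws dynamodb update-time-to-live \
  --table-name my-cache-table \
  --time-to-live-specification "Enabled=true, AttributeName=ttl" \
  --region eu-west-1
```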

Quick start

import { LocalCache } from '@callsy/cache'

const cache = new LocalCache()

// Fetch a user — from cache if available, otherwise from the database.
const user = await cache.tryGet({
  prefix: 'users',
  key: 'user-123',
  fallback: async () => await db.users.findById('user-123'),
  durationSeconds: 300
})

// user is User | null
if (user) {
  console.log(user.name)
}

That's it. You never call put manually — the fallback function is invoked on a cache miss and its result is automatically stored.

Multi-tier fallback chains

Any cache tier accepts an optional fallbackCache in its constructor, forming a chain similar to CPU L1/L2/L3 caches. You interact only with the front of the chain — cascading reads, promotion, and write-through are handled automatically.

Building a chain

Chains are built deepest-first. Each tier wraps the next one.

import { LocalCache, HotCache, ColdCache } from '@callsy/cache'

const cold  = new ColdCache('my-cache-table', awsCredentials)           // L3
const hot   = new HotCache('redis://localhost:6379', cold)              // L2 → L3
const cache = new LocalCache(hot)                                       // L1 → L2 → L3

You only call methods on cache. The chain depth is unlimited and any combination of tiers works. Two-tier chains (e.g. LocalCache → HotCache) are equally valid.

Read path (tryGet)

On a tryGet call, each tier checks its own store. On a miss, it delegates to the next tier. Only the terminal cache (the one with no fallback) calls fallback(). On the way back up, each tier promotes the result into its own store so subsequent reads hit the fastest tier.

L1 miss → L2 miss → L3 miss → fallback() → cache in L3 → promote to L2 → promote to L1

After a full miss, all tiers contain the value. After a process restart, L1 is empty but L2 (Redis) serves the hit and promotes back to L1 — no database call needed.
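The cascade-and-promote behavior can be sketched with a simplified model — two Map-backed tiers, not the library itself:

```typescript
// Simplified model of the read path: a miss cascades to the next tier (or to
// the fallback at the terminal tier), and the value is promoted into each
// tier's own store on the way back up.
type Fallback<T> = () => Promise<T | null>

class TinyTier {
  private store = new Map<string, string>()
  constructor(private next?: TinyTier) {}

  async tryGet<T>(key: string, fallback: Fallback<T>): Promise<T | null> {
    const hit = this.store.get(key)
    if (hit !== undefined) return JSON.parse(hit) as T  // own-store hit

    // Miss: delegate deeper, or call the fallback at the terminal tier.
    const value = this.next
      ? await this.next.tryGet(key, fallback)
      : await fallback()

    if (value !== null) this.store.set(key, JSON.stringify(value))  // promote
    return value
  }

  has(key: string): boolean { return this.store.has(key) }
}

const hot = new TinyTier()       // deeper tier (stand-in for Redis)
const local = new TinyTier(hot)  // front of the chain
let dbCalls = 0
const fromDb = async () => { dbCalls++; return { name: 'Ada' } }

await local.tryGet('users#user-123', fromDb)  // full miss: fallback runs once
await local.tryGet('users#user-123', fromDb)  // served from the local tier
```

After the second call, the fallback has still only run once and both tiers hold the value — which is exactly why a restart of the front tier costs one deeper-tier read rather than a database call.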

Write-through

put, clearItem, and clearAll propagate through the entire chain sequentially, keeping all tiers consistent.

await cache.put({ prefix: 'config', key: 'v1', value: JSON.stringify(data) })
// Written to L1, L2, and L3.

await cache.clearItem('config', 'v1')
// Deleted from L1, L2, and L3.

fresh flag in chains

When fresh: true, every tier in the chain skips its own store. The request cascades all the way to fallback(), and the fresh result is re-cached in every tier on the way back up.

Core API: tryGet and forceGet

These two methods are the primary way to interact with every cache tier. All other methods (put, get, clearItem, etc.) are lower-level building blocks that you rarely need to call directly.

How it works

tryGet / forceGet called
       |
       v
  Cache enabled       --no--+
  and fresh != true?        |
       |                    |
      yes                   |
       |                    |
       v                    |
  Read from own store       |
       |                    |
       v                    |
  Cache hit? --yes--> Increment hit counter --> Return
       |
       no  <----------------+
       |
       v
  Has fallback cache? --yes--> fallback.tryGet(props) --> Promote result to this tier --> Return
       |
       no
       |
       v
  Call fallback() --> Cache result --> Return

In a chain, the middle step recurses — each tier delegates to the next until a hit is found or the terminal cache calls fallback().

tryGet<T>(props): Promise<T | null>

The safe accessor. Returns T on a cache hit or successful fallback, null if the result is empty or if serialization fails.

const product = await cache.tryGet<Product>({
  prefix: 'products',
  key: 'prod-456',
  fallback: async () => await api.getProduct('prod-456'),
  durationSeconds: 600
})

if (!product) {
  // Handle missing product — either the fallback returned null
  // or there was a serialization error.
  return notFound()
}

Use tryGet when:

  • The data might legitimately not exist (e.g. a database lookup that can return null).
  • You want to handle the null case yourself.
  • You prefer graceful degradation over exceptions.

forceGet<T>(props): Promise<T>

The strict accessor. Identical to tryGet, but throws CacheError if the result is null. The return type is T (never null), so you don't need a null check.

import { CacheError } from '@callsy/cache'

try {
  const config = await cache.forceGet<AppConfig>({
    prefix: 'config',
    key: 'app-settings',
    fallback: async () => await loadConfig(),
    durationSeconds: 3600
  })

  // config is AppConfig — guaranteed non-null.
  console.log(config.appName)
} catch (error) {
  if (error instanceof CacheError) {
    // The fallback returned null — this is unexpected for config.
    console.error('Critical: app config is missing')
  }
}

Use forceGet when:

  • The data must exist — a null result indicates a bug or system failure.
  • You want to catch problems early rather than propagate null through your code.
  • The data is something like configuration, feature flags, or seed data that should always be present.

FallbackGetProps<T>

Both tryGet and forceGet accept the same props object:

| Property | Type | Required | Description |
|---|---|---|---|
| prefix | string | Yes | Namespace for key separation. Combined with key as prefix#key. |
| key | string | Yes | Unique identifier within the prefix. |
| fallback | () => Promise<T> | Yes | Async function called on a cache miss. Its return value is cached. |
| durationSeconds | number | No | TTL in seconds. Omit for no expiration (tier defaults apply). |
| fresh | boolean | No | When true, skip cache and call fallback immediately. The result is still cached for subsequent calls. |

Key format: Cache keys are stored as ${prefix}#${key}. Both prefix and key must be non-empty strings — if either is empty, the method returns null immediately.
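The key composition and the empty-string guard can be sketched as follows (a hypothetical helper for illustration, not a library export):

```typescript
// Hypothetical helper illustrating the documented key format: keys are
// stored as `${prefix}#${key}`, and empty parts short-circuit to null.
function cacheKey(prefix: string, key: string): string | null {
  if (prefix === '' || key === '') return null  // empty parts → no lookup
  return `${prefix}#${key}`
}

cacheKey('users', 'user-123')  // 'users#user-123'
cacheKey('', 'user-123')       // null
```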

The fresh flag

Setting fresh: true forces a cache bypass — the fallback function is called regardless of whether a cached value exists. The fresh result is then cached, replacing any stale entry. In a multi-tier chain, fresh propagates through every tier (see "Multi-tier fallback chains").

// Normal read — returns cached value if available.
const user = await cache.tryGet({
  prefix: 'users',
  key: 'user-123',
  fallback: async () => await db.users.findById('user-123'),
  durationSeconds: 300
})

// Force a fresh read — always calls the database, then caches the result.
const freshUser = await cache.tryGet({
  prefix: 'users',
  key: 'user-123',
  fallback: async () => await db.users.findById('user-123'),
  durationSeconds: 300,
  fresh: true
})

This is useful after a write operation when you know the cached value is stale, or when a user explicitly requests the latest data.

Additional methods

These methods are available on all cache tiers. You typically don't need them — tryGet/forceGet handle most use cases — but they're available for cache management and debugging.

| Method | Signature | Description |
|---|---|---|
| put | put(item: CacheItem): Promise<void> | Store an item directly. Propagates through the entire fallback chain. |
| get | get(prefix: string, key: string): Promise<CacheItemResponse \| null> | Retrieve a raw cache entry. Falls back through the chain but does not promote values upward. |
| clearItem | clearItem(prefix: string, key: string): Promise<void> | Remove a single cache entry. Propagates through the entire fallback chain. |
| clearAll | clearAll(): Promise<void> | Remove all entries. Propagates through the entire fallback chain. |
| reset | reset(): Promise<void> | Clear all entries and reset hit counters to zero. Cascades through the chain. |
| enable | enable(): void | Enable caching (default state). |
| disable | disable(): void | Disable caching. The chain still delegates through a disabled tier, but the disabled tier does not read, write, or receive promotions. |
| getHits | getHits(prefix: string, key: string): number | Return the number of cache hits for a specific key. Per-tier — only counts hits from this tier's own store. |
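The per-tier hit-tracking semantics can be modeled in a few lines (a simplified sketch of the behavior described above, not the library's internals):

```typescript
// Simplified model of per-key hit tracking: each tier counts only the hits
// served from its own store, keyed by the same `${prefix}#${key}` format.
class HitCounter {
  private hits = new Map<string, number>()

  record(prefix: string, key: string): void {
    const k = `${prefix}#${key}`
    this.hits.set(k, (this.hits.get(k) ?? 0) + 1)
  }

  getHits(prefix: string, key: string): number {
    return this.hits.get(`${prefix}#${key}`) ?? 0
  }
}

const counter = new HitCounter()
counter.record('users', 'user-123')
counter.record('users', 'user-123')
counter.getHits('users', 'user-123')  // 2
counter.getHits('users', 'missing')   // 0
```

In a chain, this is why the front tier's counter climbs fastest: once a value is promoted, deeper tiers stop seeing reads for that key.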

CacheItem

The raw shape accepted by put:

interface CacheItem {
  prefix: string
  key: string
  value: string        // Must be a JSON-serialized string.
  durationSeconds?: number
}

CacheItemResponse

The shape returned by get, extending CacheItem with metadata:

interface CacheItemResponse extends CacheItem {
  createdAt: Date      // When the entry was stored.
  ttlAt?: Date         // When the entry expires (undefined = no expiration).
}
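For illustration, ttlAt can be read as createdAt plus durationSeconds — a sketch of the relationship, assuming the store derives it this way; entries stored without a duration have no ttlAt:

```typescript
// Sketch: derive an expiry timestamp from creation time + TTL seconds.
function ttlAt(createdAt: Date, durationSeconds?: number): Date | undefined {
  if (durationSeconds === undefined) return undefined  // no expiration
  return new Date(createdAt.getTime() + durationSeconds * 1000)
}

ttlAt(new Date('2024-01-01T00:00:00Z'), 300)  // 2024-01-01T00:05:00Z
```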

Error handling

The library is designed to fail gracefully. Network errors in HotCache and ColdCache are caught and logged to stderr — they never crash your application.

| Scenario | Behavior |
|---|---|
| put fails (Redis down, DynamoDB error) | Error logged. No exception thrown. |
| get fails (network error) | Error logged. Returns null. Fallback is called. |
| Corrupt cache entry (invalid JSON) | Error logged. Entry cleared. Returns null. |
| Serialization of fallback result fails | Error logged. Result is still returned (just not cached). |
| forceGet result is null | Throws CacheError. This is the only case where an error is thrown. |
| Empty prefix or key | Warning logged. Returns null / 0 (no exception). |
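The fail-soft behavior in the table follows a pattern along these lines (a simplified sketch, not the library's internals): errors from the backing store are logged and surfaced as a cache miss rather than thrown.

```typescript
// Sketch: wrap a network read so failures are logged and become a cache
// miss (null) instead of an exception, letting the fallback take over.
async function safeRead<T>(read: () => Promise<T>): Promise<T | null> {
  try {
    return await read()
  } catch (err) {
    console.error('[cache] read failed, treating as miss:', err)
    return null
  }
}

await safeRead(async () => { throw new Error('ECONNREFUSED') })  // null
await safeRead(async () => 'cached-value')                       // 'cached-value'
```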

CacheError

A simple Error subclass. Thrown only by forceGet when the result is null.

import { CacheError } from '@callsy/cache'

try {
  const data = await cache.forceGet({ ... })
} catch (error) {
  if (error instanceof CacheError) {
    // forceGet returned null — handle accordingly.
  }
}

Exports

Everything is exported from the package entry point:

// Classes
import {
  CacheBase,    // Abstract base class — extend to build custom tiers.
  LocalCache,   // In-memory cache.
  HotCache,     // Redis cache.
  ColdCache,    // DynamoDB cache.
  CacheError    // Error thrown by forceGet.
} from '@callsy/cache'

// Types
import type {
  FallbackGetProps,     // Props for tryGet / forceGet.
  CacheItem,            // Raw cache item shape.
  CacheItemResponse     // Cache item with metadata (createdAt, ttlAt).
} from '@callsy/cache'

License

ISC