@utkarsh851/typed-cache

v0.1.0

A strongly typed in-memory LRU cache with TTL and async read-through loading

typed-cache

A strongly typed, in-memory LRU cache for TypeScript with TTL support and async read-through loading.

typed-cache is designed to solve a common problem in backend and frontend infrastructure code: efficiently caching data while preserving type safety, predictable eviction, and clean async workflows.


Why typed-cache?

Most JavaScript caching solutions suffer from at least one of the following:

  • weakly typed
  • difficult to reason about
  • missing TTL or eviction guarantees
  • cumbersome to integrate with async data fetching

typed-cache provides:

  • full TypeScript generics for keys and values
  • O(1) average-time operations
  • LRU (Least Recently Used) eviction
  • optional TTL (time-to-live)
  • async read-through caching
  • zero runtime dependencies

Features

  • Strongly typed keys and values
  • LRU eviction using insertion-order semantics
  • Optional global TTL for cache entries
  • Async read-through loading
  • Lazy expiration (no background timers)
  • Minimal and predictable API
  • Zero dependencies

Installation

npm install @utkarsh851/typed-cache

Basic Usage

import { TypedCache } from "@utkarsh851/typed-cache"

const cache = new TypedCache<string, number>()

cache.set("a", 1)
cache.set("b", 2)

console.log(cache.getSync("a")) // 1
console.log(cache.getSync("b")) // 2

Type Safety

The cache enforces types at compile time.

type User = {
  id: number
  name: string
}

const userCache = new TypedCache<string, User>()

userCache.set("user:1", { id: 1, name: "Alice" })

// Type error:
// userCache.set("user:2", 42)

LRU Eviction

You can limit the maximum number of entries. When the limit is exceeded, the least recently used entry is evicted.

const cache = new TypedCache<string, number>({ maxSize: 2 })

cache.set("a", 1)
cache.set("b", 2)
cache.getSync("a") // "a" becomes most recently used
cache.set("c", 3)  // evicts "b"

console.log(cache.getSync("b")) // undefined
console.log(cache.getSync("a")) // 1
console.log(cache.getSync("c")) // 3

TTL (Time-To-Live)

Entries can automatically expire after a fixed duration.

const cache = new TypedCache<string, number>({
  ttl: 1000 // 1 second
})

cache.set("x", 42)

setTimeout(() => {
  console.log(cache.getSync("x")) // undefined
}, 1500)

TTL is enforced lazily on access. There are no background timers or cleanup threads.
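The library's internals aren't shown in this README, but the lazy-expiration idea can be sketched roughly as follows. The names here (Entry, LazyTtlCache, expiresAt) are illustrative assumptions, not typed-cache's actual implementation:

```typescript
// Sketch of lazy TTL expiration: each entry records its expiry time, and
// reads check the clock instead of relying on background timers.
type Entry<V> = { value: V; expiresAt: number }

class LazyTtlCache<K, V> {
  private store = new Map<K, Entry<V>>()
  constructor(private ttl: number) {}

  set(key: K, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttl })
  }

  get(key: K): V | undefined {
    const entry = this.store.get(key)
    if (!entry) return undefined
    if (Date.now() > entry.expiresAt) {
      // Expired: remove on access rather than via a timer.
      this.store.delete(key)
      return undefined
    }
    return entry.value
  }
}
```

A consequence of this design is that expired entries linger in memory until the next read touches them, which is the trade-off lazy expiration makes for simplicity.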


Async Read-Through Caching

You can provide an async loader function to fetch data on cache misses.

const cache = new TypedCache<string, number>()

const value = await cache.get("answer", async () => {
  return 42
})

console.log(value) // 42
console.log(cache.getSync("answer")) // 42

This pattern:

  • avoids repeated fetch logic
  • centralizes caching behavior
  • keeps calling code clean
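As a rough illustration of the read-through idea, here is the same pattern built over a plain Map rather than typed-cache itself. The `readThrough` helper is a stand-in for the logic the library bundles into get():

```typescript
// Minimal read-through helper: the loader runs only when the key is absent.
async function readThrough<K, V>(
  store: Map<K, V>,
  key: K,
  loader: () => Promise<V>
): Promise<V> {
  const hit = store.get(key)
  if (hit !== undefined) return hit // cached: skip the loader entirely
  const value = await loader()      // miss: fetch once...
  store.set(key, value)             // ...and remember the result
  return value
}

// The loader is invoked only on the first call for a given key.
async function demo(): Promise<number> {
  let loads = 0
  const store = new Map<string, number>()
  const load = async () => { loads++; return 42 }
  await readThrough(store, "answer", load)
  await readThrough(store, "answer", load)
  return loads // 1
}
```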

Cache-Only Reads

If no loader is provided, get behaves like a safe read.

const cache = new TypedCache<string, number>()

const value = await cache.get("missing")

console.log(value) // undefined

Utility Methods

Check for Existence

cache.has("key")

Returns true only if the entry exists and is not expired.


Delete a Key

cache.delete("key")

Clear the Cache

cache.clear()

Get Cache Size

cache.size()

Returns the number of currently stored entries.


API Reference

new TypedCache<K, V>(options?)

type CacheOptions = {
  maxSize?: number
  ttl?: number // milliseconds
}

set(key: K, value: V): void

Stores or updates a value in the cache.


getSync(key: K): V | undefined

Retrieves a value synchronously. Returns undefined if the key is missing or expired.


get(key: K, loader?: () => Promise<V>): Promise<V | undefined>

Retrieves a value asynchronously with optional read-through loading.


has(key: K): boolean

Checks whether a valid entry exists.


delete(key: K): boolean

Removes a specific entry.


clear(): void

Clears all entries.


size(): number

Returns the number of cached entries.


Design Decisions

  • Map-based storage: enables O(1) average-time operations and reliable LRU behavior.

  • Lazy expiration: avoids background timers and reduces complexity.

  • Explicit eviction: eviction occurs only on writes, ensuring predictable behavior.

  • Strong typing: prevents entire classes of runtime bugs.
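The Map-based LRU trick relies on the fact that JavaScript Maps iterate in insertion order: deleting and re-inserting a key on access moves it to the back, so the front of the Map is always the least recently used entry. A minimal sketch (MiniLru is illustrative, not the library's actual code):

```typescript
// LRU via Map insertion order: touch = delete + re-insert.
class MiniLru<K, V> {
  private store = new Map<K, V>()
  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.store.has(key)) return undefined
    const value = this.store.get(key)!
    this.store.delete(key)
    this.store.set(key, value) // re-insert => now most recently used
    return value
  }

  set(key: K, value: V): void {
    if (this.store.has(key)) this.store.delete(key)
    this.store.set(key, value)
    if (this.store.size > this.maxSize) {
      // First key in iteration order is the least recently used.
      const oldest = this.store.keys().next().value as K
      this.store.delete(oldest)
    }
  }
}

const lru = new MiniLru<string, number>(2)
lru.set("a", 1)
lru.set("b", 2)
lru.get("a")    // touch "a"
lru.set("c", 3) // evicts "b", the least recently used
```

This mirrors the eviction example shown earlier: after touching "a", inserting "c" evicts "b".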


Limitations

  • Cache is in-memory only
  • TTL is global (per-key TTL may be added later)
  • Values of undefined are treated as cache misses

These trade-offs are intentional to keep the API simple and predictable.
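The last limitation follows directly from the return type: because reads return V | undefined, a stored undefined is indistinguishable from an absent key. A plain-Map illustration:

```typescript
// A getter typed as V | undefined cannot tell "stored undefined" apart
// from "key absent": both read back as undefined.
const values = new Map<string, number | undefined>()
values.set("present", undefined)

function readLike(key: string): number | undefined {
  return values.get(key)
}

readLike("present") // undefined (stored value)
readLike("absent")  // undefined (genuine miss)

// If undefined must be cacheable, wrap values, e.g. { value: V | undefined }.
```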


When to Use

  • API response caching
  • Database query caching
  • Configuration or metadata caching
  • Frontend data caching
  • Lightweight backend services

License

MIT