@utkarsh851/typed-cache v0.1.0
# typed-cache

A strongly typed, in-memory LRU cache for TypeScript with TTL support and async read-through loading.

typed-cache is designed to solve a common problem in backend and frontend infrastructure code: caching data efficiently while preserving type safety, predictable eviction, and clean async workflows.
## Why typed-cache?
Most JavaScript caching solutions are either:
- weakly typed
- difficult to reason about
- missing TTL or eviction guarantees
- cumbersome to integrate with async data fetching
typed-cache provides:
- full TypeScript generics for keys and values
- O(1) average-time operations
- LRU (Least Recently Used) eviction
- optional TTL (time-to-live)
- async read-through caching
- zero runtime dependencies
## Features
- Strongly typed keys and values
- LRU eviction using insertion-order semantics
- Optional global TTL for cache entries
- Async read-through loading
- Lazy expiration (no background timers)
- Minimal and predictable API
- Zero dependencies
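The LRU and zero-dependency points rest on a well-known JavaScript property: a `Map` iterates its keys in insertion order. As a rough sketch of how LRU eviction can be built on that property (the `MiniLru` class below is a hypothetical illustration, not the typed-cache internals):

```typescript
// Minimal LRU sketch using Map insertion order (illustration only).
class MiniLru<K, V> {
  private store = new Map<K, V>()
  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.store.has(key)) return undefined
    const value = this.store.get(key)!
    // Delete and re-insert to mark the entry as most recently used.
    this.store.delete(key)
    this.store.set(key, value)
    return value
  }

  set(key: K, value: V): void {
    this.store.delete(key)
    this.store.set(key, value)
    if (this.store.size > this.maxSize) {
      // The first key in iteration order is the least recently used.
      const oldest = this.store.keys().next().value as K
      this.store.delete(oldest)
    }
  }
}
```

Both operations stay O(1) on average, since `Map` lookups, inserts, and deletes are constant time.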
## Installation

```sh
npm install @utkarsh851/typed-cache
```

## Basic Usage

```ts
import { TypedCache } from "@utkarsh851/typed-cache"

const cache = new TypedCache<string, number>()

cache.set("a", 1)
cache.set("b", 2)

console.log(cache.getSync("a")) // 1
console.log(cache.getSync("b")) // 2
```

## Type Safety
The cache enforces types at compile time.
```ts
type User = {
  id: number
  name: string
}

const userCache = new TypedCache<string, User>()

userCache.set("user:1", { id: 1, name: "Alice" })

// Type error:
// userCache.set("user:2", 42)
```

## LRU Eviction
You can limit the maximum number of entries. When the limit is exceeded, the least recently used entry is evicted.
```ts
const cache = new TypedCache<string, number>({ maxSize: 2 })

cache.set("a", 1)
cache.set("b", 2)

cache.getSync("a") // "a" becomes most recently used

cache.set("c", 3) // evicts "b"

console.log(cache.getSync("b")) // undefined
console.log(cache.getSync("a")) // 1
console.log(cache.getSync("c")) // 3
```

## TTL (Time-To-Live)
Entries can automatically expire after a fixed duration.
```ts
const cache = new TypedCache<string, number>({
  ttl: 1000 // 1 second
})

cache.set("x", 42)

setTimeout(() => {
  console.log(cache.getSync("x")) // undefined
}, 1500)
```

TTL is enforced lazily on access. There are no background timers or cleanup threads.
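One way to picture lazy expiration: each entry stores its own expiry timestamp, and the check happens only when the entry is read. The sketch below illustrates the idea; the `TtlMap` class and `Entry` shape are assumptions for illustration, not the typed-cache internals.

```typescript
// Lazy TTL sketch: no timers; expiry is checked on access.
type Entry<V> = { value: V; expiresAt: number }

class TtlMap<K, V> {
  private store = new Map<K, Entry<V>>()
  constructor(private ttl: number) {}

  set(key: K, value: V): void {
    // Record when this entry stops being valid.
    this.store.set(key, { value, expiresAt: Date.now() + this.ttl })
  }

  get(key: K): V | undefined {
    const entry = this.store.get(key)
    if (!entry) return undefined
    if (Date.now() >= entry.expiresAt) {
      // Expired: remove on access and report a miss.
      this.store.delete(key)
      return undefined
    }
    return entry.value
  }
}
```

The trade-off is that expired entries linger in memory until the next read touches them, which is what keeps the implementation free of background work.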
## Async Read-Through Caching
You can provide an async loader function to fetch data on cache misses.
```ts
const cache = new TypedCache<string, number>()

const value = await cache.get("answer", async () => {
  return 42
})

console.log(value) // 42
console.log(cache.getSync("answer")) // 42
```

This pattern:

- avoids repeated fetch logic
- centralizes caching behavior
- keeps calling code clean
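Conceptually, read-through `get` reduces to "return the cached value if present, otherwise run the loader and store its result". A minimal sketch of that flow (the `ReadThrough` class is hypothetical, not the library's implementation):

```typescript
// Read-through loading sketch on top of a plain Map.
class ReadThrough<K, V> {
  private store = new Map<K, V>()

  async get(key: K, loader?: () => Promise<V>): Promise<V | undefined> {
    // Cache hit: return the stored value without calling the loader.
    if (this.store.has(key)) return this.store.get(key)
    // No loader supplied: behave like a plain cache-only read.
    if (!loader) return undefined
    // Cache miss: load, store, and return the result.
    const value = await loader()
    if (value !== undefined) this.store.set(key, value)
    return value
  }
}
```

Note the `undefined` check before storing: treating `undefined` as "nothing to cache" matches the limitation noted below, where `undefined` values are indistinguishable from misses.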
## Cache-Only Reads

If no loader is provided, `get` behaves like a safe read.

```ts
const cache = new TypedCache<string, number>()

const value = await cache.get("missing")

console.log(value) // undefined
```

## Utility Methods
### Check for Existence

```ts
cache.has("key")
```

Returns true only if the entry exists and is not expired.

### Delete a Key

```ts
cache.delete("key")
```

### Clear the Cache

```ts
cache.clear()
```

### Get Cache Size

```ts
cache.size()
```

Returns the number of currently stored entries.
## API Reference

### new TypedCache<K, V>(options?)

```ts
type CacheOptions = {
  maxSize?: number
  ttl?: number // milliseconds
}
```

### set(key: K, value: V): void

Stores or updates a value in the cache.

### getSync(key: K): V | undefined

Retrieves a value synchronously. Returns undefined if the key is missing or expired.

### get(key: K, loader?): Promise<V | undefined>

Retrieves a value asynchronously, with optional read-through loading.

### has(key: K): boolean

Checks whether a valid entry exists.

### delete(key: K): boolean

Removes a specific entry.

### clear(): void

Clears all entries.

### size(): number

Returns the number of cached entries.
## Design Decisions

- **Map-based storage** enables O(1) operations and reliable LRU behavior.
- **Lazy expiration** avoids background timers and reduces complexity.
- **Explicit eviction** occurs only on writes, ensuring predictable behavior.
- **Strong typing** prevents entire classes of runtime bugs.
## Limitations

- Cache is in-memory only
- TTL is global (per-key TTL may be added later)
- Values of `undefined` are treated as cache misses

These trade-offs are intentional to keep the API simple and predictable.
## When to Use
- API response caching
- Database query caching
- Configuration or metadata caching
- Frontend data caching
- Lightweight backend services
## License
MIT
