runtime-memory-cache v0.4.0
Runtime Memory Cache

A lightweight, high-performance in-memory cache for Node.js with TTL support, automatic cleanup, memory usage tracking, and zero dependencies.

✨ Features

  • Fast O(1) lookups using native JavaScript Map
  • TTL (Time To Live) support with automatic expiration
  • Size limiting with FIFO or LRU eviction policy
  • Skip Touch functionality for LRU caches - check existence without affecting eviction order
  • Statistics tracking (optional)
  • Manual cleanup of expired entries
  • Memory usage tracking with getMemoryUsage()
  • Zero dependencies
  • TypeScript support with full type definitions
  • Memory efficient with automatic garbage collection

🧪 Test Coverage

This package includes comprehensive test coverage for all features, including edge cases, validation, eviction, TTL, statistics, and utility logic. Run npm test to verify all tests pass.

📦 Installation

npm install runtime-memory-cache

🚀 Quick Start

import RuntimeMemoryCache from 'runtime-memory-cache';

// Create cache with options
const cache = new RuntimeMemoryCache({
  ttl: 60000,           // 1 minute default TTL
  maxSize: 1000,        // Maximum 1000 entries
  enableStats: true,    // Enable statistics tracking
  evictionPolicy: 'LRU' // Use LRU eviction policy
});

// Store data
cache.set('user:123', { name: 'John', age: 30 });

// Retrieve data
const user = cache.get('user:123');
console.log(user); // { name: 'John', age: 30 }

// Check if key exists
if (cache.has('user:123')) {
  console.log('User exists!');
}

📚 API Reference

Constructor Options

interface CacheOptions {
  ttl?: number;        // Default TTL in milliseconds
  maxSize?: number;    // Maximum cache entries (default: 1000)
  enableStats?: boolean; // Enable statistics tracking (default: false)
  evictionPolicy?: 'FIFO' | 'LRU'; // Eviction policy (default: 'FIFO')
}

Methods

set(key: string, value: any, ttl?: number): void

Store a value with optional TTL override.

cache.set('key', 'value');                    // Uses default TTL
cache.set('key', 'value', 30000);            // 30 second TTL
cache.set('key', 'value', undefined);        // No expiration

get(key: string): any

Retrieve a value. Returns undefined if the key doesn't exist or has expired.

const value = cache.get('key');

has(key: string, skipTouch?: boolean): boolean

Check if key exists and is not expired. Optionally skip updating access time.

if (cache.has('key')) {
  // Key exists and is valid (updates access time for LRU)
}

// Skip updating access time (useful for LRU caches)
if (cache.has('key', true)) {
  // Key exists but LRU order is not affected
}

Parameters:

  • key: The cache key to check
  • skipTouch: Optional. When true, skips updating access time and LRU order. Default: false

del(key: string): boolean

Delete a specific key. Returns true if the key existed.

const wasDeleted = cache.del('key');

size(): number

Get the current number of entries in the cache.

console.log(cache.size()); // 42

clear(): void

Remove all entries from the cache.

cache.clear();

keys(): string[]

Get an array of all keys in the cache.

const allKeys = cache.keys();

cleanup(): number

Manually remove expired entries. Returns the number of entries removed.

const removedCount = cache.cleanup();
console.log(`Removed ${removedCount} expired entries`);

getStats(): CacheStats | null

Get cache statistics (if enabled).

const stats = cache.getStats();
if (stats) {
  console.log(`Hits: ${stats.hits}, Misses: ${stats.misses}`);
}

resetStats(): void

Reset statistics counters (if enabled).

getEvictionPolicy(): 'FIFO' | 'LRU'

Get the current eviction policy being used.

console.log(cache.getEvictionPolicy()); // 'FIFO' or 'LRU'

getMemoryUsage(): { estimatedBytes: number; averageBytesPerEntry: number }

Get estimated memory usage of the cache.

const memInfo = cache.getMemoryUsage();
console.log(`Cache uses ~${memInfo.estimatedBytes} bytes`);
console.log(`Average: ${memInfo.averageBytesPerEntry} bytes per entry`);
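As a rough illustration of what such an estimate can look like, here is a hypothetical byte-count helper in the spirit of getMemoryUsage(); it is not the library's actual accounting:

```typescript
// Hypothetical size estimator (not the library's real implementation):
// strings are ~2 bytes per UTF-16 code unit; other values are
// approximated by the length of their JSON serialization.
function estimateBytes(value: unknown): number {
  const s = typeof value === 'string' ? value : JSON.stringify(value);
  return s.length * 2;
}

console.log(estimateBytes('abcd'));   // 8
console.log(estimateBytes({ a: 1 })); // 14 ('{"a":1}' is 7 characters)
```

Estimates like this are necessarily approximate: JavaScript engines add per-object overhead that no user-level heuristic can observe directly.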

📊 Statistics

When enableStats: true, you can track cache performance:

interface CacheStats {
  hits: number;        // Cache hits
  misses: number;      // Cache misses  
  size: number;        // Current cache size
  maxSize: number;     // Maximum allowed size
  evictions: number;   // Number of evicted entries
}

interface MemoryUsage {          // Memory usage tracking
  estimatedBytes: number;        // Total estimated bytes
  averageBytesPerEntry: number;  // Average bytes per entry
}
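A common derived metric is the hit rate. A small helper can be sketched over the CacheStats shape above (the interface is repeated here so the snippet is self-contained; hitRate is a hypothetical helper, not a library export):

```typescript
// Same shape as the CacheStats interface documented above
interface CacheStats {
  hits: number;
  misses: number;
  size: number;
  maxSize: number;
  evictions: number;
}

// Fraction of lookups served from the cache; 0 when no lookups yet
function hitRate(stats: CacheStats): number {
  const total = stats.hits + stats.misses;
  return total === 0 ? 0 : stats.hits / total;
}

console.log(hitRate({ hits: 75, misses: 25, size: 10, maxSize: 100, evictions: 2 })); // 0.75
```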

🔧 Usage Examples

API Response Caching

const apiCache = new RuntimeMemoryCache({ 
  ttl: 300000,    // 5 minutes
  maxSize: 1000,
  enableStats: true 
});

async function fetchUser(userId: string) {
  const cacheKey = `user:${userId}`;
  
  // Try cache first; compare against undefined so cached falsy values still hit
  let user = apiCache.get(cacheKey);
  if (user !== undefined) return user;
  
  // Fetch from API and cache
  user = await fetch(`/api/users/${userId}`).then(r => r.json());
  apiCache.set(cacheKey, user);
  
  return user;
}

Database Query Caching

const dbCache = new RuntimeMemoryCache({ ttl: 120000 }); // 2 minutes

async function getProductById(id: string) {
  const cacheKey = `product:${id}`;
  
  let product = dbCache.get(cacheKey);
  if (product !== undefined) return product;
  
  product = await db.query('SELECT * FROM products WHERE id = ?', [id]);
  dbCache.set(cacheKey, product);
  
  return product;
}

Session Management

const sessionCache = new RuntimeMemoryCache({ 
  ttl: 1800000,  // 30 minutes
  maxSize: 10000 
});

function createSession(userId: string, data: any): string {
  const sessionId = generateId();
  sessionCache.set(`session:${sessionId}`, { userId, ...data });
  return sessionId;
}

function getSession(sessionId: string) {
  return sessionCache.get(`session:${sessionId}`);
}

Rate Limiting

const rateLimiter = new RuntimeMemoryCache({ ttl: 60000 }); // 1 minute window

function checkRateLimit(clientId: string, maxRequests: number = 100): boolean {
  const key = `rate:${clientId}`;
  const current = rateLimiter.get(key) || 0;
  
  if (current >= maxRequests) {
    return false; // Rate limit exceeded
  }
  
  rateLimiter.set(key, current + 1);
  return true;
}
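One caveat with the pattern above: if set() refreshes an entry's TTL each time it is called, the one-minute window effectively slides with every request. A fixed window can be sketched without the library by keying counters on the window index (a hypothetical helper, not part of runtime-memory-cache):

```typescript
// Fixed-window counter: requests are bucketed by floor(now / windowMs),
// so a new window always starts at a fixed boundary.
function makeFixedWindowLimiter(maxRequests: number, windowMs: number) {
  const counts = new Map<string, number>();
  return (clientId: string, now: number): boolean => {
    const key = `${clientId}:${Math.floor(now / windowMs)}`;
    const current = counts.get(key) ?? 0;
    if (current >= maxRequests) return false; // limit hit in this window
    counts.set(key, current + 1);
    return true;
  };
}

const allow = makeFixedWindowLimiter(2, 60_000);
console.log(allow('a', 0));      // true
console.log(allow('a', 1_000));  // true
console.log(allow('a', 2_000));  // false (third request in the same window)
console.log(allow('a', 61_000)); // true  (new window)
```

A production version would also evict stale window buckets, which is exactly the kind of TTL bookkeeping the cache handles for you.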

Skip Touch Feature with LRU Cache

const lruCache = new RuntimeMemoryCache({ 
  maxSize: 3,
  evictionPolicy: 'LRU',
  enableStats: true 
});

// Fill cache to capacity
lruCache.set('user:1', 'Alice');
lruCache.set('user:2', 'Bob');
lruCache.set('user:3', 'Charlie');

// Check existence without affecting LRU order
if (lruCache.has('user:1', true)) {
  console.log('User 1 exists, but LRU order unchanged');
}

// Normal check that updates LRU order
if (lruCache.has('user:2')) {
  console.log('User 2 exists and moved to most recent');
}

// Add new user - 'user:1' will be evicted (least recently used)
// because it wasn't "touched" by the skipTouch check
lruCache.set('user:4', 'David');

console.log(lruCache.has('user:1')); // false - evicted
console.log(lruCache.has('user:2')); // true - recently accessed

🏗️ Architecture

The cache is built with a modular architecture:

  • types.ts - TypeScript interfaces and type definitions
  • utils.ts - Utility functions for cache operations
  • stats.ts - Statistics tracking functionality
  • index.ts - Main cache implementation

This structure makes the code maintainable, testable, and easy to extend.

🔄 Eviction Policies

When the cache reaches maxSize, it automatically removes entries based on the configured eviction policy:

FIFO (First In, First Out) - Default

  • Removes the oldest inserted entry first
  • Simple and predictable behavior
  • Good for time-based caching scenarios
  • Tracks access time for consistency (but doesn't use it for eviction)

LRU (Least Recently Used)

  • Removes the entry that hasn't been accessed for the longest time
  • Better cache hit rates for access-pattern-based scenarios
  • Uses access time tracking for eviction decisions
  • On get(), has(), and set() (for existing keys), the accessed key is moved to the most recently used position via a delete-and-reinsert in the underlying Map, preserving createdAt and expiresAt while updating lastAccessedAt
  • Skip Touch Feature: Use has(key, true) to check existence without affecting LRU order

// FIFO Cache (default)
const fifoCache = new RuntimeMemoryCache({ 
  maxSize: 100, 
  evictionPolicy: 'FIFO' 
});

// LRU Cache  
const lruCache = new RuntimeMemoryCache({ 
  maxSize: 100, 
  evictionPolicy: 'LRU' 
});

Both policies:

  1. Automatically remove entries when cache is full
  2. Track eviction count in statistics
  3. Maintain O(1) average performance
  4. Update access time on get(), has(), and set() (for existing keys) operations for consistency
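The delete-and-reinsert trick behind the LRU policy can be illustrated with a plain JavaScript Map, which iterates its keys in insertion order:

```typescript
// A Map iterates in insertion order, so re-inserting a key on access
// moves it to the most-recently-used end; delete and set are both O(1).
const order = new Map<string, number>();
order.set('a', 1);
order.set('b', 2);
order.set('c', 3);

// "Touch" key 'a': delete and reinsert to mark it most recently used
const value = order.get('a')!;
order.delete('a');
order.set('a', value);

// The least recently used key is now first in iteration order
console.log(order.keys().next().value); // 'b'
console.log([...order.keys()]);         // ['b', 'c', 'a']
```

Evicting is then just removing the first key the iterator yields, which is how a Map-backed LRU keeps both lookups and eviction at O(1) average cost.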

⚡ Performance

  • O(1) average case for get, set, has, and delete operations
  • Memory efficient with automatic cleanup of expired entries
  • Zero dependencies - no external libraries
  • TypeScript optimized with proper type inference

📄 License

MIT

🤝 Contributing

Contributions welcome! Please read our contributing guidelines and submit pull requests to our repository.

📞 Support

  • GitHub Issues: Create an issue
  • Documentation: This README
  • Examples: See src/playground.ts