newton-cache
v1.3.0
newton-cache
Lightweight async cache library with pluggable adapters. Zero dependencies, TTL support, TypeScript-first. Ships as an ES module with complete type definitions. All cache operations return Promises.
Install
npm install newton-cache
Usage
Choosing an Adapter
All adapters implement the same CacheAdapter interface. Currently available:
- FileCache - Persistent file-based storage (survives restarts)
- FlatFileCache - Persistent single-file storage (all keys in one JSON file)
- MemoryCache - Fast in-memory storage (data lost on restart)
FileCache vs FlatFileCache Comparison
| Feature | FileCache | FlatFileCache |
|---------|-----------|---------------|
| Storage | One file per key | Single JSON file |
| Best for | Large caches (>1000 keys) | Small-medium caches (<1000 keys) |
| Write performance | Fast (only affected key) | Slower (rewrites entire cache) |
| Read performance | O(1) file read | O(1) after initial load |
| Memory usage | Minimal | Full cache loaded in memory |
| Backup/restore | Copy directory | Copy single file ✨ |
| Inode usage | One per key | Just one file ✨ |
| Inspection | ls cache directory | Read JSON file directly ✨ |
| Persistence | Survives restarts ✅ | Survives restarts ✅ |
Use FileCache when: You have many keys (>1000), high write frequency, or large values per key.
Use FlatFileCache when: You want easy backup (single file), minimal inode usage, or simple inspection of all cached data.
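Because every adapter exposes the same surface, application code can be written against a structural type and accept whichever adapter fits the deployment. A minimal sketch — the interface subset below is inferred from the examples in this README, not copied from the library's actual CacheAdapter declaration:

```typescript
// Illustrative only: a structural subset of the adapter surface, inferred
// from this README's examples. The real CacheAdapter interface may differ.
interface KeyValueCache<T> {
  get(key: string): Promise<T | undefined>;
  put(key: string, value: T, ttlSeconds?: number): Promise<void>;
}

// Works with FileCache, FlatFileCache, or MemoryCache alike,
// since all of them satisfy this shape.
async function cachedGreeting(cache: KeyValueCache<string>, name: string): Promise<string> {
  const hit = await cache.get(name);
  if (hit !== undefined) return hit;
  const greeting = `hello, ${name}`;
  await cache.put(name, greeting, 60); // 60-second TTL
  return greeting;
}
```

Swapping FlatFileCache in for FileCache is then a one-line change at the construction site; nothing downstream needs to know which adapter it received.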
Initializing
FileCache (persistent, survives restarts):
import { FileCache } from 'newton-cache';
// Stores files in the OS tmp directory by default
const cache = new FileCache<string>();
// Or provide your own directory:
const cache = new FileCache({ cachePath: "/var/tmp/my-cache" });
FlatFileCache (single file, survives restarts):
import { FlatFileCache } from 'newton-cache';
// Stores all entries in a single JSON file in the OS tmp directory
const cache = new FlatFileCache<string>();
// Or provide your own file path:
const cache = new FlatFileCache({
filePath: "/var/cache/my-app.json"
});
MemoryCache (fast, in-memory only):
import { MemoryCache } from 'newton-cache';
// Stores data in memory
const cache = new MemoryCache<string>();
All adapters implement the same interface, so you can easily switch between them.
Getting items from the cache
// If a file named "answer" exists in the cache directory, read it:
const value = await cache.get('answer'); // parsed value, or undefined if missing
// Provide a default if the file doesn't exist or is unreadable:
const fallback = await cache.get('missing-key', 'default');
// Or pass a factory/closure so the default is only computed when needed:
const fromFactory = await cache.get('missing', () => expensiveLookup());
// You can also pass the function reference directly:
const directFactory = await cache.get('missing', expensiveLookup);
// Inline anonymous factory
const twoLine = await cache.get('computed', () => {
const value = expensiveLookup();
return value;
});
Checking existence
// Returns true when the entry exists, is not expired, and holds a defined value.
if (await cache.has('answer')) {
// ...
}
Storing items
// Store with a 10-second TTL:
await cache.put('key', 'value', 10);
// Store indefinitely (no TTL):
await cache.put('key', 'value');
// Store permanently (alias for put without TTL):
await cache.forever('key', 'value');
Store only when missing
// Add only when missing; returns true if stored:
const added = await cache.add('key', 'value', 10);
Deleting items
// Remove and return whether it existed:
const removed = await cache.forget('key');
// Clear all cached entries:
await cache.flush();
Special (compose read + write)
// Retrieve or compute and store for 60 seconds (TTL is in seconds):
const users = await cache.remember('users', 60, () => fetchUsers());
// Store forever when missing:
const usersAlways = await cache.rememberForever('users', () => fetchUsers());
// Retrieve and remove the cached value. Returns undefined when missing.
const pulled = await cache.pull('answer');
// Provide a static default:
const staticDefault = await cache.pull('missing', 'default');
// Provide a default (or factory) when missing:
const fallback = await cache.pull('missing', () => expensiveLookup());
With remember, if the entry is missing or expired, the factory runs and the result is written to disk; otherwise the cached value is returned. pull removes the entry after reading.
Introspection
// Get all cache keys
const keys = await cache.keys(); // ['user:1', 'user:2', 'session:abc']
// Count cached items
const count = await cache.count(); // 3
// Get total cache size in bytes
const bytes = await cache.size(); // 1024
console.log(`Cache size: ${(bytes / 1024).toFixed(2)} KB`);
Cleanup
// Remove expired entries (keeps valid ones)
const removed = await cache.prune();
console.log(`Removed ${removed} expired entries`);
// Clear everything (removes all entries)
await cache.flush();
TTL management
// Get remaining time-to-live in seconds
await cache.put('session', data, 3600); // 1 hour
const ttl = await cache.ttl('session'); // e.g., 3599
// Extend TTL of existing entry
await cache.touch('session', 7200); // Extend to 2 hours from now
// Remove expiration
await cache.touch('session', Number.POSITIVE_INFINITY);
Atomic counters
// Increment counters
await cache.increment('page-views'); // 1
await cache.increment('page-views'); // 2
await cache.increment('page-views', 10); // 12
// Decrement counters
await cache.put('credits', 100);
await cache.decrement('credits'); // 99
await cache.decrement('credits', 20); // 79
// Use together
await cache.increment('balance', 50); // 50
await cache.decrement('balance', 10); // 40
Batch operations
Process multiple cache keys in a single operation:
// Store multiple key-value pairs at once
await cache.putMany({
'user:1': { name: 'Alice', role: 'admin' },
'user:2': { name: 'Bob', role: 'user' },
'user:3': { name: 'Charlie', role: 'user' },
}, 3600); // Optional TTL applies to all
// Retrieve multiple values
const users = await cache.getMany(['user:1', 'user:2', 'user:3']);
// Returns: { 'user:1': {...}, 'user:2': {...}, 'user:3': {...} }
// Missing keys return undefined
const partial = await cache.getMany(['user:1', 'missing', 'user:3']);
// Returns: { 'user:1': {...}, 'missing': undefined, 'user:3': {...} }
// Remove multiple keys
const removed = await cache.forgetMany(['user:1', 'user:2']);
// Returns: 2 (number of keys actually removed)
Batch operations are ideal for:
- Bulk user data loading
- Multi-key cache warming
- Batch invalidation
- Reducing I/O overhead when working with multiple keys
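As the performance notes below mention, batch operations delegate to the individual single-key operations. A library-free sketch of that pattern — the helper names are illustrative, and a plain Map stands in for a cache adapter:

```typescript
// Illustrative sketch of batch helpers delegating to single-key operations.
async function putMany<T>(store: Map<string, T>, entries: Record<string, T>): Promise<void> {
  for (const [key, value] of Object.entries(entries)) {
    store.set(key, value); // one write per key -> linear in key count
  }
}

async function getMany<T>(store: Map<string, T>, keys: string[]): Promise<Record<string, T | undefined>> {
  const result: Record<string, T | undefined> = {};
  for (const key of keys) {
    result[key] = store.get(key); // missing keys come back as undefined
  }
  return result;
}
```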
Real-world Examples
Rate Limiting
const cache = new FileCache<number>();
async function checkRateLimit(userId: string): Promise<boolean> {
const key = `rate-limit:${userId}`;
const requests = await cache.get(key, 0);
if (requests >= 100) {
return false; // Rate limit exceeded
}
await cache.increment(key);
await cache.touch(key, 3600); // sliding 1-hour window: TTL resets on each allowed request
return true;
}
API Response Caching
const cache = new FileCache<APIResponse>();
async function fetchUserProfile(userId: string) {
return await cache.remember(`user:${userId}`, 300, async () => {
const response = await fetch(`/api/users/${userId}`);
return response.json();
});
}
Session Storage
const sessions = new FileCache<SessionData>();
async function createSession(userId: string, data: SessionData) {
const sessionId = generateId();
await sessions.put(sessionId, data, 86400); // 24 hours
return sessionId;
}
async function extendSession(sessionId: string) {
await sessions.touch(sessionId, 86400); // Extend by 24 hours
}
Feature Flags
const flags = new FileCache<boolean>();
async function isFeatureEnabled(feature: string): Promise<boolean> {
return await flags.remember(feature, 60, () => {
// Fetch from remote config service
return fetchFeatureFlag(feature);
});
}
Job Queue Deduplication
const jobs = new FileCache<unknown>();
async function enqueueJob(jobId: string, payload: unknown) {
const added = await jobs.add(`job:${jobId}`, payload, 3600);
if (!added) {
console.log('Job already queued');
return false;
}
return true;
}
Performance Characteristics
Read Performance
- Cache hit: ~0.5-2ms (includes file I/O, JSON parsing, TTL check)
- Cache miss: ~0.1-0.5ms (file existence check only)
- Long keys (>200 chars) have no performance penalty (hashed)
Write Performance
- Single write: ~1-3ms (JSON serialization + file write)
- Atomic counters: ~2-4ms (read + increment + write)
- Batch operations: Linear with key count (delegates to individual operations)
Scalability
- Sweet spot: 1-10,000 entries
- Memory footprint: Minimal (only metadata in memory, values on disk)
- Disk usage: ~100-500 bytes per entry (depends on value size)
Trade-offs
- Slower than in-memory caches (Redis, node-cache) but survives restarts
- Faster than databases for simple key-value operations
- Thread-safe reads but not atomic across processes (see limitations)
- has() reads the full file to check expiration (accurate but slower)
Limitations
Thread Safety
⚠️ Not thread-safe across multiple processes
- Safe for single-process applications
- Not suitable for multi-process/cluster mode without external locking
- Race conditions possible with concurrent writes to the same key
// ❌ Unsafe in cluster mode
cluster.fork();
await cache.increment('counter'); // Race condition!
// ✅ Safe in single process
await cache.increment('counter');
Filesystem Dependencies
- Requires write access to cache directory
- Not suitable for serverless with read-only filesystems (use /tmp with caution)
- Disk I/O adds latency compared to in-memory caches
- No ACID guarantees (writes are not transactional)
Platform-Specific
- File path limits: Keys >200 chars are hashed (irreversible)
- Case sensitivity: Depends on filesystem (HFS+ vs ext4)
- Permissions: Ensure cache directory is writable
Error Handling
- Silent failures by design (returns defaults instead of throwing)
- No error callbacks or logging hooks
- Corrupted cache files are silently removed during operations
Not Suitable For
- ❌ High-frequency writes (>10K ops/sec)
- ❌ Large values (>1MB per entry)
- ❌ Distributed systems without coordination
- ❌ Critical data requiring persistence guarantees
Best Suited For
- ✅ API response caching
- ✅ Session storage
- ✅ Rate limiting
- ✅ Feature flags
- ✅ Temporary computations
- ✅ Single-server applications
How it works
FileCache
- Files are stored under a cache directory (<os tmp>/node-cache by default)
- Keys are URL-encoded to form the filename (e.g., key answer → /tmp/node-cache/answer)
- Very long keys (>200 chars) are SHA-256 hashed to avoid filesystem limits
- Each file stores a JSON payload: { "value": <data>, "expiresAt": <timestamp|undefined>, "key": "<original>" }
- get reads and JSON-parses the file, deleting expired entries automatically
- Writes use atomic temp file + rename for data safety
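The key-to-filename scheme can be sketched with Node's standard library. This is an illustration of the rules described above, not the package's actual code; the function name and the exact handling at the 200-character threshold are assumptions:

```typescript
import { createHash } from 'node:crypto';

// Illustrative: derive a filesystem-safe filename from a cache key.
const MAX_KEY_LENGTH = 200; // threshold quoted in this section

function cacheFileName(key: string): string {
  if (key.length > MAX_KEY_LENGTH) {
    // SHA-256 keeps the name short and safe, but is irreversible:
    // the original key cannot be recovered from the filename.
    return createHash('sha256').update(key).digest('hex');
  }
  return encodeURIComponent(key); // e.g. "user:1" -> "user%3A1"
}
```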
FlatFileCache
- All entries stored in a single JSON file (<os tmp>/newton-cache.json by default)
- File format: { "key1": { "value": <data>, "expiresAt": <timestamp> }, "key2": {...} }
- Lazy loading: cache loaded into memory on first access
- Every write operation (put, increment, forget) rewrites the entire file
- Expired entries auto-pruned during load and on access
- Writes use atomic temp file + rename for data safety
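The "atomic temp file + rename" step both file adapters rely on looks roughly like this. A sketch with illustrative names, not the library's source:

```typescript
import { promises as fs } from 'node:fs';

// Illustrative: write JSON to a temp file, then rename it over the target.
// rename() is atomic on POSIX filesystems when both paths are on the same
// device, so readers never observe a half-written file.
async function atomicWriteJson(filePath: string, data: unknown): Promise<void> {
  const tmpPath = `${filePath}.${process.pid}.tmp`;
  await fs.writeFile(tmpPath, JSON.stringify(data), 'utf8');
  await fs.rename(tmpPath, filePath);
}
```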
MemoryCache
- Data stored in a JavaScript Map (in-memory only, lost on restart)
- No filesystem I/O, fastest performance
- Expired entries removed lazily on access or during prune()
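Lazy expiry can be sketched as follows; the entry shape and names are illustrative, and the real MemoryCache internals may differ:

```typescript
// Illustrative: an entry carries an optional expiresAt timestamp (ms epoch);
// a read past that time deletes the entry and reports a miss.
interface Entry<T> { value: T; expiresAt?: number; }

function lazyGet<T>(store: Map<string, Entry<T>>, key: string): T | undefined {
  const entry = store.get(key);
  if (entry === undefined) return undefined;
  if (entry.expiresAt !== undefined && Date.now() >= entry.expiresAt) {
    store.delete(key); // expired entries are removed lazily, on access
    return undefined;
  }
  return entry.value;
}
```

The trade-off of lazy expiry is that expired entries linger in memory until they are read again, which is why a periodic prune() is still useful.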
All Adapters
- get returns undefined or the caller-provided default when the entry is missing, invalid, or expired
- has returns true only when the entry exists, is not expired, and has a defined value
- remember writes an expiresAt timestamp when a TTL is provided, and omits it for forever
- prune() removes only expired entries; flush() removes everything
- Counters (increment/decrement) are atomic within a single process and preserve the existing TTL
Scripts
- npm run build — compile TypeScript to dist/
- npm test — build, then run Node's built-in test runner against the compiled output
- npm run clean — remove build artifacts
License
MIT
