
@jeevanms003/cacheflow v1.0.2 – 296 downloads

CacheFlow

Production-ready intelligent caching library with Redis support, adaptive TTL, predictive preloading, and real-time analytics dashboard.


Features

  • Smart Learning: Automatically adapts TTL based on access patterns
  • Predictive Preloading: Learns correlations and preloads related data
  • Real-time Dashboard: Beautiful web UI with live metrics and analytics
  • Redis Support: Production-ready distributed caching with automatic fallback
  • Type-Safe: Full TypeScript support with comprehensive type definitions
  • High Performance: Optimized for speed with minimal overhead
  • Async/Await: Modern promise-based API
  • Analytics: Track hit rates, cost savings, and performance metrics

Installation

npm install @jeevanms003/cacheflow

For Redis support:

npm install @jeevanms003/cacheflow ioredis

For the dashboard:

npm install @jeevanms003/cacheflow express socket.io

Quick Start

import { CacheFlow } from '@jeevanms003/cacheflow';

const cache = new CacheFlow({
  defaultTTL: 60000, // 1 minute
  maxSize: 1000
});

// Wrap any async function
const getUser = cache.wrap('getUser', async (id: string) => {
  // Expensive database call
  return await db.users.findById(id);
});

// First call - cache miss (slow)
await getUser('123'); // Fetches from DB

// Second call - cache hit (fast!)
await getUser('123'); // Returns from cache

Configuration

Basic Configuration

const cache = new CacheFlow({
  // Cache Settings
  defaultTTL: 60000,        // Default time-to-live (ms)
  maxSize: 1000,            // Max cache entries
  enableLogging: false,     // Console logging
  
  // Learning Features
  enableLearning: true,     // Adaptive TTL
  minSampleSize: 20,        // Samples before adapting
  
  // Prediction Features
  enablePrediction: true,   // Predictive preloading
  correlationThreshold: 0.7, // Min correlation to preload
  maxPreload: 3,            // Max items to preload
  
  // Storage Backend
  storage: 'memory',        // 'memory' or 'redis'
  
  // Dashboard
  enableDashboard: true,    // Launch web dashboard
  dashboardPort: 3000       // Dashboard port
});

Redis Configuration

const cache = new CacheFlow({
  storage: 'redis',
  redisConfig: {
    host: 'localhost',
    port: 6379,
    password: 'your-password',
    keyPrefix: 'myapp:',
    db: 0
  },
  fallbackToMemory: true // Graceful degradation
});

// Don't forget to disconnect
await cache.disconnect();

Usage Examples

With Express.js

import express from 'express';
import { CacheFlow } from '@jeevanms003/cacheflow';

const app = express();
const cache = new CacheFlow({ enableDashboard: true });

const getProduct = cache.wrap('getProduct', async (id: string) => {
  return await db.products.findById(id);
});

app.get('/product/:id', async (req, res) => {
  const product = await getProduct(req.params.id);
  res.json(product);
});

app.listen(8080);
// Dashboard available at http://localhost:3000

Manual Cache Operations

// Set
await cache.set('key', { data: 'value' }, 30000);

// Get
const value = await cache.get('key');

// Delete
await cache.delete('key');

// Clear all
await cache.clear();

// Check existence
const exists = await cache.has('key');

Statistics

const stats = cache.getStats();
console.log(stats);
// {
//   hits: 150,
//   misses: 50,
//   sets: 50,
//   hitRate: 0.75,
//   preloads: 10
// }

Dashboard

The built-in dashboard provides real-time insights:

  • Overview: Hit rate, total requests, live charts
  • Patterns: Access patterns and optimal TTL analysis
  • Correlations: Network graph of related keys
  • Cost Savings: ROI calculator based on cache hits
  • Cache Keys: Manage and invalidate cache entries

Launching Dashboard

// Auto-start with config
const cache = new CacheFlow({ 
  enableDashboard: true,
  dashboardPort: 3000 
});

// Or start manually
cache.startDashboard(3000);

// Stop dashboard
cache.stopDashboard();

Open http://localhost:3000 to view the dashboard.

Smart Features

Adaptive TTL

CacheFlow learns access patterns and automatically adjusts TTL:

const cache = new CacheFlow({ 
  enableLearning: true,
  minSampleSize: 20 
});

const getData = cache.wrap('getData', fetchData);

// After 20+ accesses, TTL adapts based on:
// - Access frequency
// - Access intervals
// - Pattern consistency

TTL Strategy:

  • High frequency (< 10s intervals) → 1 hour TTL
  • Regular intervals → 1.5x average interval
  • Irregular access → Default TTL

Predictive Preloading

Learns which keys are accessed together and preloads related data in the background:

const cache = new CacheFlow({ 
  enablePrediction: true,
  correlationThreshold: 0.7 
});

const getUser = cache.wrap('getUser', fetchUser);
const getOrders = cache.wrap('getOrders', fetchOrders);

// After learning that getUser('123') → getOrders('123')
await getUser('123'); // Also preloads orders in background
await getOrders('123'); // Cache hit! (preloaded)

API Reference

CacheFlow Class

Methods

  • wrap<T>(name: string, fn: T, ttl?: number): T - Wrap async function
  • get<T>(key: string): Promise<T | undefined> - Get cached value
  • set<T>(key: string, value: T, ttl?: number): Promise<void> - Set cache value
  • delete(key: string): Promise<boolean> - Delete cache entry
  • clear(): Promise<void> - Clear all cache
  • has(key: string): Promise<boolean> - Check if key exists
  • getStats(): CacheStats - Get cache statistics
  • getPatterns(key: string) - Get access pattern for key
  • disconnect(): Promise<void> - Close connections (Redis, dashboard)
  • startDashboard(port?: number): void - Start dashboard server
  • stopDashboard(): void - Stop dashboard server

Types

interface CacheFlowConfig {
  defaultTTL?: number;
  maxSize?: number;
  enableLogging?: boolean;
  enableLearning?: boolean;
  minSampleSize?: number;
  enablePrediction?: boolean;
  correlationThreshold?: number;
  maxPreload?: number;
  storage?: 'memory' | 'redis';
  redisConfig?: RedisOptions & { keyPrefix?: string };
  fallbackToMemory?: boolean;
  enableDashboard?: boolean;
  dashboardPort?: number;
}

interface CacheStats {
  hits: number;
  misses: number;
  sets: number;
  hitRate: number;
  preloads?: number;
}

Architecture

┌─────────────────────────────────────────────┐
│               CacheFlow Core                │
├─────────────────────────────────────────────┤
│ ┌─────────────────┐  ┌────────────────────┐ │
│ │ PatternAnalyzer │  │ CorrelationTracker │ │
│ └─────────────────┘  └────────────────────┘ │
│ ┌─────────────────┐  ┌────────────────────┐ │
│ │ SmartTTLCalc    │  │ PredictionEngine   │ │
│ └─────────────────┘  └────────────────────┘ │
├─────────────────────────────────────────────┤
│         Storage Adapter (Interface)         │
├─────────────────────────────────────────────┤
│ ┌─────────────────┐  ┌────────────────────┐ │
│ │ MemoryStore     │  │ RedisStore         │ │
│ └─────────────────┘  └────────────────────┘ │
└─────────────────────────────────────────────┘

Testing

# Run all tests
npm test

# Run with coverage
npm run test:coverage

# Start Redis for integration tests
docker-compose up -d
npm test

Performance

Benchmarks on typical workloads:

| Operation        | Time (avg) | Throughput |
|------------------|------------|------------|
| Memory Get (hit) | ~0.01ms    | 100k ops/s |
| Memory Set       | ~0.02ms    | 50k ops/s  |
| Redis Get (hit)  | ~1ms       | 10k ops/s  |
| Redis Set        | ~1.5ms     | 7k ops/s   |
| Prediction       | ~0.05ms    | 20k ops/s  |

Troubleshooting

Redis Connection Issues

// Enable logging
const cache = new CacheFlow({
  storage: 'redis',
  enableLogging: true,
  fallbackToMemory: true // Ensures availability
});

Dashboard Not Loading

  • Check the port is not already in use: netstat -ano | findstr :3000 (Windows) or lsof -i :3000 (macOS/Linux)
  • Ensure express and socket.io are installed
  • Check firewall settings

High Memory Usage

// Limit cache size
const cache = new CacheFlow({
  maxSize: 500, // Reduce max entries
  defaultTTL: 30000 // Shorter TTL
});

Contributing

Contributions welcome! Please read CONTRIBUTING.md first.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing)
  5. Open a Pull Request

License

MIT © [Your Name]

Acknowledgments

  • Built with TypeScript
  • Powered by ioredis
  • Charts by Chart.js
  • Real-time updates via Socket.IO

Support