
@bernierllc/rate-limiter (v0.1.7)

High-performance rate limiting with multiple algorithms (token bucket, sliding window, fixed window, leaky bucket) and storage backends (memory, Redis, database) for Node.js applications.

Installation

npm install @bernierllc/rate-limiter

For Redis support, install the Redis client:

# Using node-redis
npm install redis

# Or using ioredis
npm install ioredis

Usage

Basic Rate Limiting

import { createRateLimiter } from '@bernierllc/rate-limiter';

// Create a rate limiter with default token bucket algorithm
const rateLimiter = createRateLimiter({
  algorithm: 'token-bucket',
  limit: 100,
  windowMs: 60000, // 1 minute
  storage: 'memory'
});

// Check if request is allowed
const result = await rateLimiter.check('user-123');

if (result.allowed) {
  console.log(`Request allowed. ${result.remaining} requests remaining.`);
} else {
  console.log(`Rate limit exceeded. Try again in ${result.reset - Date.now()}ms.`);
}

Express Middleware

import express from 'express';
import { createRateLimiter, createExpressMiddleware } from '@bernierllc/rate-limiter';

const app = express();

// Create rate limiter
const rateLimiter = createRateLimiter({
  algorithm: 'sliding-window',
  limit: 50,
  windowMs: 60000,
  storage: 'memory'
});

// Apply middleware
app.use('/api', createExpressMiddleware(rateLimiter, {
  keyGenerator: (req) => req.ip,
  skip: (req) => req.method === 'GET' && req.path === '/health',
  handler: (req, res) => {
    res.status(429).json({ error: 'Too many requests' });
  },
  contextBuilder: (req) => ({
    key: req.ip,
    path: req.path,
    method: req.method,
    userId: req.user?.id
  })
}));

Fastify Plugin

import fastify from 'fastify';
import { createRateLimiter, createFastifyPlugin } from '@bernierllc/rate-limiter';

const app = fastify();

const rateLimiter = createRateLimiter({
  algorithm: 'fixed-window',
  limit: 200,
  windowMs: 60000
});

app.register(createFastifyPlugin(rateLimiter, {
  points: request => (request.routerPath === '/upload' ? 5 : 1)
}));

Next.js Middleware

import type { NextApiHandler } from 'next';
import { createRateLimiter, createNextMiddleware } from '@bernierllc/rate-limiter';

const limiter = createRateLimiter({
  algorithm: 'token-bucket',
  limit: 120,
  windowMs: 60000
});

const rateLimit = createNextMiddleware(limiter);

const handler: NextApiHandler = async (req, res) => {
  await rateLimit(req, res, async () => {
    res.status(200).json({ ok: true, remaining: req.rateLimit?.remaining });
  });
};

export default handler;

Redis Storage Backend

import { createRateLimiter } from '@bernierllc/rate-limiter';
import { createClient } from 'redis';

// Using node-redis
const redisClient = createClient({ url: 'redis://localhost:6379' });
await redisClient.connect();

const rateLimiter = createRateLimiter({
  algorithm: 'fixed-window',
  limit: 1000,
  windowMs: 3600000, // 1 hour
  storage: {
    type: 'redis',
    client: redisClient,
    keyPrefix: 'rl:'
  }
});

Using ioredis

import { createRateLimiter } from '@bernierllc/rate-limiter';
import Redis from 'ioredis';

const redisClient = new Redis({
  host: 'localhost',
  port: 6379
});

const rateLimiter = createRateLimiter({
  algorithm: 'sliding-window',
  limit: 500,
  windowMs: 900000, // 15 minutes
  storage: {
    type: 'redis',
    client: redisClient,
    keyPrefix: 'api_limits:'
  }
});

Database Storage Backend

import { createRateLimiter } from '@bernierllc/rate-limiter';

const adapter = {
  get: async (key) => database.fetchRateLimit(key),
  set: async (key, data, ttl) => database.upsertRateLimit(key, data, ttl),
  increment: async (key, amount = 1) => database.incrementRateLimit(key, amount),
  delete: async (key) => database.removeRateLimit(key),
  expire: async (key, ttl) => database.setExpiry(key, ttl)
};

const rateLimiter = createRateLimiter({
  limit: 100,
  windowMs: 60000,
  storage: {
    type: 'database',
    adapter,
    keyPrefix: 'api:'
  }
});

API Reference

createRateLimiter(options)

Creates a rate limiter instance with the specified configuration.

Options

  • limit (number, required): Maximum number of requests allowed per window
  • windowMs (number, required): Length of the time window in milliseconds
  • algorithm? (string): 'token-bucket' (default), 'sliding-window', 'fixed-window', or 'leaky-bucket'
  • bucketSize? / refillRate? / tokensPerRequest? / precision? / leakRate? (number): Algorithm-specific tuning overrides
  • storage? (string | object): Storage backend configuration
    • 'memory' - In-memory storage (default)
    • { type: 'redis', client: RedisClient | IORedisClient, keyPrefix?: string } - Redis-backed storage
    • { type: 'database', adapter: RateLimitDatabaseAdapter, keyPrefix?: string } - Database-backed storage using custom adapter
  • skipOnError? (boolean): When true, allows traffic if the storage backend fails (default false)
  • onLimitReached? (function): Callback invoked when a limit is exceeded. Receives (key, result, context)
  • customRules? (RateLimitRule[]): Array of context-aware rules that can override algorithm, limits, or storage key prefix per request

Returns

Returns a RateLimiter instance with the following methods:

  • check(key: string, options?: RateLimitCheckOptions): Promise<RateLimitResult> - Check if request is allowed with optional points/context
  • reset(key: string): Promise<boolean> - Reset limits for a key
  • getUsage(key: string): Promise<UsageInfo> - Get current usage stats
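The README's examples suggest check() resolves to a result carrying allowed, remaining, limit, and reset fields. As a hedged illustration of consuming that shape (the interface below is inferred from the examples, not the package's published types), here is a small helper that derives a Retry-After value in seconds from a result:

```typescript
// Result shape inferred from the README's examples; not the package's actual types.
interface RateLimitResult {
  allowed: boolean;
  remaining: number;
  limit: number;
  reset: number; // epoch milliseconds when the window resets
}

// Derive a Retry-After value (whole seconds) from a result, as a 429 handler might.
function retryAfterSeconds(result: RateLimitResult, now: number = Date.now()): number {
  return Math.max(0, Math.ceil((result.reset - now) / 1000));
}

const blocked: RateLimitResult = { allowed: false, remaining: 0, limit: 100, reset: 1_000_000 + 2_500 };
console.log(retryAfterSeconds(blocked, 1_000_000)); // 3
```

The Math.max(0, ...) guard keeps the value non-negative if the reset time has already passed.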

createExpressMiddleware(rateLimiter, options?)

Creates Express middleware from a rate limiter instance.

Options

  • keyGenerator? (function): Generates the rate limit key from the request (default: (req) => req.ip)
  • skip? (function): Skips rate limiting for matching requests (default: () => false)
  • handler? (function): Custom handler invoked when the limit is exceeded (default: sends a 429 status with a standard message)
  • points? (number | function): Number of points consumed per request, or a resolver function
  • contextBuilder? (function): Builds contextual metadata passed to custom rules and storage keys
  • legacyHeaders? (boolean): Include legacy X-RateLimit-* headers (default: false)
  • standardHeaders? (boolean): Include standard RateLimit-* headers (default: true)
  • requestPropertyName? (string): Property name used to attach rate limit info to the request (default: 'rateLimit')
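The two header families toggled by standardHeaders and legacyHeaders can be sketched as below. Header names follow the common RateLimit-* / X-RateLimit-* conventions; the package's exact header output is an assumption here, and the buildHeaders helper is purely illustrative:

```typescript
interface RateLimitResult { limit: number; remaining: number; reset: number; }

// Build the header sets the middleware options toggle. RateLimit-Reset is
// conventionally expressed in seconds, hence the division by 1000.
function buildHeaders(
  result: RateLimitResult,
  opts: { standardHeaders?: boolean; legacyHeaders?: boolean } = {}
): Record<string, string> {
  const { standardHeaders = true, legacyHeaders = false } = opts;
  const headers: Record<string, string> = {};
  const resetSeconds = String(Math.ceil(result.reset / 1000));
  if (standardHeaders) {
    headers['RateLimit-Limit'] = String(result.limit);
    headers['RateLimit-Remaining'] = String(result.remaining);
    headers['RateLimit-Reset'] = resetSeconds;
  }
  if (legacyHeaders) {
    headers['X-RateLimit-Limit'] = String(result.limit);
    headers['X-RateLimit-Remaining'] = String(result.remaining);
    headers['X-RateLimit-Reset'] = resetSeconds;
  }
  return headers;
}

console.log(buildHeaders({ limit: 50, remaining: 12, reset: 60_000 }));
```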

createFastifyPlugin(rateLimiter, options?)

Registers rate limiting as a Fastify onRequest hook.

  • Shares the same option signatures as createExpressMiddleware, including points and contextBuilder.
  • Automatically sets response headers using Fastify's reply helpers.

createNextMiddleware(rateLimiter, options?)

Provides a drop-in middleware for Next.js API routes.

  • Supports the same options as createExpressMiddleware.
  • Attaches rate limit metadata to the request object and sets response headers via setHeader.

Rate Limit Algorithms

Token Bucket

  • Allows bursts up to bucket capacity
  • Refills tokens at steady rate
  • Best for: APIs that can handle bursts
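As a self-contained illustration of the token-bucket behavior described above (bursts up to capacity, steady refill), here is a minimal sketch. It is not the package's implementation, just the algorithm in miniature:

```typescript
// Minimal token bucket: a bucket refills at a steady rate and absorbs
// bursts up to its capacity.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,
    private refillPerMs: number, // tokens added per millisecond
    now: number = Date.now()
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  tryConsume(points = 1, now: number = Date.now()): boolean {
    // Refill based on elapsed time, capped at capacity.
    const elapsed = now - this.lastRefill;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerMs);
    this.lastRefill = now;
    if (this.tokens >= points) {
      this.tokens -= points;
      return true;
    }
    return false;
  }
}

// A 10-token bucket refilling 1 token per second (0.001 tokens/ms).
const bucket = new TokenBucket(10, 0.001, 0);
let allowed = 0;
for (let i = 0; i < 12; i++) if (bucket.tryConsume(1, 0)) allowed++;
console.log(allowed);                  // 10: the burst is capped at capacity
console.log(bucket.tryConsume(1, 2000)); // true: ~2 tokens refilled after 2s
```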

Sliding Window

  • Precise rate limiting over moving time window
  • Higher memory usage but accurate
  • Best for: Strict rate enforcement
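The accuracy/memory trade-off above can be seen in a sliding-window-log sketch: keep a timestamp per recent request and count those inside the moving window. Memory grows with the request rate, which is the cost of the precision. Illustrative only, not the package's implementation:

```typescript
// Minimal sliding-window log: precise counting over a moving window.
class SlidingWindowLog {
  private timestamps: number[] = [];

  constructor(private limit: number, private windowMs: number) {}

  tryConsume(now: number = Date.now()): boolean {
    // Drop entries that have fallen out of the window.
    this.timestamps = this.timestamps.filter(t => t > now - this.windowMs);
    if (this.timestamps.length >= this.limit) return false;
    this.timestamps.push(now);
    return true;
  }
}

const win = new SlidingWindowLog(3, 1000);
console.log(win.tryConsume(0), win.tryConsume(100), win.tryConsume(200)); // true true true
console.log(win.tryConsume(300));  // false: 3 requests already in the window
console.log(win.tryConsume(1150)); // true: the request at t=0 has expired
```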

Fixed Window

  • Simple time windows with reset points
  • Lower memory usage, allows bursts at window boundaries
  • Best for: Simple rate limiting needs

Leaky Bucket

  • Smooths out bursts by processing at steady rate
  • Queues requests up to bucket capacity
  • Best for: Traffic shaping and smoothing

Configuration Examples

API Rate Limiting

// Public API - 1000 requests per hour, shared across instances via Redis
// (assumes a connected Redis client, as shown in "Redis Storage Backend")
const publicApi = createRateLimiter({
  algorithm: 'sliding-window',
  limit: 1000,
  windowMs: 3600000,
  storage: { type: 'redis', client: redisClient, keyPrefix: 'public:' }
});

// Premium API - 5000 requests per hour
const premiumApi = createRateLimiter({
  algorithm: 'token-bucket',
  limit: 5000,
  windowMs: 3600000,
  storage: { type: 'redis', client: redisClient, keyPrefix: 'premium:' }
});

Different Endpoints

// Login attempts - strict limiting
const loginLimiter = createRateLimiter({
  algorithm: 'fixed-window',
  limit: 5,
  windowMs: 900000, // 15 minutes
  storage: 'memory'
});

// File uploads - burst handling
const uploadLimiter = createRateLimiter({
  algorithm: 'token-bucket', 
  limit: 10,
  windowMs: 60000, // 1 minute
  storage: 'memory'
});

app.post('/login', createExpressMiddleware(loginLimiter));
app.post('/upload', createExpressMiddleware(uploadLimiter));

Advanced Middleware Configuration

const advancedMiddleware = createExpressMiddleware(rateLimiter, {
  // Use user ID for authenticated users, IP for anonymous
  keyGenerator: (req) => {
    return req.user?.id || req.ip;
  },
  
  // Skip rate limiting for admin users
  skip: (req) => {
    return req.user?.role === 'admin';
  },
  
  // Custom response for rate limit exceeded
  handler: (req, res, result) => {
    const retryAfter = Math.ceil((result.reset - Date.now()) / 1000);

    res.status(429).json({
      error: 'Rate limit exceeded',
      retryAfter,
      limit: result.limit,
      remaining: result.remaining
    });
  },

  // Include both standard and legacy headers
  standardHeaders: true,
  legacyHeaders: true,

  // Attach rate limit info to request for logging
  requestPropertyName: 'rateLimitInfo',

  // Provide additional context for custom rules or analytics
  contextBuilder: (req) => ({
    key: req.user?.id || req.ip,
    userId: req.user?.id,
    plan: req.user?.plan || 'free'
  })
});

Integration Status

Logger Integration

Status: Not applicable

Justification: This is a pure utility package with no runtime state, side effects, or error conditions that require logging. The RateLimiter class is stateless and deterministic: it simply calculates rate limit status from request counts and time windows. All errors are returned as structured results that calling code can handle, and there are no background operations, network calls, or state changes that would benefit from structured logging.

Pattern: Pure functional utility - no logger integration needed.

NeverHub Integration

Status: Not applicable

Justification: This is a core utility package that provides rate limiting functionality. It does not participate in service discovery, event publishing, or service mesh operations. Rate limiting is a stateless utility operation that doesn't require service registration or discovery.

Pattern: Core utility - no service mesh integration needed.

Docs-Suite Integration

Status: Ready

Format: TypeDoc-compatible JSDoc comments are included throughout the source code. All public APIs are documented with examples and type information.

Error Handling

All rate limiter methods return structured results instead of throwing errors:

const result = await rateLimiter.check('user-123');

if (!result.allowed) {
  // Handle rate limit exceeded
  console.log(`Blocked: ${result.remaining} remaining, reset at ${result.reset}`);
}

For storage backend errors (Redis connection issues, etc.), the rate limiter will:

  1. Return the error information in the structured result
  2. Fall back to allowing the request when skipOnError is enabled (fail-open policy)
  3. Otherwise reject the request, with the error surfaced in the result
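The fail-open behavior can be sketched as a wrapper around any check function: if the storage backend throws (e.g. Redis is down), treat the request as allowed rather than rejecting all traffic. The names below are illustrative, not the package's API:

```typescript
// Inferred result shape; `error` carries storage failure details.
interface RateLimitResult { allowed: boolean; remaining: number; error?: string; }

// Wrap a check function so storage failures allow the request (fail-open)
// while still surfacing the error to the caller.
async function checkFailOpen(
  check: (key: string) => Promise<RateLimitResult>,
  key: string,
  limit: number
): Promise<RateLimitResult> {
  try {
    return await check(key);
  } catch (err) {
    return {
      allowed: true, // fail open: never block traffic on a backend outage
      remaining: limit,
      error: err instanceof Error ? err.message : String(err)
    };
  }
}

// A check that simulates an unreachable storage backend.
const failingCheck = async (): Promise<RateLimitResult> => {
  throw new Error('redis unavailable');
};

checkFailOpen(failingCheck, 'user-123', 100).then(r => {
  console.log(r.allowed, r.error); // true redis unavailable
});
```

A fail-closed variant would instead return allowed: false in the catch branch, trading availability for strictness.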

Performance

  • Memory Storage: Handles 100k+ requests/second per algorithm instance
  • Redis Storage: Performance depends on Redis server and network latency
  • Algorithms: Token bucket and fixed window are most performant
  • Memory Usage: Sliding window uses most memory, fixed window uses least

Testing

# Run all tests
npm test

# Run tests with coverage
npm run test:coverage

# Run specific test suite
npm test -- algorithms/token-bucket.test.ts

License

Copyright (c) 2025 Bernier LLC. All rights reserved.