@bernierllc/rate-limiter
High-performance rate limiting with multiple algorithms and storage backends for Node.js applications.
Installation
npm install @bernierllc/rate-limiter
For Redis support, install the Redis client:
# Using node-redis
npm install redis
# Or using ioredis
npm install ioredis
Usage
Basic Rate Limiting
import { createRateLimiter } from '@bernierllc/rate-limiter';
// Create a rate limiter with default token bucket algorithm
const rateLimiter = createRateLimiter({
algorithm: 'token-bucket',
limit: 100,
windowMs: 60000, // 1 minute
storage: 'memory'
});
// Check if request is allowed
const result = await rateLimiter.check('user-123');
if (result.allowed) {
console.log(`Request allowed. ${result.remaining} requests remaining.`);
} else {
console.log(`Rate limit exceeded. Try again in ${result.reset - Date.now()}ms.`);
}
Express Middleware
import express from 'express';
import { createRateLimiter, createExpressMiddleware } from '@bernierllc/rate-limiter';
const app = express();
// Create rate limiter
const rateLimiter = createRateLimiter({
algorithm: 'sliding-window',
limit: 50,
windowMs: 60000,
storage: 'memory'
});
// Apply middleware
app.use('/api', createExpressMiddleware(rateLimiter, {
keyGenerator: (req) => req.ip,
skip: (req) => req.method === 'GET' && req.path === '/health',
handler: (req, res) => {
res.status(429).json({ error: 'Too many requests' });
},
contextBuilder: (req) => ({
key: req.ip,
path: req.path,
method: req.method,
userId: req.user?.id
})
}));
Fastify Plugin
import fastify from 'fastify';
import { createRateLimiter, createFastifyPlugin } from '@bernierllc/rate-limiter';
const app = fastify();
const rateLimiter = createRateLimiter({
algorithm: 'fixed-window',
limit: 200,
windowMs: 60000
});
app.register(createFastifyPlugin(rateLimiter, {
points: request => (request.routerPath === '/upload' ? 5 : 1)
}));
Next.js Middleware
import type { NextApiHandler } from 'next';
import { createRateLimiter, createNextMiddleware } from '@bernierllc/rate-limiter';
const limiter = createRateLimiter({
algorithm: 'token-bucket',
limit: 120,
windowMs: 60000
});
const rateLimit = createNextMiddleware(limiter);
const handler: NextApiHandler = async (req, res) => {
await rateLimit(req, res, async () => {
res.status(200).json({ ok: true, remaining: req.rateLimit?.remaining });
});
};
export default handler;
Redis Storage Backend
import { createRateLimiter } from '@bernierllc/rate-limiter';
import { createClient } from 'redis';
// Using node-redis
const redisClient = createClient({ url: 'redis://localhost:6379' });
await redisClient.connect();
const rateLimiter = createRateLimiter({
algorithm: 'fixed-window',
limit: 1000,
windowMs: 3600000, // 1 hour
storage: {
type: 'redis',
client: redisClient,
keyPrefix: 'rl:'
}
});
Using ioredis
import { createRateLimiter } from '@bernierllc/rate-limiter';
import Redis from 'ioredis';
const redisClient = new Redis({
host: 'localhost',
port: 6379
});
const rateLimiter = createRateLimiter({
algorithm: 'sliding-window',
limit: 500,
windowMs: 900000, // 15 minutes
storage: {
type: 'redis',
client: redisClient,
keyPrefix: 'api_limits:'
}
});
Database Storage Backend
import { createRateLimiter } from '@bernierllc/rate-limiter';
const adapter = {
get: async (key) => database.fetchRateLimit(key),
set: async (key, data, ttl) => database.upsertRateLimit(key, data, ttl),
increment: async (key, amount = 1) => database.incrementRateLimit(key, amount),
delete: async (key) => database.removeRateLimit(key),
expire: async (key, ttl) => database.setExpiry(key, ttl)
};
const rateLimiter = createRateLimiter({
limit: 100,
windowMs: 60000,
storage: {
type: 'database',
adapter,
keyPrefix: 'api:'
}
});
API Reference
createRateLimiter(options)
Creates a rate limiter instance with the specified configuration.
Options
- limit (number): Maximum number of requests allowed in the base configuration (required)
- windowMs (number): Time window in milliseconds for the base configuration (required)
- algorithm? (string): Algorithm to use for the base configuration: 'token-bucket', 'sliding-window', 'fixed-window', or 'leaky-bucket'
- bucketSize? / refillRate? / tokensPerRequest? / precision? / leakRate? (number): Algorithm-specific overrides for the base configuration
- storage? (string | object): Storage backend configuration:
  - 'memory' - In-memory storage (default)
  - { type: 'redis', client: RedisClient | IORedisClient, keyPrefix?: string } - Redis-backed storage
  - { type: 'database', adapter: RateLimitDatabaseAdapter, keyPrefix?: string } - Database-backed storage using a custom adapter
- skipOnError? (boolean): When true, allows traffic if the storage backend fails (default false)
- onLimitReached? (function): Callback invoked when a limit is exceeded. Receives (key, result, context)
- customRules? (RateLimitRule[]): Array of context-aware rules that can override the algorithm, limits, or storage key prefix per request
Returns
Returns a RateLimiter instance with the following methods:
- check(key: string, options?: RateLimitCheckOptions): Promise<RateLimitResult> - Check whether a request is allowed, with optional points/context
- reset(key: string): Promise<boolean> - Reset limits for a key
- getUsage(key: string): Promise<UsageInfo> - Get current usage stats
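The three methods can be exercised together as below. The snippet uses a minimal, hypothetical in-memory stand-in (MemoryLimiter) so it runs on its own; the result fields (allowed, remaining, reset) follow this README, but the implementation is illustrative, not the package's.

```typescript
// Hypothetical stand-in matching the RateLimiter interface described above.
interface RateLimitResult {
  allowed: boolean;
  remaining: number;
  reset: number; // epoch ms when the window resets
}

class MemoryLimiter {
  private counts = new Map<string, { used: number; reset: number }>();
  constructor(private limit: number, private windowMs: number) {}

  async check(key: string): Promise<RateLimitResult> {
    const now = Date.now();
    let entry = this.counts.get(key);
    if (!entry || entry.reset <= now) {
      entry = { used: 0, reset: now + this.windowMs };
      this.counts.set(key, entry);
    }
    const allowed = entry.used < this.limit;
    if (allowed) entry.used += 1;
    return { allowed, remaining: Math.max(0, this.limit - entry.used), reset: entry.reset };
  }

  async reset(key: string): Promise<boolean> {
    return this.counts.delete(key);
  }

  async getUsage(key: string): Promise<{ used: number; limit: number }> {
    const entry = this.counts.get(key);
    return { used: entry?.used ?? 0, limit: this.limit };
  }
}

(async () => {
  const limiter = new MemoryLimiter(2, 60_000);
  await limiter.check('user-1'); // first request allowed
  await limiter.check('user-1'); // second request allowed
  const third = await limiter.check('user-1');
  console.log(third.allowed); // false: limit of 2 reached
  console.log((await limiter.getUsage('user-1')).used); // 2
  await limiter.reset('user-1'); // clears the counter for this key
  console.log((await limiter.check('user-1')).allowed); // true again after reset
})();
```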
createExpressMiddleware(rateLimiter, options?)
Creates Express middleware from a rate limiter instance.
Options
- keyGenerator? (function): Function to generate the rate limit key from the request. Default: (req) => req.ip
- skip? (function): Function to skip rate limiting for certain requests. Default: () => false
- handler? (function): Custom handler when the limit is exceeded. Default: sends a 429 status with a standard message
- points? (number | function): Number of points consumed per request, or a resolver function
- contextBuilder? (function): Builds contextual metadata passed to custom rules and storage keys
- legacyHeaders? (boolean): Include legacy X-RateLimit-* headers. Default: false
- standardHeaders? (boolean): Include standard RateLimit-* headers. Default: true
- requestPropertyName? (string): Property name to attach rate limit info to the request. Default: 'rateLimit'
createFastifyPlugin(rateLimiter, options?)
Registers rate limiting as a Fastify onRequest hook.
- Shares the same option signatures as createExpressMiddleware, including points and contextBuilder.
- Automatically sets response headers using Fastify's reply helpers.
createNextMiddleware(rateLimiter, options?)
Provides a drop-in middleware for Next.js API routes.
- Supports the same options as createExpressMiddleware.
- Attaches rate limit metadata to the request object and sets response headers via setHeader.
Rate Limit Algorithms
Token Bucket
- Allows bursts up to bucket capacity
- Refills tokens at steady rate
- Best for: APIs that can handle bursts
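The refill behaviour can be sketched in a few lines. This is an illustrative implementation of the general token-bucket algorithm, not the package's internals; the class and parameter names are made up for the sketch.

```typescript
// Minimal token-bucket sketch: the bucket holds up to `capacity` tokens
// and refills continuously at `refillPerSec` tokens per second.
class TokenBucket {
  private tokens: number;
  private last: number;

  constructor(private capacity: number, private refillPerSec: number, now = Date.now()) {
    this.tokens = capacity; // start full, so an initial burst is allowed
    this.last = now;
  }

  tryConsume(now = Date.now(), cost = 1): boolean {
    // Refill proportionally to elapsed time, capped at capacity.
    const elapsedSec = (now - this.last) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.last = now;
    if (this.tokens >= cost) {
      this.tokens -= cost;
      return true;
    }
    return false;
  }
}

// A bucket of 3 tokens refilling at 1/sec absorbs a burst of 3, then throttles.
const bucket = new TokenBucket(3, 1, 0);
console.log([0, 0, 0, 0].map((t) => bucket.tryConsume(t))); // [true, true, true, false]
console.log(bucket.tryConsume(1000)); // true: one token refilled after 1s
```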
Sliding Window
- Precise rate limiting over moving time window
- Higher memory usage but accurate
- Best for: Strict rate enforcement
Fixed Window
- Simple time windows with reset points
- Lower memory usage, allows bursts at window boundaries
- Best for: Simple rate limiting needs
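The boundary-burst caveat is easy to see with a worked example: with a limit of 10 per minute, a client can send 10 requests just before a window boundary and 10 more just after it, so 20 requests land within a couple of seconds. The sketch below demonstrates this with a toy fixed-window counter (illustrative only, not the package's implementation).

```typescript
// Fixed-window boundary burst, illustrated with a limit of 10 per 60s window.
// The window id changes at each boundary, which resets the count.
const limit = 10;
const counts = new Map<number, number>();

function allow(nowMs: number): boolean {
  const windowId = Math.floor(nowMs / 60_000);
  const used = counts.get(windowId) ?? 0;
  if (used >= limit) return false;
  counts.set(windowId, used + 1);
  return true;
}

// 10 requests at t=59s all pass, and 10 more at t=60s also pass:
// 20 requests in ~1 second, despite the nominal 10/minute limit.
let passed = 0;
for (let i = 0; i < 10; i++) if (allow(59_000)) passed++;
for (let i = 0; i < 10; i++) if (allow(60_000)) passed++;
console.log(passed); // 20
```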
Leaky Bucket
- Smooths out bursts by processing at steady rate
- Queues requests up to bucket capacity
- Best for: Traffic shaping and smoothing
Configuration Examples
API Rate Limiting
// Public API - 1000 requests per hour
const publicApi = createRateLimiter({
algorithm: 'sliding-window',
limit: 1000,
windowMs: 3600000,
  storage: { type: 'redis', client: redisClient } // distributed limiting (redisClient created as shown above)
});
// Premium API - 5000 requests per hour
const premiumApi = createRateLimiter({
algorithm: 'token-bucket',
limit: 5000,
windowMs: 3600000,
  storage: { type: 'redis', client: redisClient }
});
Different Endpoints
// Login attempts - strict limiting
const loginLimiter = createRateLimiter({
algorithm: 'fixed-window',
limit: 5,
windowMs: 900000, // 15 minutes
storage: 'memory'
});
// File uploads - burst handling
const uploadLimiter = createRateLimiter({
algorithm: 'token-bucket',
limit: 10,
windowMs: 60000, // 1 minute
storage: 'memory'
});
app.post('/login', createExpressMiddleware(loginLimiter));
app.post('/upload', createExpressMiddleware(uploadLimiter));
Advanced Middleware Configuration
const advancedMiddleware = createExpressMiddleware(rateLimiter, {
// Use user ID for authenticated users, IP for anonymous
keyGenerator: (req) => {
return req.user?.id || req.ip;
},
// Skip rate limiting for admin users
skip: (req) => {
return req.user?.role === 'admin';
},
// Custom response for rate limit exceeded
handler: (req, res, result) => {
const retryAfter = Math.ceil((result.reset - Date.now()) / 1000);
res.status(429).json({
error: 'Rate limit exceeded',
retryAfter,
limit: result.limit,
remaining: result.remaining
});
},
// Include both standard and legacy headers
standardHeaders: true,
legacyHeaders: true,
// Attach rate limit info to request for logging
requestPropertyName: 'rateLimitInfo',
// Provide additional context for custom rules or analytics
contextBuilder: (req) => ({
key: req.user?.id || req.ip,
userId: req.user?.id,
plan: req.user?.plan || 'free'
})
});
Integration Status
Logger Integration
Status: Not applicable
Justification: This is a pure utility package with no runtime state, side effects, or error conditions that require logging. The RateLimiter class is stateless and deterministic - it simply calculates rate limit status based on request counts and time windows. All errors are returned as structured results that calling code can handle, and there are no background operations, network calls, or state changes that would benefit from structured logging.
Pattern: Pure functional utility - no logger integration needed.
NeverHub Integration
Status: Not applicable
Justification: This is a core utility package that provides rate limiting functionality. It does not participate in service discovery, event publishing, or service mesh operations. Rate limiting is a stateless utility operation that doesn't require service registration or discovery.
Pattern: Core utility - no service mesh integration needed.
Docs-Suite Integration
Status: Ready
Format: TypeDoc-compatible JSDoc comments are included throughout the source code. All public APIs are documented with examples and type information.
Error Handling
All rate limiter methods return structured results instead of throwing errors:
const result = await rateLimiter.check('user-123');
if (!result.allowed) {
// Handle rate limit exceeded
console.log(`Blocked: ${result.remaining} remaining, reset at ${result.reset}`);
}
For storage backend errors (Redis connection issues, etc.), the rate limiter will:
- Log the error (if logger integration is available)
- Fall back to allowing the request when skipOnError is true (fail-open policy); otherwise block it
- Return appropriate error information in the result
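The fail-open policy can be sketched as a wrapper around the storage call. This is an illustration of the policy only; the Storage interface and failOpenCheck function are hypothetical names, not the package's code.

```typescript
// Illustration of fail-open vs fail-closed around a storage backend.
interface Storage {
  increment(key: string): Promise<number>; // returns the updated count
}

async function failOpenCheck(
  storage: Storage,
  key: string,
  limit: number,
  skipOnError: boolean,
): Promise<{ allowed: boolean; degraded: boolean }> {
  try {
    const used = await storage.increment(key);
    return { allowed: used <= limit, degraded: false };
  } catch {
    // Backend unreachable: allow the request (fail-open) when skipOnError
    // is enabled, otherwise fail closed and block it.
    return { allowed: skipOnError, degraded: true };
  }
}

// A backend that always fails, simulating a Redis outage.
const broken: Storage = {
  increment: async () => { throw new Error('redis: connection refused'); },
};
failOpenCheck(broken, 'user-1', 100, true).then((r) => console.log(r));
// { allowed: true, degraded: true } - traffic passes despite the outage
```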
Performance
- Memory Storage: Handles 100k+ requests/second per algorithm instance
- Redis Storage: Performance depends on Redis server and network latency
- Algorithms: Token bucket and fixed window are most performant
- Memory Usage: Sliding window uses most memory, fixed window uses least
Testing
# Run all tests
npm test
# Run tests with coverage
npm run test:coverage
# Run specific test suite
npm test -- algorithms/token-bucket.test.ts
See Also
- @bernierllc/connection-parser - Redis connection string parsing (used internally)
- @bernierllc/crypto-utils - Secure token generation for rate limiting keys
License
Copyright (c) 2025 Bernier LLC. All rights reserved.
