
limiter-es6-compat

v2.1.2

A generic rate limiter for the web and Node.js. Useful for API clients, web crawling, or other tasks that need to be throttled.

Downloads: 73,086

Readme

limiter-es6-compat


Provides a generic rate limiter for the web and Node.js. Useful for API clients, web crawling, or other tasks that need to be throttled. Two classes are exposed: RateLimiter and TokenBucket. TokenBucket provides a lower-level interface to rate limiting with a configurable burst rate and drip rate. RateLimiter sits on top of the token bucket and adds a restriction on the maximum number of tokens that can be removed each interval, to comply with common API restrictions such as "150 requests per hour maximum".
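
For orientation, here is a minimal sketch contrasting the two constructors (the option names match the examples below; the specific numbers are illustrative only):

import { RateLimiter, TokenBucket } from "limiter-es6-compat";

// RateLimiter: at most 150 tokens may be removed per hour, matching a
// "150 requests per hour maximum" style API restriction
const limiter = new RateLimiter({ tokensPerInterval: 150, interval: "hour" });

// TokenBucket: lower-level control with a burst size (bucketSize) and a
// sustained drip rate (tokensPerInterval added per interval)
const bucket = new TokenBucket({
  bucketSize: 300,
  tokensPerInterval: 150,
  interval: "hour"
});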

Installation

npm install limiter-es6-compat

Usage

A simple example allowing 150 requests per hour:

import { RateLimiter } from "limiter-es6-compat";

// Allow 150 requests per hour (the Twitter search limit). Also understands
// 'second', 'minute', 'day', or a number of milliseconds
const limiter = new RateLimiter({ tokensPerInterval: 150, interval: "hour" });

async function sendRequest() {
  // This call will throw if we request more than the maximum number of requests
  // that were set in the constructor
  // remainingRequests tells us how many additional requests could be sent
  // right this moment
  const remainingRequests = await limiter.removeTokens(1);
  callMyRequestSendingFunction(...);
}

Another example allowing one message to be sent every 250ms:

import { RateLimiter } from "limiter-es6-compat";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

async function sendMessage() {
  const remainingMessages = await limiter.removeTokens(1);
  callMyMessageSendingFunction(...);
}

The default behaviour is to wait out the rate limit currently in effect before resolving the promise; if you pass fireImmediately: true, the promise resolves immediately with remainingRequests set to -1 instead:

import { RateLimiter } from "limiter-es6-compat";

const limiter = new RateLimiter({
  tokensPerInterval: 150,
  interval: "hour",
  fireImmediately: true
});

async function requestHandler(request, response) {
  // Immediately send 429 header to client when rate limiting is in effect
  const remainingRequests = await limiter.removeTokens(1);
  if (remainingRequests < 0) {
    response.writeHead(429, {'Content-Type': 'text/plain;charset=UTF-8'});
    response.end('429 Too Many Requests - your IP is being rate limited');
  } else {
    callMyMessageSendingFunction(...);
  }
}

A synchronous method, tryRemoveTokens(), is available in both RateLimiter and TokenBucket. This will return immediately with a boolean value indicating if the token removal was successful.

import { RateLimiter } from "limiter-es6-compat";

const limiter = new RateLimiter({ tokensPerInterval: 10, interval: "second" });

if (limiter.tryRemoveTokens(5))
  console.log('Tokens removed');
else
  console.log('No tokens removed');

To get the number of remaining tokens outside the removeTokens promise, simply use the getTokensRemaining method.

import { RateLimiter } from "limiter-es6-compat";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// Prints 1 since we did not remove a token and our number of tokens per
// interval is 1
console.log(limiter.getTokensRemaining());

Using the token bucket directly to throttle at the byte level:

import { TokenBucket } from "limiter-es6-compat";

const BURST_RATE = 1024 * 1024 * 150; // 150 MB burst (bucket size)
const FILL_RATE = 1024 * 1024 * 50; // 50 MB/sec sustained rate

// We could also pass a parent token bucket in to create a hierarchical token
// bucket
// bucketSize, tokensPerInterval, interval
const bucket = new TokenBucket({
  bucketSize: BURST_RATE,
  tokensPerInterval: FILL_RATE,
  interval: "second"
});

async function handleData(myData) {
  await bucket.removeTokens(myData.byteLength);
  sendMyData(myData);
}

Additional Notes

Both the token bucket and the rate limiter should be used with a message queue, or some other way of preventing multiple simultaneous calls to removeTokens(). Otherwise, earlier messages may be held up for long periods if more recent messages continually drain the token bucket. This can lead to out-of-order messages or the appearance of "lost" messages under heavy load.
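
As a minimal sketch of one way to do that (not part of the library; deliver() is a hypothetical send function), a simple promise chain can serialize removeTokens() calls so messages drain the bucket in the order they arrive:

import { RateLimiter } from "limiter-es6-compat";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// Serialize calls with a promise chain so each message waits for the ones
// queued before it, instead of racing them for tokens
let queue = Promise.resolve();

function sendInOrder(message) {
  queue = queue.then(async () => {
    await limiter.removeTokens(1);
    deliver(message); // hypothetical send function
  });
  return queue;
}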

License

MIT License

Notes Regarding Fork

This repository is a fork of the original project, which appears to have been abandoned by its author. The fork was created because ES6 imports don't work in the upstream package, and the pull request that addresses this, jhurliman/node-rate-limiter#92, is still pending and doesn't look like it will be merged.