@philiprehberger/async-batcher

v0.2.1

Automatic batching and deduplication for async operations


Installation

npm install @philiprehberger/async-batcher

Usage

import { createBatcher } from '@philiprehberger/async-batcher';

const userLoader = createBatcher(async (ids: string[]) => {
  // Called once with all collected IDs
  return db.users.findMany({ where: { id: { in: ids } } });
});

// These are automatically batched into a single call
const user1 = await userLoader.load('user-1');
const user2 = await userLoader.load('user-2');

// Load multiple at once
const users = await userLoader.loadMany(['user-3', 'user-4']);

Options

const loader = createBatcher(batchFn, {
  maxBatchSize: 100,  // Flush when batch reaches this size (default: 100)
  windowMs: 10,       // Batch window in milliseconds (default: 10)
});

How It Works

  1. load(key) adds the key to an internal queue and returns a promise
  2. After windowMs or when maxBatchSize is reached, the queue is flushed
  3. Duplicate keys within a batch window are deduplicated — the batch function receives unique keys only
  4. Results are matched back to callers by index (batch function must return results in the same order as keys)
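The flush cycle above can be sketched as a minimal, self-contained batcher. This is an illustration of the mechanism only, not the library's actual implementation:

```typescript
// Minimal sketch of the described flow: queue keys for one window,
// deduplicate, call the batch function once with unique keys, then
// resolve each caller by its key's index in the unique-key list.
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

function miniBatcher<K, V>(batchFn: BatchFn<K, V>, windowMs = 10) {
  let queue: { key: K; resolve: (v: V) => void; reject: (e: unknown) => void }[] = [];
  let timer: ReturnType<typeof setTimeout> | null = null;

  async function flush() {
    const pending = queue;
    queue = [];
    timer = null;
    // Step 3: duplicate keys collapse to one entry
    const uniqueKeys = [...new Set(pending.map((p) => p.key))];
    try {
      // Step 2: one call with all collected (unique) keys
      const results = await batchFn(uniqueKeys);
      // Step 4: match results back to callers by index
      for (const p of pending) p.resolve(results[uniqueKeys.indexOf(p.key)]);
    } catch (err) {
      for (const p of pending) p.reject(err);
    }
  }

  return {
    load(key: K): Promise<V> {
      // Step 1: enqueue and arm the window timer if not already running
      return new Promise<V>((resolve, reject) => {
        queue.push({ key, resolve, reject });
        if (!timer) timer = setTimeout(flush, windowMs);
      });
    },
  };
}
```

Note that the index-matching step is why the batch function must return results in the same order as the keys it receives.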

Batch Metrics

Track batch performance and error rates with built-in metrics:

const loader = createBatcher(batchFn);

await loader.load('key-1');
await loader.load('key-2');

const metrics = loader.getMetrics();
console.log(metrics.totalBatches);          // number of batches executed
console.log(metrics.totalItems);            // total items processed
console.log(metrics.errorCount);            // number of failed batches
console.log(metrics.batchSizeDistribution); // array of batch sizes

// Reset counters
loader.resetMetrics();
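As an illustrative use of the snapshot, the counters above can be combined, for example to derive an average batch size. The interface mirrors the fields shown; the helper itself is not part of the library's API:

```typescript
// Snapshot shape mirroring the metrics fields documented above
interface BatchMetricsSnapshot {
  totalBatches: number;
  totalItems: number;
  errorCount: number;
  batchSizeDistribution: number[];
}

// Average number of items handled per batch execution (hypothetical helper)
function averageBatchSize(m: BatchMetricsSnapshot): number {
  return m.totalBatches === 0 ? 0 : m.totalItems / m.totalBatches;
}
```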

Max Queue Size Limit

Prevent unbounded memory growth by capping the queue size:

// Throw an error when the queue is full
const loader = createBatcher(batchFn, {
  maxQueueSize: 500,
  overflowStrategy: 'throw', // default
});

try {
  await loader.load('key');
} catch (err) {
  // BatcherQueueFullError: Queue is full (max size: 500)
}

// Drop the oldest item when the queue is full
const loader2 = createBatcher(batchFn, {
  maxQueueSize: 500,
  overflowStrategy: 'drop-oldest',
});
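The two strategies can be sketched as a single enqueue step. This is an illustration of the described behavior with hypothetical names, not the library's internals:

```typescript
// Enqueue with a capacity check: 'throw' rejects the new item,
// 'drop-oldest' evicts the item that was queued first.
function pushWithOverflow<T>(
  queue: T[],
  item: T,
  maxQueueSize: number,
  strategy: 'throw' | 'drop-oldest',
): void {
  if (queue.length >= maxQueueSize) {
    if (strategy === 'throw') {
      throw new Error(`Queue is full (max size: ${maxQueueSize})`);
    }
    queue.shift(); // drop-oldest: evict the oldest queued item
  }
  queue.push(item);
}
```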

Batch Prioritization

Assign priority levels so critical items are processed first within each batch:

const loader = createBatcher(batchFn);

// Items are sorted by priority before the batch function is called
loader.load('background-task', 'low');
loader.load('user-request', 'normal');
loader.load('payment', 'critical');
loader.load('admin-action', 'high');

// Priority order: critical > high > normal > low
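The ordering above can be sketched as a stable sort over numeric ranks. The rank values here are illustrative, not the library's internals:

```typescript
type Priority = 'low' | 'normal' | 'high' | 'critical';

// Higher rank sorts first: critical > high > normal > low
const PRIORITY_RANK: Record<Priority, number> = {
  critical: 3,
  high: 2,
  normal: 1,
  low: 0,
};

function sortByPriority<K>(items: { key: K; priority: Priority }[]) {
  // Array.prototype.sort is stable, so items at the same priority
  // level keep their insertion order within the batch.
  return [...items].sort(
    (a, b) => PRIORITY_RANK[b.priority] - PRIORITY_RANK[a.priority],
  );
}
```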

Retry on Batch Failure

Automatically retry failed batches with exponential backoff:

const loader = createBatcher(batchFn, {
  retryCount: 3,       // retry up to 3 times (default: 0)
  retryDelayMs: 100,   // initial delay of 100ms (default: 100)
});

// If batchFn fails, it retries with delays of 100ms, 200ms, 400ms
await loader.load('key');
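The doubling backoff schedule above can be sketched as a small helper. This is an illustration of the described behavior, not the library's internals:

```typescript
// Run fn, retrying up to retryCount times on failure; the delay
// doubles after each failed attempt (100ms, 200ms, 400ms, ...).
// The last error is rethrown once retries are exhausted.
async function runWithRetry<T>(
  fn: () => Promise<T>,
  retryCount: number,
  retryDelayMs: number,
): Promise<T> {
  let delay = retryDelayMs;
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retryCount) throw err; // retries exhausted
      await new Promise((r) => setTimeout(r, delay));
      delay *= 2; // exponential backoff
    }
  }
}
```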

API

| Export | Description |
|--------|-------------|
| createBatcher(batchFn, options?) | Create a new batcher instance |
| BatcherQueueFullError | Error thrown when the queue exceeds maxQueueSize with the 'throw' strategy |

Batcher<K, V>

| Method | Description |
|--------|-------------|
| load(key, priority?) | Load a single value, batched automatically. Optional priority: 'low', 'normal', 'high', 'critical' |
| loadMany(keys, priority?) | Load multiple values; returns Promise<V[]>. The optional priority applies to all keys |
| getMetrics() | Returns a snapshot of batch metrics |
| resetMetrics() | Resets all metric counters to zero |

BatcherOptions

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| maxBatchSize | number | 100 | Max keys per batch |
| windowMs | number | 10 | Batch collection window in ms |
| maxQueueSize | number | undefined | Max items allowed in the queue |
| overflowStrategy | 'drop-oldest' \| 'throw' | 'throw' | What to do when the queue exceeds maxQueueSize |
| retryCount | number | 0 | Number of retry attempts on batch failure |
| retryDelayMs | number | 100 | Initial retry delay in ms (doubles each attempt) |

BatchMetrics

| Property | Type | Description |
|----------|------|-------------|
| totalBatches | number | Total number of batches executed |
| totalItems | number | Total number of items processed |
| errorCount | number | Number of batches that failed |
| batchSizeDistribution | number[] | Array of unique-key counts per batch |

Priority

'low' | 'normal' | 'high' | 'critical'

OverflowStrategy

'drop-oldest' | 'throw'

Development

npm install
npm run build
npm test

Support

If you find this project useful:

  • ⭐ Star the repo
  • 🐛 Report issues
  • 💡 Suggest features
  • ❤️ Sponsor development
  • 🌐 All Open Source Projects
  • 💻 GitHub Profile
  • 🔗 LinkedIn Profile

License

MIT