
@jbingen/microbatch

v0.1.0

🧺 microbatch


Automatic request batching. Collects individual calls into batches within a microtask.

For anyone dealing with N+1 queries, request fan-out, or redundant API calls.

npm install @jbingen/microbatch
// before: 3 separate DB queries
const user1 = await getUser(1);
const user2 = await getUser(2);
const user3 = await getUser(3);

// after: 1 batched query, same API
const loader = batcher(async ids => {
  const users = await getUsersByIds(ids);
  return new Map(users.map(u => [u.id, u]));
});
const [user1, user2, user3] = await Promise.all([
  loader.load(1),
  loader.load(2),
  loader.load(3),
]);

Individual load() calls within the same microtask are automatically collected and dispatched as a single batch. Each caller gets back just their result.

import { batcher } from "@jbingen/microbatch";

const userLoader = batcher(async (ids: number[]) => {
  const users = await db.query("SELECT * FROM users WHERE id IN (?)", ids);
  return new Map(users.map(u => [u.id, u]));
});

// these fire one query, not three
const [alice, bob, carol] = await Promise.all([
  userLoader.load(1),
  userLoader.load(2),
  userLoader.load(3),
]);

Why

The N+1 problem shows up everywhere: rendering a list of items that each need to fetch related data, resolving GraphQL fields, hydrating objects from a cache. The usual fix is DataLoader, but that's tied to GraphQL conventions and carries more weight than most apps need.

microbatch is the batching primitive extracted. It works with any data source, any framework, and any key type. Zero dependencies.

API

batcher(batchFn, options?)

Creates a batcher. The batchFn receives an array of keys and must return a Map keyed by those same keys.

const loader = batcher(async (ids: string[]) => {
  const results = await fetchMany(ids);
  return new Map(results.map(r => [r.id, r]));
});

Returns an object with load and flush.

.load(key)

Queues a key for the next batch. Returns a promise that resolves with the value for that key.

const result = await loader.load("abc");

If the batch function's result Map doesn't contain the key, the promise rejects with an error. If the batch function itself throws, all pending promises reject with that error.

.flush()

Immediately dispatches any pending keys without waiting for the microtask.

const p1 = loader.load(1);
const p2 = loader.load(2);
await loader.flush(); // p1 and p2 settle once the batch resolves

Options

| Option  | Type   | Default  | Description                                          |
|---------|--------|----------|------------------------------------------------------|
| maxSize | number | Infinity | Flush immediately when the queue reaches this size   |
| maxWait | number | 0        | Max ms to wait before flushing (0 = microtask only)  |

maxSize

Caps the number of unique keys per batch. When the queue hits maxSize unique keys, it flushes immediately without waiting for the microtask. Duplicate loads for the same key don't count toward the limit. Remaining keys go into the next batch.

const loader = batcher(batchFn, { maxSize: 50 });

maxWait

Adds a timer-based flush in addition to the microtask flush. Useful when loads trickle in across multiple ticks and you want to batch them within a time window.

const loader = batcher(batchFn, { maxWait: 10 });

How batching works

  1. You call load(key) - the key is added to an internal queue
  2. If maxSize is reached, the queue flushes immediately
  3. Otherwise, a microtask is scheduled (or a timer if maxWait is set)
  4. When the microtask fires, all queued keys are passed to batchFn as a single array
  5. The returned Map is used to resolve each caller's promise individually
  6. If a key is missing from the Map, that caller's promise rejects
  7. If batchFn throws, all callers in that batch reject
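The steps above can be condensed into a small self-contained sketch. This is illustrative only, not the library's actual source, and it omits maxWait for brevity:

```typescript
type BatchFn<K, V> = (keys: K[]) => Promise<Map<K, V>>;

// Simplified sketch of the batching mechanism described above.
function miniBatcher<K, V>(batchFn: BatchFn<K, V>, maxSize = Infinity) {
  let queue: { key: K; resolve: (v: V) => void; reject: (e: unknown) => void }[] = [];
  let scheduled = false;

  async function flush(): Promise<void> {
    const pending = queue;
    queue = [];
    scheduled = false;
    if (pending.length === 0) return;
    try {
      // Duplicate keys are deduped before the batch function is called.
      const keys = Array.from(new Set(pending.map(p => p.key)));
      const results = await batchFn(keys); // one call for the whole batch
      for (const p of pending) {
        if (results.has(p.key)) p.resolve(results.get(p.key)!);
        else p.reject(new Error("No result for key: " + String(p.key))); // missing key -> reject
      }
    } catch (err) {
      for (const p of pending) p.reject(err); // batchFn threw -> every caller rejects
    }
  }

  function load(key: K): Promise<V> {
    return new Promise<V>((resolve, reject) => {
      queue.push({ key, resolve, reject });
      const unique = new Set(queue.map(p => p.key)).size;
      if (unique >= maxSize) {
        void flush(); // maxSize hit -> flush synchronously scheduled, no waiting
      } else if (!scheduled) {
        scheduled = true;
        queueMicrotask(() => void flush()); // otherwise flush on the next microtask
      }
    });
  }

  return { load, flush };
}
```

Note how dedup happens at dispatch time: three `load` calls for two unique keys still produce a two-key batch, but all three promises resolve.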

No caching. Each load() call always goes through batching. If you want caching, layer it on top.
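One way to layer caching on top is a memoizing wrapper around any load-style function. This is a hypothetical helper, not part of the library:

```typescript
// Hypothetical memoizing wrapper: caches the promise per key,
// so concurrent and repeat calls share one underlying load.
function withCache<K, V>(load: (key: K) => Promise<V>) {
  const cache = new Map<K, Promise<V>>();
  return (key: K): Promise<V> => {
    if (!cache.has(key)) {
      const p = load(key).catch(err => {
        cache.delete(key); // don't cache failures; allow retry
        throw err;
      });
      cache.set(key, p);
    }
    return cache.get(key)!;
  };
}
```

Caching the promise (rather than the resolved value) also dedupes in-flight requests for the same key across ticks, which the batcher alone doesn't do.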

Per-key error handling

The batch function returns a Map. If a key is present, the caller gets the value. If a key is missing, the caller gets a rejection. This lets you handle partial failures cleanly:

const results = await Promise.allSettled([
  loader.load(1), // fulfilled
  loader.load(2), // rejected (missing from Map)
  loader.load(3), // fulfilled
]);

If the batch function itself throws (e.g., network error), every caller in that batch rejects with the same error.
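On the producing side, a batch function creates partial failures simply by omitting keys from the Map it returns. A sketch with a hypothetical `fetchMany` data source inlined (here, only even ids "exist"):

```typescript
// Hypothetical data source: rows exist only for even ids.
async function fetchMany(ids: number[]): Promise<{ id: number; name: string }[]> {
  return ids.filter((id) => id % 2 === 0).map((id) => ({ id, name: "user" + id }));
}

// Batch function: keys absent from the returned Map will reject their callers.
async function batchFn(ids: number[]): Promise<Map<number, { id: number; name: string }>> {
  const rows = await fetchMany(ids);
  return new Map(rows.map((r) => [r.id, r])); // odd ids simply aren't present
}
```

No sentinel values or error wrapping needed: presence in the Map is the success signal.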

Design decisions

  • Zero dependencies. Tiny footprint.
  • Batches within a microtask by default. No timers unless you opt in with maxWait.
  • Batch function must return a Map for per-key resolution. Arrays would require positional matching, which is fragile.
  • No caching, no deduplication, no memoization. This is just the batching primitive.
  • Works with any key type that can be a Map key (strings, numbers, objects by reference). Object keys use reference equality - two { id: 1 } literals won't dedupe unless they're the same reference.
  • maxSize flushes synchronously so you can control memory and payload size.
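The reference-equality point in the key-type bullet can be seen directly with a plain Map, which is what the batcher uses internally for keys:

```typescript
const m = new Map<object, string>();
const a = { id: 1 };
m.set(a, "first");
m.set({ id: 1 }, "second"); // a different reference: does not overwrite

// m.size is 2, and m.get(a) is "first" — two structurally equal
// object literals are distinct Map keys.
```

If you need structural dedup for object keys, serialize them to strings (e.g. a stable JSON key) before calling load.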