adaptive-batcher v1.0.0
Self-tuning promise batcher with warm-up and verbose logging
# Adaptive-Batcher 🔥🧊

Self-tuning promise batcher for Node and browsers. It scales up when calls are fast and scales down when they slow down.
## ✨ Features
| Feature | Description |
|--------------------|--------------------------------------------------------------------------|
| Adaptive size | Computes the next batch size so a round-trip stays under the target time |
| Warm-up phase | Starts cautiously (🧊) and doubles the batch size each round for `warmUpRounds` rounds |
| Verbose logging | One-line metrics for every batch (🧊 warm-up / 🔥 adaptive) |
| Progress events | Hook up spinners, progress bars, or the UI5 `BusyIndicator` |
| Framework-agnostic | Works with `fetch`, Axios, GraphQL, SAP UI5 OData, or any async task |
| Zero dependencies | Tiny (≈1.5 kB min+gz) |
## 📦 Installation

```bash
npm i adaptive-batcher
```

## 🚀 Quick start
```js
import { AdaptiveBatcher } from 'adaptive-batcher';

const tasks = Array.from({ length: 500 }, () =>
  () => fetch('/api/ping').then(r => r.json())
);

const batcher = new AdaptiveBatcher({
  maxBatchTime : 10_000, // keep each batch < 10 s
  warmUpRounds : 2,      // warm-up batches of 2 → 4 → 8, then adaptive
  verbose      : true    // console logs with emojis 🧊/🔥
})
  .on('progress', p =>
    console.log(`done ${p.done}/${p.total} (next ≈ ${p.nextSize})`));

let current = [];
await batcher.run(
  tasks,
  /* buildBatch */ slice => { current = slice; },          // remember the slice
  /* execBatch  */ () => Promise.all(current.map(fn => fn()))
);
```

Console (verbose on):

```text
🧊 2/500   batch=2   t=720 ms
🧊 6/500   batch=4   t=1420 ms
🧊 14/500  batch=8   t=2840 ms
🔥 30/500  batch=16  t=4900 ms
🔥 46/500  batch=17  t=9000 ms
…
```

## 🛠 API
### `new AdaptiveBatcher(options?)`
| Option | Default | Description |
|-------------------|----------|-------------|
| maxBatchTime | 60000 | Upper bound (ms) on one batch’s round-trip time |
| startSize | 2 | Number of items in the very first batch |
| minSize | 1 | Hard floor for adaptive size |
| maxSize | 200 | Hard ceiling (e.g. SAP Gateway limits) |
| safetyFactor | 0.9 | Multiplier (0-1) applied to theoretical size for head-room |
| warmUpRounds | 0 | How many warm-up batches (size doubles each round) |
| verbose | false | `true` → logs to console; a function → custom logger `(msg) => void` |
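The package does not publish its exact sizing algorithm, so the sketch below is only one plausible reading of how these options could combine: scale the last batch size in proportion to how far the measured round-trip was from `maxBatchTime`, apply `safetyFactor` for head-room, and clamp to `minSize`/`maxSize`. The function and parameter names here are hypothetical, and the sample log above (batch=16 at 4900 ms → next 17, not ~29) suggests the real implementation grows more conservatively than this pure proportional rule:

```typescript
// Illustrative sketch only - NOT the library's actual algorithm.
// Scale the last batch size by the ratio of target time to measured time,
// keep some head-room via safetyFactor, then clamp to the configured bounds.
function nextBatchSize(
  lastSize: number,  // items in the batch that just finished
  elapsed: number,   // ms that batch took
  opts = { maxBatchTime: 60_000, minSize: 1, maxSize: 200, safetyFactor: 0.9 }
): number {
  const theoretical  = lastSize * (opts.maxBatchTime / Math.max(elapsed, 1));
  const withHeadroom = Math.floor(theoretical * opts.safetyFactor);
  return Math.min(opts.maxSize, Math.max(opts.minSize, withHeadroom));
}

// 16 items took 4.9 s against a 10 s target:
console.log(nextBatchSize(16, 4_900,
  { maxBatchTime: 10_000, minSize: 1, maxSize: 200, safetyFactor: 0.9 })); // → 29
```

A pure proportional controller like this reacts instantly to each measurement; the rolling-average item on the roadmap below hints at why that can be too jumpy in practice.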
### `run(items, buildBatch, executeBatch)`
| Param | Type | Purpose |
|----------------|-----------------------------------|---------|
| items | T[] | Source array (mutated in place: processed items are spliced off) |
| buildBatch | (slice:T[]) => void\|Promise | Prepare the slice (add OData creates, build HTTP payload, …) |
| executeBatch | () => Promise | Fire the request and return a Promise |
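Note that `executeBatch` receives no arguments: `buildBatch` is expected to stash whatever `executeBatch` will need. A minimal sketch of that hand-off, with no library code involved (the names `pending`, `buildBatch`, and `executeBatch` are just illustrative):

```typescript
// Sketch of the buildBatch/executeBatch contract: buildBatch captures the
// slice in shared state so the zero-argument executeBatch can reach it.
let pending: Array<() => Promise<number>> = [];

const buildBatch = (slice: Array<() => Promise<number>>) => {
  pending = slice;                          // "prepare" = remember the slice
};

const executeBatch = () =>
  Promise.all(pending.map(task => task())); // fire everything in the slice

// Usage with two toy async tasks:
buildBatch([async () => 1, async () => 2]);
executeBatch().then(results => console.log(results)); // → [ 1, 2 ]
```

This split exists so that `buildBatch` can do framework-specific preparation (queueing OData creates, assembling one HTTP payload) while `executeBatch` stays a plain "fire and await" step.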
### Events

```js
batcher.on('progress', info => { /* ProgressInfo */ })
       .on('error',    err  => { /* error */ })
       .on('end',      ()   => { /* queue empty */ });
```

`ProgressInfo`:
```ts
{
  done:      number;  // processed so far
  total:     number;  // initial length
  elapsed:   number;  // ms the last batch took
  nextSize:  number;  // batch size the lib will try next
  warmUp?:   boolean; // true during warm-up
  warmLeft?: number;  // remaining warm-up rounds
}
```

## 🌍 UI5 + OData V2 sample
```ts
import ODataModel from "sap/ui/model/odata/v2/ODataModel";
import { AdaptiveBatcher } from "adaptive-batcher";

const oModel = new ODataModel("/sap/opu/odata/sap/ZSO_SRV", { useBatch: true });
const GROUP  = "ADAPTIVE";
oModel.setDeferredGroups([GROUP]);

async function upload(items) {
  const batcher = new AdaptiveBatcher({
    maxBatchTime : 10_000,
    warmUpRounds : 2,
    maxSize      : 250,
    verbose      : true
  }).on('progress', p =>
    sap.m.MessageToast.show(`${p.done}/${p.total} uploaded…`));

  await batcher.run(
    items,
    // buildBatch → add creates to the deferred group
    slice => slice.forEach(row =>
      oModel.create('/SalesOrderItemSet', row, { groupId: GROUP })),
    // executeBatch → fire $batch
    () => new Promise<void>((res, rej) =>
      oModel.submitChanges({
        groupId : GROUP,
        success : () => res(),
        error   : err => rej(err)
      }))
  );

  sap.m.MessageToast.show("All items uploaded 🎉");
}
```

## 🔍 Verbose logging details
| Emoji | Phase | Example line |
|-------|-----------------------------|-------------------------------------------|
| 🧊 | Warm-up (warmLeft > 0) | 🧊 14/500 batch=8 t=2840 ms |
| 🔥 | Adaptive (post warm-up) | 🔥 46/500 batch=17 t=9000 ms |
Pass `verbose: true` or a custom function:

```js
const batcher = new AdaptiveBatcher({
  verbose: msg => myLogger.debug(msg) // custom logger
});
```

## 📝 Roadmap
- Rolling average to damp latency spikes
- AbortController / cancellation
- Browser perf marks for flamegraphs
## 📜 License
Enjoy faster, safer batching!
PRs & issues welcome.
