batchkit
Tiny batching library with pluggable scheduling strategies and automatic deduplication.
Installation
npm install batchkit
# or
bun add batchkit
Quick Start
import { batch } from 'batchkit'
const users = batch(
(ids) => db.users.findMany({ where: { id: { in: ids } } }),
'id'
)
// These calls are batched into one database query
const [alice, bob] = await Promise.all([
users.get(1),
users.get(2),
])
API
batch(fn, match, options?)
Creates a batcher.
const users = batch(
// The batch function - receives keys and an AbortSignal
async (ids: number[], signal: AbortSignal) => {
return api.getUsers(ids, { signal })
},
// How to match results - just the field name
'id',
// Optional configuration
{
wait: 10, // ms to wait before dispatch (default: 0 = microtask)
max: 100, // max batch size
name: 'users', // for debugging
}
)
batcher.get(key) / batcher.get(keys)
Get one or many items:
// Single item
const user = await users.get(1)
// Multiple items (batched together)
const [a, b] = await Promise.all([users.get(1), users.get(2)])
// Array syntax
const team = await users.get([1, 2, 3, 4, 5])
batcher.get(key, { signal })
Cancel a request:
const controller = new AbortController()
const user = await users.get(1, { signal: controller.signal })
// Later...
controller.abort() // Rejects with AbortError
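Because get accepts any AbortSignal, a deadline can be attached with the standard AbortSignal.timeout helper (a sketch, assuming a runtime that provides it, e.g. Node 17.3+ or a modern browser):
// Reject the request if it hasn't resolved within 500ms
const user = await users.get(1, { signal: AbortSignal.timeout(500) })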
batcher.flush()
Execute the pending batch immediately:
users.get(1)
users.get(2)
await users.flush() // Don't wait for scheduler
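flush is handy in tests, where waiting on the scheduler would make assertions timing-dependent. A minimal sketch, assuming a Vitest-style runner and a placeholder loadUsers batch function:
import { batch } from 'batchkit'
import { expect, test, vi } from 'vitest'
test('concurrent gets share one batch', async () => {
  const loadUsers = vi.fn(async (ids: number[]) => ids.map((id) => ({ id, name: `user-${id}` })))
  const users = batch(loadUsers, 'id')
  const pending = Promise.all([users.get(1), users.get(2)])
  await users.flush() // dispatch now instead of waiting for the scheduler
  expect(loadUsers).toHaveBeenCalledTimes(1) // both gets joined the same batch
  await pending
})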
batcher.abort()
Abort the in-flight batch:
users.abort() // All pending requests reject with AbortError
Matching Results
By Field Name (most common)
batch(fn, 'id')
// Matches results where result.id === requestedKey
For Record/Object Responses
import { batch, indexed } from 'batchkit'
const users = batch(
async (ids) => {
// Returns { "1": {...}, "2": {...} }
return fetchUsersAsRecord(ids)
},
indexed
)
Custom Matching
batch(
fn,
(results, key) => results.find(r => r.externalId === key)
)
Scheduling
Default: Microtask
Batches all calls within the same event loop tick:
const users = batch(fn, 'id')
// these synchronous invocations are batched into one request
users.get(1)
users.get(2)
users.get(3)
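Conversely, calls made in a later tick start a new batch. A small illustration, reusing the users batcher from above:
users.get(1)
users.get(2) // same tick, joined into one batch
setTimeout(() => {
  users.get(3) // later tick, starts a new batch
}, 0)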
Delayed
Wait before dispatching:
batch(fn, 'id', { wait: 10 }) // 10ms window
Animation Frame
Sync with rendering:
import { batch, onAnimationFrame } from 'batchkit'
batch(fn, 'id', { schedule: onAnimationFrame })
Idle
Background/low-priority work:
import { batch, onIdle } from 'batchkit'
batch(fn, 'id', { schedule: onIdle({ timeout: 100 }) })
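As an illustration, an idle-scheduled batcher suits lookups that can wait, such as resolving link previews; fetchLinkPreviews is a hypothetical fetcher and 'url' is assumed to be the field to match on:
import { batch, onIdle } from 'batchkit'
const previews = batch(
  (urls: string[]) => fetchLinkPreviews(urls), // hypothetical fetcher returning { url, title }[]
  'url',
  { schedule: onIdle({ timeout: 500 }) }
)
// Dispatched when the browser has idle time
const preview = await previews.get('https://example.com/docs')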
Deduplication
Duplicate keys in the same batch are automatically deduplicated:
// Only ONE request for id=1
await Promise.all([
users.get(1),
users.get(1),
users.get(1),
])
For complex keys, provide a key function:
batch(fn, match, {
key: (query) => query.id // Dedupe by query.id
})
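A minimal sketch of what that looks like in use, assuming get is called with the same query objects the batch function receives and that the custom matcher's second argument is that original query (findProfiles is a placeholder fetcher):
const profiles = batch(
  (queries: { id: number; locale: string }[]) => findProfiles(queries), // placeholder fetcher
  (results, query) => results.find((r) => r.id === query.id),
  { key: (query) => query.id } // dedupe by query.id
)
const [a, b] = await Promise.all([
  profiles.get({ id: 1, locale: 'en' }),
  profiles.get({ id: 1, locale: 'en' }), // same key, so only one entry reaches findProfiles
])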
Tracing
Debug batch behavior:
batch(fn, 'id', {
name: 'users',
trace: (event) => {
console.log(event.type, event)
// 'get', 'schedule', 'dispatch', 'resolve', 'error', 'abort'
}
})
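Trace events can also drive lightweight instrumentation. A sketch that relies only on the documented event types, counting how many dispatches reach the backend (fetchUsers is a placeholder):
let dispatches = 0
const users = batch(fetchUsers, 'id', {
  name: 'users',
  trace: (event) => {
    if (event.type === 'dispatch') dispatches++
  },
})
await Promise.all([users.get(1), users.get(2), users.get(3)])
console.log(dispatches) // 1 (all three gets shared a single dispatch)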
Examples
React + TanStack Query
import { batch } from 'batchkit'
import { useQuery } from '@tanstack/react-query'
const users = batch(
(ids, signal) => fetch(`/api/users?ids=${ids.join(',')}`, { signal }).then(r => r.json()),
'id'
)
function UserAvatar({ userId }: { userId: string }) {
const { data } = useQuery({
queryKey: ['user', userId],
queryFn: ({ signal }) => users.get(userId, { signal })
})
return <img src={data?.avatar} />
}
// Rendering 100 UserAvatars -> 1 HTTP request
API with Rate Limits
const products = batch(
(ids) => shopify.products.list({ ids }),
'id',
{ max: 50 } // Shopify's limit
)
// 200 product requests = 4 API calls (50 each)
TypeScript
Types are inferred from the batch function and the call site:
type User = { id: number; name: string }
const users = batch(
async (ids: number[]): Promise<User[]> => fetchUsers(ids),
'id'
)
const user = await users.get(1) // user: User
const many = await users.get([1, 2]) // many: User[]
License
MIT
