# Task Herder
Task Herder helps you herd your asynchronous ~~cats~~ ~~kats~~ tasks. It lets you queue up tasks and run them with configurable throttling and concurrency limits.
## Concepts
The queue operates according to the Token Bucket algorithm. The general idea is that you have a "bucket" that can hold a certain number of "tokens". Each task "costs" a certain number of tokens (default: 1) to run. Tokens are added to the bucket at a certain rate (the "refill rate") up to the maximum capacity of the bucket.
When a task is added to the queue, if there are enough tokens in the bucket, the task will "spend" the required number of tokens and be eligible to run. If there are not enough tokens, the task will wait in the queue until enough tokens are available. Tasks are run in the order they were added to the queue (FIFO), so keep in mind that an expensive task may delay cheaper tasks behind it until it can run.
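To make the bookkeeping concrete, here is a minimal sketch of a token bucket in plain JavaScript. It is illustrative only, not task-herder's internals: the class and field names are this example's own, with `capacity` playing the role of the `burstLimit` option shown under Usage and `refillPerSecond` the `sustainRate`.

```js
// Illustrative token bucket, not task-herder's actual implementation.
// Tokens accrue continuously at `refillPerSecond`, capped at `capacity`.
class TokenBucket {
  constructor({ capacity, refillPerSecond }) {
    this.capacity = capacity
    this.refillPerSecond = refillPerSecond
    this.tokens = capacity // start with a full bucket
    this.lastRefill = Date.now()
  }

  refill() {
    const now = Date.now()
    const elapsedSeconds = (now - this.lastRefill) / 1000
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond)
    this.lastRefill = now
  }

  // Returns true and deducts the cost if enough tokens are available;
  // otherwise returns false and the caller should wait and retry.
  trySpend(cost = 1) {
    this.refill()
    if (this.tokens >= cost) {
      this.tokens -= cost
      return true
    }
    return false
  }
}
```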
Eligible tasks are run immediately, up to the maximum concurrency limit. This is useful when you need to cap concurrent resource usage for tasks that take an unpredictable amount of time; for example, you may want to hold no more than 5 concurrent connections to a particular service. If the concurrency limit is reached, eligible tasks wait their turn. This stage of processing is also FIFO.
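The concurrency stage can be pictured as a simple FIFO gate. Again, a sketch of the idea rather than the library's code:

```js
// Illustrative concurrency gate: at most `limit` tasks hold a slot at once.
// Waiters are resumed in FIFO order; not task-herder's actual internals.
class ConcurrencyGate {
  constructor(limit) {
    this.limit = limit
    this.active = 0
    this.waiting = [] // FIFO queue of resolve callbacks
  }

  async acquire() {
    if (this.active < this.limit) {
      this.active++
      return
    }
    // Wait until release() hands this waiter the freed slot.
    await new Promise(resolve => this.waiting.push(resolve))
  }

  release() {
    const next = this.waiting.shift()
    if (next) {
      next() // pass the freed slot directly to the next waiter (FIFO)
    } else {
      this.active--
    }
  }
}
```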
Once a task makes it through both the token and concurrency checks, it is executed. The Promise returned by the queue then resolves or rejects based on the outcome of the task.
## Usage
```js
import TaskQueue from '@scratch/task-herder'

const queue = new TaskQueue({
  // The maximum number of tokens in the bucket controls the burst limit
  burstLimit: 10,
  // Rate at which tokens are added to the bucket (tokens per second) controls the sustained rate
  sustainRate: 5,
  // Initial number of tokens in the bucket
  // Default: burstLimit (i.e., start with a full bucket)
  startingTokens: 5,
  // Reject a task if it would cause the total queue cost to exceed this limit
  // Default: no limit
  queueCostLimit: 50,
  // Number of tasks that can be processed concurrently
  // Default: 1 (run tasks serially)
  concurrency: 3,
})

// Using `await` syntax
try {
  const response = await queue.do(
    // Your async task here
    () => fetch('https://example.com/data'),
    {
      // Optional cost of this task (default: 1)
      cost: 2,
    },
  )
  // Handle successful response
} catch (error) {
  // Handle error
}

// Using Promise syntax
queue
  .do(
    // Your async task here
    () => fetch('https://example.com/data'),
    {
      // Optional cost of this task (default: 1)
      cost: 2,
    },
  )
  .then(response => {
    // Handle successful response
  })
  .catch(error => {
    // Handle error
  })
```
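Because `queue.do()` returns an ordinary Promise, you can also fan out a batch of tasks and await them together; the queue paces them according to the token and concurrency limits. A small sketch (the URLs are placeholders):

```js
// Enqueue several fetches at once; the queue throttles them and
// limits how many run concurrently.
const urls = ['https://example.com/a', 'https://example.com/b', 'https://example.com/c']
const results = await Promise.all(urls.map(url => queue.do(() => fetch(url))))
```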
You can also supply an AbortSignal to cancel a task before it starts:

```js
const controller = new AbortController()

queue
  .do(
    () =>
      fetch('https://example.com/data', {
        // Optionally, provide the same signal (or a different one) so it can interrupt the fetch request
        signal: controller.signal,
      }),
    {
      // Provide the signal to the task queue so it can cancel the task before it starts
      signal: controller.signal,
    },
  )
  .catch(error => {
    if (error.name === 'AbortError') {
      // Handle task cancellation
    } else {
      // Handle other errors
    }
  })
```
Or, you can remove one task or all tasks from the queue before they start:

```js
const taskPromise = queue.do(() => fetch('https://example.com/data'))

queue.cancel(taskPromise, optionalReason) // Remove a specific task from the queue
queue.cancelAll(optionalReason) // Remove all tasks from the queue
```

## Donate
We provide Scratch free of charge, and want to keep it that way! Please consider making a donation to support our continued engineering, design, community, and resource development efforts. Donations of any size are appreciated. Thank you!
