
parallel-batch v0.1.1

Batches big arrays, does stuff in parallel (per-batch), and lets you merge the results

parallel-batch

This tiny library builds on top of async to add easy batch-running.

Please note the license when using this library. It's pretty free, and you're welcome to fork... but I always appreciate a pull request ;-)

Installation

This is a node module. You install it with npm:

npm install parallel-batch

Usage

Parallel-batching makes it easy to split an array into batches (smaller arrays of at most a specified size) and run a function on each batch. The final callback lets you merge the results as you wish.

The library exposes a single function:

var parallelBatch = require("parallel-batch");

The function takes four arguments (see the sketch after this list):

parallelBatch(array, batchSize, iterator, callback);
  • array (Array) is the array to batch and iterate over.
  • batchSize (Number) is the maximum size of each batch.
  • iterator (Function) is the function that is run for each batch. It takes two arguments: a batch and a callback. The batch is an array of at most batchSize elements taken from array. The callback should be the last thing called from the iterator, and follows the usual Node.js convention: the first argument is an error if one occurred (null otherwise), and the second argument is the result for that batch.
  • callback (Function) is the final callback, called once every batch iterator has finished. It takes two arguments: an error (null if everything went as it should) and an array of results, with one element per batch.
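
Putting that together, here is a minimal sketch of the call shape; the numbers array, the batch size of 3, and the summing iterator are purely illustrative:

var parallelBatch = require("parallel-batch");

var numbers = [1, 2, 3, 4, 5, 6, 7];

// Split into batches of at most 3 elements and sum each batch.
parallelBatch(numbers, 3, function(batch, callback) {
	var sum = batch.reduce(function(total, n) { return total + n; }, 0);
	callback(null, sum);
}, function(error, results) {
	if (error) {
		return console.error(error);
	}
	// One result per batch: [6, 15, 7] for the array above.
	console.log(results);
});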

When should I use parallel-batch?

In the example below, you'll see a good use case for this library: the API we are using has an artificial limit per request.

You shouldn't use this library if you simply want to throttle your code so it doesn't make too many requests at a time. Async has better built-in alternatives for that, namely parallelLimit, eachLimit, and mapLimit. These functions make sure that no more than the given limit of tasks are ongoing at the same time, which is a different use case from this library.
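
For contrast, this is roughly what throttling looks like with async's eachLimit; userIds and deleteUser are hypothetical, and the limit of 5 is arbitrary:

var async = require("async");

// userIds is assumed to be an array of ids; deleteUser a one-user delete helper.
// Run deleteUser for every id, but never more than 5 at a time.
async.eachLimit(userIds, 5, function(id, callback) {
	deleteUser(id, callback);
}, function(error) {
	if (error) {
		return console.error("Failed to delete users", error);
	}
	console.log("Deleted all users");
});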

Example: Batched requests

Given a really big list of users, we want to delete them all. We are using an API that has a mass-delete function, but limits the number of users you can delete to 100 per request.

var parallelBatch = require("parallel-batch");
var request = require("request"); // HTTP client used below
var _ = require("lodash");        // used for _.map and _.flatten

// uri and usersToDelete are assumed to be defined elsewhere.
parallelBatch(usersToDelete, 100, function(batch, callback) {
	// Only send the ids from this batch, not the whole usersToDelete array.
	var ids = _.map(batch, "id");
	request.post({
		uri: uri + "/mass-delete",
		body: JSON.stringify({
			users: ids
		})
	}, function(error, httpResponse, body) {
		if (error) {
			return callback(error);
		}

		if (httpResponse.statusCode === 200) {
			// Report the names of the users deleted in this batch.
			var names = _.map(batch, "name");
			return callback(null, names);
		}

		callback(new Error("Got non-200 status code (" + httpResponse.statusCode + ") on delete: " + body));
	});
}, function(error, results) {
	if (error) {
		return console.error("Failed to delete users", error);
	}

	// results is an array of name-arrays, one per batch; flatten into a single list.
	var allNames = _.flatten(results);
	console.log("Deleted the following users:", allNames);
});

This example shows a simple way of merging the per-batch results: flattening the arrays of names into a single list.
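
The merge step is entirely up to you. As a sketch of another strategy (reusing the same assumed usersToDelete and lodash setup), each batch could report only a count, and the final callback could sum the counts:

parallelBatch(usersToDelete, 100, function(batch, callback) {
	// ... perform the mass-delete request as above, then report a count:
	callback(null, batch.length);
}, function(error, results) {
	if (error) {
		return console.error("Failed to delete users", error);
	}
	// results is an array of per-batch counts; sum them with lodash.
	console.log("Deleted", _.sum(results), "users");
});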