
stream-pipes

Modular and composable data transformation streams for Node.js. stream-pipes helps you process data on-the-fly using clean, functional-style pipelines built on Node's stream API.

Features

  • Object-mode stream transforms
  • Convert objects to newline-delimited JSON
  • Parse delimited text lines into objects
  • Batch objects into fixed-size arrays
  • Apply custom object transformations
  • Perform side effects without altering the data (tap)
  • Read and write files as streams, with optional Gzip compression
  • Read and write lines from/to files and generic streams

Installation

npm install stream-pipes

Example

import { pipeline } from 'stream/promises'

import {
  createFileReader,
  createFileWriter,
  createDelimitedParseTransform,
  createMapTransform,
  createBatchStream,
  createTapTransform,
  createJsonTransform
} from 'stream-pipes'

await pipeline(
  createFileReader('./data.csv.gz'),                          // read and gunzip the source file
  createDelimitedParseTransform(),                            // parse delimited rows into objects
  createMapTransform((obj) => ({ ...obj, processed: true })), // transform each record
  createTapTransform(() => console.log('Processed!')),        // side effect, data passes through unchanged
  createBatchStream(10),                                      // group records into arrays of 10
  createJsonTransform(),                                      // serialize to newline-delimited JSON
  createFileWriter('./output.jsonl.gz')                       // write and gzip the output
)

Available Streams & Utilities

Summary

| Function | Description | Input | Output |
| -------- | ----------- | ----- | ------ |
| createFileReader(path, options) | Reads a file (gzip supported) | File content | Stream of chunks |
| createFileWriter(path, options) | Writes chunks to a file (gzip supported) | "..." (string/buffer) | Writes raw data |
| createLineReader(options) | Splits text stream by lines | Stream of text chunks | One line per chunk (string) |
| createLineWriter(options) | Writes each string with a trailing newline | "..." (string) | One line per write |
| createDelimitedParseTransform(delimiter, options) | Parses delimited text into objects | "name,age\nAlice,30" | { name: "Alice", age: "30" } |
| createMapTransform(fn, options) | Applies a custom function to each object | { name: "Alice" } | fn({ name: "Alice" }) |
| createBatchStream(size, options) | Groups objects into fixed-size arrays | {...}, {...} | [ {...}, {...}, {...} ] |
| createJsonTransform(options) | Converts each object to a JSON string with newline | { name: "Alice" } | "{"name":"Alice"}\n" |
| createTapTransform(fn, options) | Runs a side-effect function per object | { name: "Alice" } | { name: "Alice" } |

createFileReader(path, options)

Creates a readable stream from a file. If the path ends with .gz, it's automatically decompressed.

import { createFileReader } from 'stream-pipes'

const textStream = createFileReader('file.txt')
const gzipStream = createFileReader('file.txt.gz')
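
Because the returned stream is already decompressed, it can be piped anywhere a plain readable can go. A minimal sketch (the ./notes.txt.gz path is hypothetical):

import { createFileReader } from 'stream-pipes'

// Stream a gzipped file's decompressed contents to stdout
createFileReader('./notes.txt.gz').pipe(process.stdout)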

createFileWriter(path, options)

Creates a writable stream to a file. If the path ends with .gz, the stream automatically compresses the output using Gzip.

import { createFileWriter } from 'stream-pipes'

const writer = createFileWriter('output.txt')     // writes plain text lines
const gzipWriter = createFileWriter('output.txt.gz') // compresses with gzip
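
A quick sketch of writing a gzipped file from in-memory strings (the input array and output path here are made up for illustration):

import { Readable } from 'stream'
import { pipeline } from 'stream/promises'
import { createFileWriter } from 'stream-pipes'

// Write two lines of text to a gzip-compressed file
await pipeline(
  Readable.from(['hello\n', 'world\n']),
  createFileWriter('./greeting.txt.gz')
)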

createLineReader(options)

Returns a transform stream that splits incoming text by newlines.

import { createLineReader } from 'stream-pipes'

source.pipe(createLineReader()).on('data', console.log)
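
Combined with createFileReader, this makes it easy to iterate a file line by line. A sketch, assuming a hypothetical ./access.log.gz file:

import { createFileReader, createLineReader } from 'stream-pipes'

// Iterate a gzipped log file one line at a time
const lines = createFileReader('./access.log.gz').pipe(createLineReader())

for await (const line of lines) {
  console.log(line)
}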

createLineWriter(options)

Returns a transform stream that appends a trailing newline to each incoming string and passes it downstream.

import { createLineWriter } from 'stream-pipes'

stream.pipe(createLineWriter()).pipe(createFileWriter('lines.txt'))

createDelimitedParseTransform(delimiter, options)

Transforms delimited text lines (e.g., CSV) into JavaScript objects using the first line as headers.

import { createDelimitedParseTransform } from 'stream-pipes'

const csvParser = createDelimitedParseTransform(',')
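
The delimiter argument also covers other separators, such as tab-separated files. A sketch (the ./data.tsv path is hypothetical) that converts a TSV file to JSON Lines:

import { pipeline } from 'stream/promises'
import {
  createFileReader,
  createDelimitedParseTransform,
  createJsonTransform,
  createFileWriter
} from 'stream-pipes'

// Convert tab-separated values into newline-delimited JSON
await pipeline(
  createFileReader('./data.tsv'),
  createDelimitedParseTransform('\t'),
  createJsonTransform(),
  createFileWriter('./data.jsonl')
)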

createMapTransform(fn, options)

Applies a synchronous or asynchronous function fn to each incoming object and pushes the result downstream.

import { createMapTransform } from 'stream-pipes'

const transform = createMapTransform(obj => ({
  ...obj,
  processed: true
}))
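
Since fn may be asynchronous, the transform can also perform per-record lookups. In this sketch, fetchUser is a hypothetical async helper:

import { createMapTransform } from 'stream-pipes'

// Hypothetical async helper that enriches a record by id
async function fetchUser (id) {
  return { id, name: `user-${id}` }
}

// Await an async lookup for every object passing through
const enrich = createMapTransform(async (obj) => ({
  ...obj,
  user: await fetchUser(obj.userId)
}))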

createBatchStream(size, options)

Groups incoming objects into arrays of a fixed size before pushing them downstream.

import { createBatchStream } from 'stream-pipes'

const batcher = createBatchStream(10)  // emit arrays of 10 objects
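
Batching is handy for grouping work before a bulk operation. A sketch, with saveBatch standing in for whatever bulk call you use:

import { createBatchStream, createMapTransform } from 'stream-pipes'

// Hypothetical bulk persistence helper
async function saveBatch (records) {
  console.log(`saving ${records.length} records`)
}

// Collect 100 objects at a time and hand each batch to saveBatch
const batcher = createBatchStream(100)
const saver = createMapTransform(async (batch) => {
  await saveBatch(batch)
  return batch
})

// e.g. source.pipe(batcher).pipe(saver)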

createJsonTransform(options)

Converts each incoming object to a JSON string followed by a newline character (\n), ideal for creating JSON Lines (jsonl) streams.

import { createJsonTransform } from 'stream-pipes'
const jsonStringify = createJsonTransform()
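
Paired with createFileWriter, this is a quick way to persist in-memory objects as a (optionally gzipped) JSON Lines file. A minimal sketch with made-up data:

import { Readable } from 'stream'
import { pipeline } from 'stream/promises'
import { createJsonTransform, createFileWriter } from 'stream-pipes'

// Serialize objects to newline-delimited JSON and gzip the result
await pipeline(
  Readable.from([{ name: 'Alice' }, { name: 'Bob' }]),
  createJsonTransform(),
  createFileWriter('./people.jsonl.gz')
)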

createTapTransform(fn, options)

Runs a side-effect function fn on each object passing through, without modifying the data.

import { createTapTransform } from 'stream-pipes'
const logger = createTapTransform(obj => console.log('Passing object:', obj))
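
Because the data flows through untouched, a tap also works well for progress reporting. A sketch that counts records as they pass:

import { createTapTransform } from 'stream-pipes'

// Count objects without modifying them
let count = 0
const progress = createTapTransform(() => {
  count += 1
  if (count % 1000 === 0) console.log(`${count} records processed`)
})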

License

MIT