

jsonarrayfs


Specialized Node.js library for memory-efficient operations on JSON arrays. Stream individual elements from large JSON arrays (files, network responses etc.) and append elements to array files without loading the entire array into memory. Perfect for processing large-scale JSON array datasets without memory limitations.

Why Use This?

  • 🎯 Specialized: Purpose-built for JSON arrays
  • 💾 Memory Efficient: Process arrays of any size without loading them entirely
  • ⚡ High Performance: Optimized streaming and batch operations
  • ✍️ Direct Updates: Append elements without rewriting the entire file
  • 🔄 Format Agnostic: Works with any valid JSON array structure

Installation

npm install jsonarrayfs

Examples

1. Stream from File

Process a large JSON array file (e.g., application logs) without loading it into memory:

import { JsonArrayStream } from "jsonarrayfs";
import { createReadStream } from "node:fs";

// Analyze logs: Count errors and slow responses
const fileStream = createReadStream("app.log.json");
const arrayStream = new JsonArrayStream("utf8");

let errorCount = 0;
let slowResponses = 0;

for await (const log of fileStream.pipe(arrayStream)) {
  if (log !== JsonArrayStream.NULL) {
    if (log.level === "error") errorCount++;
    if (log.responseTime > 1000) slowResponses++;
  }
}

console.log(
  `Analysis complete: Found ${errorCount} errors, ${slowResponses} slow responses`,
);

2. Stream from Network

Process a JSON array from an API response:

import { JsonArrayStream } from "jsonarrayfs";
import { get } from "node:https";

get("https://api.example.com/json-array-data", (res) => {
  const arrayStream = new JsonArrayStream("utf8");

  res.pipe(arrayStream).on("data", (item) => {
    console.log("Got item:", item === JsonArrayStream.NULL ? null : item);
  });
});

3. Append to File

Append new elements to an existing JSON array file:

import { appendToJsonArrayFile } from "jsonarrayfs";

// Append new log entries
const newLogs = [
  {
    timestamp: Date.now(),
    level: "info",
    message: "User login successful",
    responseTime: 245,
  },
  {
    timestamp: Date.now(),
    level: "info",
    message: "User login successful",
    responseTime: 1245,
  },
  null,
  {
    timestamp: Date.now(),
    level: "error",
    message: "Database connection timeout",
    responseTime: 1532,
  },
];

await appendToJsonArrayFile("app.log.json", "utf8", ...newLogs);

API Reference

JsonArrayStream

A transform stream that parses JSON array elements one by one for efficient processing. Because pushing a literal null through a Node.js object-mode stream would signal end-of-stream, elements that are JSON null are emitted as a special sentinel value (JsonArrayStream.NULL), letting consumers distinguish JSON null from stream EOF.

Constructor

new JsonArrayStream(encoding?: string)

Parameters

  • encoding (string, optional): Content encoding (default: 'utf8')

Properties

  • JsonArrayStream.NULL: Special sentinel value to distinguish between JSON null and stream EOF

Events

  • data: Emitted for each array element
  • error: Emitted when parsing fails or input is invalid
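For instance, a minimal sketch that wires up both events (the file name and logging here are illustrative, not part of the library):

import { JsonArrayStream } from "jsonarrayfs";
import { createReadStream } from "node:fs";

// "data.json" is an illustrative path; any valid JSON array file works
const fileStream = createReadStream("data.json");
const arrayStream = new JsonArrayStream("utf8");

fileStream
  .pipe(arrayStream)
  .on("data", (item) => {
    // One parsed array element per event; JSON null arrives as the sentinel
    console.log(item === JsonArrayStream.NULL ? null : item);
  })
  .on("error", (err) => {
    // Emitted by JsonArrayStream when the input is not a valid JSON array
    console.error("Parse failed:", err.message);
  });

// Note: pipe() does not forward read errors, so listen on the file stream too
fileStream.on("error", (err) => console.error("Read failed:", err.message));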

appendToJsonArrayFile

Appends elements to a JSON array file efficiently without loading the entire file into memory.

Signature

async function appendToJsonArrayFile(
  filePath: string,
  encoding?: string,
  ...elements: any[]
): Promise<void>;

Parameters

  • filePath (string): Path to the JSON array file
  • encoding (string, optional): File encoding (default: 'utf8')
  • ...elements (any[]): Elements to append to the array

Returns

Promise that resolves when the append operation is complete.

Error Handling

The library can throw the following errors (a brief handling sketch follows the lists):

JsonArrayStreamError

  • Invalid JSON array format
  • Malformed array elements
  • Unexpected end of input

JsonArrayAppendError

  • Invalid JSON array format
  • File system errors
  • Permission issues
  • Invalid input elements
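As a rough sketch, an append call can be wrapped in try/catch. Whether these error classes are exported for instanceof checks isn't stated here, so the example below only inspects the error's name and message (an assumption worth verifying against the package's type definitions); the file path and payload are illustrative:

import { appendToJsonArrayFile } from "jsonarrayfs";

try {
  // "app.log.json" is an illustrative path to an existing JSON array file
  await appendToJsonArrayFile("app.log.json", "utf8", {
    timestamp: Date.now(),
    level: "info",
    message: "Cache warmed",
  });
} catch (err) {
  // Append failures are documented above as JsonArrayAppendError
  // (invalid array format, file system errors, permissions, bad elements)
  console.error(`${err.name}: ${err.message}`);
}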

Requirements

  • Node.js >= 16.0.0
  • Input file must be a valid JSON array

License

MIT License - see the LICENSE file for details