json-stream-lite (v1.0.6)

A lightweight, memory-efficient, zero-dependency streaming JSON parser and stringifier for JavaScript and TypeScript. Process large JSON files without loading them entirely into memory. Works in both Node.js and browser environments.

Features

  • 🚀 Stream parsing: Parse JSON incrementally as data arrives
  • 💾 Memory efficient: Process large JSON files without loading them entirely into memory
  • 🔄 Bidirectional: Both parse and stringify JSON in streaming fashion
  • 🎯 Type-safe: Full TypeScript support with comprehensive type definitions
  • 🔌 Flexible input: Support for sync/async iterables, ReadableStreams, strings, and byte arrays
  • 🎨 Key-value extraction: Flatten nested JSON structures into key-value pairs
  • Zero dependencies: Minimal footprint with no external runtime dependencies

Installation

npm install json-stream-lite
pnpm add json-stream-lite
yarn add json-stream-lite

Quick Start

Parsing JSON

Parse a complete JSON object incrementally

import { JsonObject } from 'json-stream-lite'

const json = '{"name": "Alice", "age": 30, "active": true}'
const parser = new JsonObject()

// Feed bytes into the parser
parser.feed(...new TextEncoder().encode(json))

// Read the complete object
const result = parser.read()
console.log(result) // { name: 'Alice', age: 30, active: true }

Stream through object members

import { JsonObject } from 'json-stream-lite'

const json = '{"name": "Alice", "age": 30, "city": "NYC"}'
const parser = new JsonObject()
parser.feed(...new TextEncoder().encode(json))

// Iterate through key-value pairs without loading the entire object
for (const [keyEntity, valueEntity] of parser.members()) {
    const key = keyEntity.read()
    const value = valueEntity.read().read()
    console.log(`${key}: ${value}`)
}
// Output:
// name: Alice
// age: 30
// city: NYC

Parse JSON arrays incrementally

import { JsonArray } from 'json-stream-lite'

const json = '[1, 2, 3, 4, 5]'
const parser = new JsonArray()
parser.feed(...new TextEncoder().encode(json))

// Process each item individually
for (const item of parser.items()) {
    console.log(item.read())
}
// Output: 1, 2, 3, 4, 5

Async Streaming

Process JSON from async sources like HTTP responses or file streams:

import { JsonObject } from 'json-stream-lite'

async function processStream(stream: ReadableStream<Uint8Array>) {
    const parser = new JsonObject(stream)

    // Asynchronously iterate through members
    for await (const [keyEntity, valueEntity] of parser.membersAsync()) {
        const key = keyEntity.read()
        const value = await valueEntity.readValueAsync()
        console.log(`${key}: ${value}`)
    }
}

// Example with fetch
const response = await fetch('https://api.example.com/data.json')
await processStream(response.body!)

Key-Value Extraction

Flatten nested JSON structures into dot-notation key-value pairs:

import { jsonKeyValueParser } from 'json-stream-lite'

const json = '{"user": {"name": "Alice", "scores": [95, 87, 92]}}'

for (const [key, value] of jsonKeyValueParser(json)) {
    console.log(`${key} = ${value}`)
}
// Output:
// user.name = Alice
// user.scores[0] = 95
// user.scores[1] = 87
// user.scores[2] = 92

Async key-value extraction

import { jsonKeyValueParserAsync } from 'json-stream-lite'

async function extractKeyValues(stream: ReadableStream) {
    for await (const [key, value] of jsonKeyValueParserAsync(stream)) {
        console.log(`${key} = ${value}`)
    }
}

Stringifying JSON

Convert JavaScript objects to JSON strings in a streaming fashion:

import { jsonStreamStringify } from 'json-stream-lite'

const data = {
    name: 'Alice',
    scores: [95, 87, 92],
    metadata: { verified: true },
}

// Generate JSON in chunks
for (const chunk of jsonStreamStringify(data, null, 2)) {
    process.stdout.write(chunk)
}

Stringify to bytes

import { jsonStreamStringifyBytes } from 'json-stream-lite'

const data = { name: 'Alice', age: 30 }

for (const bytes of jsonStreamStringifyBytes(data)) {
    // bytes is a Uint8Array
    await writeToFile(bytes)
}

Control chunk size

import { jsonStreamStringify } from 'json-stream-lite'

const data = { longString: 'x'.repeat(10000) }

// Control how strings are chunked (default: 1024 bytes)
for (const chunk of jsonStreamStringify(data, null, 0, {
    stringChunkSize: 512,
})) {
    console.log(chunk.length) // Chunks will be ~512 bytes
}

API Reference

See docs.

Advanced Usage

Processing Large Files

import { createReadStream } from 'fs'
import { JsonObject } from 'json-stream-lite'

async function processLargeFile(filePath: string) {
    const stream = createReadStream(filePath)
    const parser = new JsonObject(stream)

    for await (const [keyEntity, valueEntity] of parser) {
        const key = keyEntity.read()
        const value = await valueEntity.readValueAsync()

        // Process each key-value pair without loading entire file
        await processRecord(key, value)
    }
}

Handling Nested Structures

import { JsonObject, JsonArray } from 'json-stream-lite'

const json = '{"users": [{"name": "Alice"}, {"name": "Bob"}]}'
const parser = new JsonObject()
parser.feed(...new TextEncoder().encode(json))

for (const [keyEntity, valueEntity] of parser) {
    const key = keyEntity.read()
    const value = valueEntity.read()

    if (key === 'users' && value instanceof JsonArray) {
        for (const userEntity of value.items()) {
            const user = userEntity.read()
            console.log(user) // Each user object
        }
    }
}

Incremental Feeding

import { JsonObject } from 'json-stream-lite'

const parser = new JsonObject()

// Feed data incrementally as it arrives
parser.feed(123) // {
parser.feed(34, 110, 97, 109, 101, 34) // "name"
parser.feed(58, 34, 65, 108, 105, 99, 101, 34) // :"Alice"
parser.feed(125) // }

const result = parser.read()
console.log(result) // { name: 'Alice' }

Use Cases

1. Processing API Responses

import { JsonObject } from 'json-stream-lite'

async function processApiResponse(url: string) {
    const response = await fetch(url)
    const parser = new JsonObject(response.body!)

    for await (const [keyEntity, valueEntity] of parser.membersAsync()) {
        const key = keyEntity.read()
        const value = await valueEntity.readValueAsync()
        console.log(`Processing ${key}:`, value)
    }
}

2. Log File Analysis

import { jsonKeyValueParserAsync } from 'json-stream-lite'

async function analyzeLogFile(stream: ReadableStream) {
    const metrics: Record<string, number> = {}

    for await (const [key, value] of jsonKeyValueParserAsync(stream)) {
        if (typeof value === 'number') {
            metrics[key] = (metrics[key] || 0) + value
        }
    }

    return metrics
}

3. Generating Large JSON Files

import { jsonStreamStringifyBytes } from 'json-stream-lite'
import { createWriteStream } from 'fs'
import { once } from 'events'

async function generateLargeFile(data: unknown, outputPath: string) {
    const writeStream = createWriteStream(outputPath)

    for (const chunk of jsonStreamStringifyBytes(data, null, 2)) {
        // Respect backpressure: wait for 'drain' when the buffer is full
        if (!writeStream.write(chunk)) {
            await once(writeStream, 'drain')
        }
    }

    writeStream.end()
    await once(writeStream, 'finish')
}

4. Database Export

import { jsonStreamStringify } from 'json-stream-lite'

// Assumes a `db` query client and an Express-style `app` are in scope
async function* exportDatabase(query: string) {
    const records = await db.query(query)

    for (const chunk of jsonStreamStringify(records, null, 2)) {
        yield chunk
    }
}

// Stream to client
app.get('/export', async (req, res) => {
    res.setHeader('Content-Type', 'application/json')
    for await (const chunk of exportDatabase('SELECT * FROM users')) {
        res.write(chunk)
    }
    res.end()
})

Performance Tips

  1. Use async methods for I/O-bound operations
  2. Set appropriate buffer limits with maxBufferSize
  3. Stream member-by-member instead of calling read() on large objects
  4. Control chunk size in stringify operations for optimal throughput
  5. Avoid reading entire objects when you only need specific fields

Browser Support

Works in all modern browsers and Node.js environments that support:

  • ES2015+ features
  • Generators and async generators
  • TextEncoder/TextDecoder (for string conversion)
  • ReadableStream (for stream processing)
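If you target older runtimes, the requirements above can be checked once at startup. This is a minimal sketch (not part of the library); all four features ship in modern browsers and in Node.js 18+.

```typescript
// Runtime feature check before using json-stream-lite in an
// unfamiliar environment
const supported =
    typeof TextEncoder !== 'undefined' &&
    typeof TextDecoder !== 'undefined' &&
    typeof ReadableStream !== 'undefined' &&
    typeof Symbol.asyncIterator !== 'undefined'

console.log(supported)
```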

TypeScript Support

Full TypeScript definitions included. All types are exported:

import type {
    JsonPrimitive,
    JsonKeyValuePair,
    JsonValueType,
    JsonPrimitiveType,
    JsonStreamStringifyOptions,
} from 'json-stream-lite'

License

MIT

Contributing

Contributions welcome! Please ensure:

  • All tests pass: pnpm test
  • Code compiles: pnpm compile
  • Coverage maintained: pnpm test -- --coverage

For more details, see CONTRIBUTING.md.