
csv-array

v0.0.23

An intelligent CSV parser for Node.js that takes a CSV file and produces an array from it.


csv-array

Simple, lightweight, intelligent CSV-parser for Node.js — now in TypeScript

What's New

  • TypeScript rewrite — full type definitions, strict mode, ES2022 output.
  • Promise support — callbacks are now optional; you can await the result.
  • Worker-thread support — parsing of CSV files larger than 3 MB is automatically off-loaded to a separate worker_threads worker so your main thread stays responsive.
  • Pagination — query large datasets selectively and securely using { start, count }.

Installation

npm install csv-array

How to Use the Package

This package exports a single function, parseCSV, for parsing CSV files. Under the hood, it shifts between main-thread streaming and worker-thread batching depending on the size of the target file.

API: parseCSV

function parseCSV(
  fileName: string,
  callBack?: ParseCSVCallback,
  considerFirstRowAsHeading?: boolean,
  pagination?: Pagination
): void | Promise<CSVRow[]>

Parameters

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| fileName | string | Required | The relative or absolute path to the CSV file you want to parse. |
| callBack | function | undefined | A callback function receiving the parsed rows array. If omitted, parseCSV returns a Promise. |
| considerFirstRowAsHeading | boolean | true | When true, rows are parsed as objects mapping column headers to row values. When false, rows are returned as plain string arrays. |
| pagination | { start: number, count: number } | undefined | (For files > 3 MB) Fetches a specific slice of the parsed rows without bloating memory. If omitted on files over 3 MB, it defaults to { start: 0, count: 100 }. A console warning is emitted if count > 10000. |
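The two row shapes controlled by considerFirstRowAsHeading can be illustrated with plain data. This is just the mapping the option performs, not the library's code:

```typescript
// considerFirstRowAsHeading: true → each row becomes an object keyed by headers
const header = ['Name', 'Age'];
const row = ['Ada', '36'];
const asObject = Object.fromEntries(header.map((h, i) => [h, row[i]]));
// asObject is { Name: 'Ada', Age: '36' }

// considerFirstRowAsHeading: false → rows stay plain string arrays
const asArray = row; // ['Ada', '36']
```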

Behavior & Features

Promises vs Callbacks

You can use standard callbacks or omit the callback parameter entirely to receive a Promise<CSVRow[]>:

// Standard async/await Promises
const data = await parseCSV("data.csv");

// Callbacks
parseCSV("data.csv", (data) => console.log(data));

Large File Execution (Worker Thread)

If the file you provide exceeds 3 MB, csv-array automatically spawns a Node.js worker_threads worker. This isolates the heavy disk I/O and string-parsing logic, so your main server thread stays responsive. Your API usage is identical whether the file is 1 MB or 2 GB.

Memory-safe Pagination

Reading a massive CSV file into an array often causes out-of-memory crashes. To prevent this, pagination can be applied: the internal stream is closed the moment your requested chunk has been read, bounding both memory use and execution time.


Change log

| Version | Notes |
|---------|-------|
| 0.0.23 | Complete TypeScript rewrite (ES2022). Optional Promise/async support. Worker-thread execution for files larger than 3 MB. Stream pagination with safe default limits. |
| 0.0.22 | Dramatic speed improvements; please avoid versions 0.0.1x |


Example of Use in TypeScript

The library is fully typed, exporting parseCSV along with the CSVRow, CSVObjectRow, and Pagination types, allowing clean integration into strictly typed pipelines.

Initialization & usage

import { parseCSV, CSVRow, Pagination } from 'csv-array';

async function processData() {
  
  // Example 1: Standard usage with headings (Returns an array of objects)
  // No callback is passed, so we can await it cleanly!
  const users = await parseCSV('users.csv');
  console.log('First user name:', users[0]['Name']);

  // Example 2: Without heading objects (Returns an array of raw strings)
  const rawRows = await parseCSV('data.csv', undefined, false);
  console.log('Header keys:', rawRows[0]);
  console.log('First entry payload:', rawRows[1]);

  // Example 3: Handling massive files using Pagination & Callbacks
  const paginationConfig: Pagination = { start: 1500, count: 25 };
  
  parseCSV(
    'massive-system-dataset.csv', 
    (data: CSVRow[]) => {
       console.log(`Memory-safe slice executed! Received ${data.length} records.`);
    }, 
    true, 
    paginationConfig
  );

}

processData();

If you find any issues feel free to contact me at [email protected]