

fast-csv-parser


🚀 A high-performance, drop-in replacement for csv-parser powered by Rust

Streaming CSV parser that aims for maximum speed while maintaining 100% compatibility with the original csv-parser API. Built with Rust and N-API for blazing-fast performance on large datasets.

fast-csv-parser is a drop-in replacement - simply change your import and enjoy better performance with zero code changes required.

  • ⚡️ Up to 1.3x faster on large datasets (7K+ rows)
  • 🔧 100% API compatible with the original csv-parser
  • 🦀 Rust-powered native performance
  • 🌐 Cross-platform support (Windows, macOS, Linux, ARM)
  • 📦 Zero dependencies in production
  • 🔤 UTF-16 support - handles UTF-16 LE/BE with BOM detection

Performance Comparison

Performance characteristics vary by file size. Here are real benchmark results:

Large Files (Where It Matters)

| Dataset | Rows | Size | csv-parser | fast-csv-parser | Speedup |
|---------|------|------|------------|-----------------|---------|
| large-dataset.csv | 7,268 | 1.1MB | 59ms | 47ms | 🚀 1.26x faster |
| option-maxRowBytes.csv | 4,577 | 700KB | 36ms | 29ms | 🚀 1.24x faster |

Small Files (Startup Overhead)

Small files show the Node.js ↔ Rust boundary overhead:

  • Files <100 rows: ~0.6x performance (0.1ms → 0.2ms)
  • The overhead is constant (~0.1ms) regardless of file size

Performance Profile

          Performance Ratio (higher = better)
     2x  ┤
         │     ╭─ Peak performance zone
   1.5x  ┤   ╱
         │ ╱
     1x  ┼─────────────────────── Break-even point (~1KB)
         │
   0.5x  ┤ Overhead zone
         └─────────────────────────────────────
         0KB    1KB     10KB    100KB   1MB+
                        File Size

💡 Recommendation: Use fast-csv-parser for files >10KB or high-throughput scenarios.

📦 Installation

Using npm:

npm install fast-csv-parser

Using yarn:

yarn add fast-csv-parser

Using pnpm:

pnpm add fast-csv-parser

🚀 Usage Examples

Basic Usage

Simply replace your csv-parser import:

// Before
const csv = require('csv-parser')

// After - that's it!
const csv = require('fast-csv-parser')

// Your existing code works unchanged
const fs = require('fs')
const results = []

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results)
    // [
    //   { NAME: 'Daffy Duck', AGE: '24' },
    //   { NAME: 'Bugs Bunny', AGE: '22' }
    // ]
  })

Advanced Usage Examples

Custom Headers and Transformations

const csv = require('fast-csv-parser')

fs.createReadStream('data.csv')
  .pipe(csv({
    headers: ['name', 'age', 'city'],
    skipLines: 1,
    mapHeaders: ({ header }) => header.toUpperCase(),
    // mapValues receives the header after mapHeaders has run,
    // so match the uppercased name here
    mapValues: ({ header, value }) => {
      if (header === 'AGE') return parseInt(value, 10)
      return value.trim()
    }
  }))
  .on('data', (row) => console.log(row))

Processing TSV Files

const csv = require('fast-csv-parser')

fs.createReadStream('data.tsv')
  .pipe(csv({ separator: '\t' }))
  .on('data', (row) => console.log(row))

Error Handling

const csv = require('fast-csv-parser')

fs.createReadStream('data.csv')
  .pipe(csv({ strict: true }))
  .on('data', (row) => console.log(row))
  .on('headers', (headers) => console.log('Headers:', headers))
  .on('error', (err) => {
    console.error('Parse error:', err.message)
  })
  .on('end', () => console.log('Parsing complete'))

Streaming Transform Pipeline

const csv = require('fast-csv-parser')
const fs = require('fs')
const { Transform } = require('stream')

// Annotate each row, then serialize it to an NDJSON line so the
// object-mode stream can be piped into a regular byte-mode file stream
const processor = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    row.processed_at = new Date().toISOString()
    callback(null, JSON.stringify(row) + '\n')
  }
})

fs.createReadStream('input.csv')
  .pipe(csv())
  .pipe(processor)
  .pipe(fs.createWriteStream('output.ndjson'))

📚 API Documentation

fast-csv-parser implements the exact same API as the original csv-parser. All options, events, and behaviors are identical.

csv([options | headers])

Returns: Transform Stream

Options

All original csv-parser options are supported:

  • separator (String, default: ,) - Column separator
  • quote (String, default: ") - Quote character
  • escape (String, default: ") - Escape character
  • newline (String, default: \n) - Line ending
  • headers (Array|Boolean) - Custom headers or disable header parsing
  • mapHeaders (Function) - Transform header names
  • mapValues (Function) - Transform cell values
  • skipLines (Number, default: 0) - Skip initial lines
  • skipComments (Boolean|String, default: false) - Skip comment lines
  • maxRowBytes (Number) - Maximum bytes per row
  • strict (Boolean, default: false) - Strict column count validation
  • raw (Boolean, default: false) - Disable UTF-8 decoding

Example with Options

const csv = require('fast-csv-parser')

fs.createReadStream('data.tsv')
  .pipe(csv({
    separator: '\t',
    mapHeaders: ({ header }) => header.toLowerCase(),
    mapValues: ({ value }) => value.trim()
  }))
  .on('data', (row) => console.log(row))

🎯 Events

data

Emitted for each parsed row (excluding headers).

headers

Emitted after header row is parsed with Array<string> of header names.

end

Emitted when parsing is complete.

error

Emitted on parsing errors.

⚡ Performance Tips

  1. Use on large files - The performance benefits are most apparent with files >1MB
  2. Enable raw: true for maximum speed if you don't need UTF-8 processing
  3. Avoid complex mapValues functions - They can negate performance gains
  4. Set appropriate maxRowBytes to avoid memory issues with malformed data
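Tips 2 and 4 can be combined in a single options object. This is an illustrative sketch; the values are examples, not recommendations:

```javascript
// Illustrative options combining tips 2 and 4 above: skip UTF-8 decoding
// for speed, and cap per-row memory to fail fast on malformed data.
const speedOptions = {
  raw: true,             // hand rows back as raw bytes, skipping UTF-8 decoding
  maxRowBytes: 1 << 20   // error out on any row larger than 1 MiB
}

// Usage sketch: fs.createReadStream('big.csv').pipe(csv(speedOptions))
```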

🌐 Platform Support

Pre-built binaries are available for:

  • macOS: x64, ARM64 (Apple Silicon)
  • Linux: x64, ARM64, ARM7 (GNU & musl)
  • Windows: x64, x86, ARM64
  • FreeBSD: x64
  • Android: ARM64, ARM7

🔄 Migration from csv-parser

Migration is seamless:

// Old code
const csvParser = require('csv-parser')

// New code - just change the import!
const csvParser = require('fast-csv-parser')

// Everything else stays the same

🔤 Encoding Support

fast-csv-parser automatically detects and handles multiple text encodings:

  • UTF-8 (default) - with automatic BOM stripping
  • UTF-16 LE (Little Endian) - with BOM detection
  • UTF-16 BE (Big Endian) - with BOM detection

No configuration needed - encoding is detected automatically from Byte Order Marks (BOM):

const csv = require('fast-csv-parser')

// Works with UTF-8, UTF-16 LE, UTF-16 BE files automatically
fs.createReadStream('data-utf16.csv')  // UTF-16 file
  .pipe(csv())
  .on('data', (row) => console.log(row))

Encoding Details

| Encoding | BOM | Detection | Status |
|----------|-----|-----------|--------|
| UTF-8 | EF BB BF | Auto-detected, BOM stripped | ✅ Supported |
| UTF-16 LE | FF FE | Auto-detected | ✅ Supported |
| UTF-16 BE | FE FF | Auto-detected | ✅ Supported |
| ASCII | None | Treated as UTF-8 | ✅ Supported |
| Other | - | Not supported | ❌ |
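The detection rules in the table can be sketched as a small standalone function. This is an illustrative re-implementation, not the library's actual internals, and detectEncoding is a hypothetical name:

```javascript
// Illustrative sketch of BOM-based encoding detection (hypothetical helper,
// not the library's internal code). Checks the leading bytes of a Buffer
// against the BOM signatures from the table above.
function detectEncoding(buf) {
  if (buf.length >= 3 && buf[0] === 0xEF && buf[1] === 0xBB && buf[2] === 0xBF) {
    return { encoding: 'utf-8', bomLength: 3 }
  }
  if (buf.length >= 2 && buf[0] === 0xFF && buf[1] === 0xFE) {
    return { encoding: 'utf-16le', bomLength: 2 }
  }
  if (buf.length >= 2 && buf[0] === 0xFE && buf[1] === 0xFF) {
    return { encoding: 'utf-16be', bomLength: 2 }
  }
  // No BOM: treat as UTF-8, which also covers plain ASCII
  return { encoding: 'utf-8', bomLength: 0 }
}

console.log(detectEncoding(Buffer.from([0xEF, 0xBB, 0xBF, 0x61])))
// { encoding: 'utf-8', bomLength: 3 }
```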

📊 Benchmarks

Run benchmarks yourself:

# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Run performance comparison against original csv-parser
./benchmark-comparison.js

# Run individual benchmarks
npm run bench

Sample output:

🏁 CSV Parser Performance Comparison
Comparing Rust implementation vs Original JavaScript csv-parser

📊 PERFORMANCE COMPARISON
========================================
File                    Original    Rust     Speedup
large-dataset.csv         59ms      47ms     1.26x ⚡
option-maxRowBytes.csv    36ms      29ms     1.24x ⚡
basic.csv               0.098ms    0.43ms    0.23x

🛠️ Error Handling

Errors are handled identically to the original csv-parser:

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('error', (err) => {
    if (err instanceof RangeError) {
      console.log('Row length mismatch')
    } else {
      console.log('Parse error:', err.message)
    }
  })

📝 TypeScript Support

Full TypeScript definitions are included:

import csv from 'fast-csv-parser'

interface Row {
  name: string
  age: string  // parsed values arrive as strings unless converted via mapValues
}

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row: Row) => {
    console.log(row.name, row.age)
  })

🖥️ CLI Usage

The CLI is fully compatible with csv-parser:

# Parse CSV to NDJSON
fast-csv-parser data.csv

# Parse TSV
fast-csv-parser -s $'\t' data.tsv

# Custom options
fast-csv-parser --separator=';' --quote='"' data.csv

🔤 Advanced Encoding Usage

For most cases, encoding is handled automatically. For advanced scenarios:

const csv = require('fast-csv-parser')

// Automatic encoding detection (recommended)
fs.createReadStream('data.csv')  // Any supported encoding
  .pipe(csv())
  .on('data', (row) => console.log(row))

// For unsupported encodings, use iconv-lite preprocessing
const iconv = require('iconv-lite')
fs.createReadStream('latin1-data.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(csv())
  .on('data', (row) => console.log(row))

🤔 When to Use

Use fast-csv-parser when:

  • Processing large CSV files (>1MB)
  • Performance is critical
  • You want a drop-in replacement for csv-parser
  • Working with data pipelines or ETL processes

Stick with csv-parser when:

  • Processing many small files (<1KB each) where startup overhead matters
  • Bundle size is extremely critical (csv-parser is pure JS)
  • You need cutting-edge features that haven't been ported yet
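These rules of thumb can be folded into a tiny helper that picks a module by file size, using the ~1KB break-even point from the benchmark section. pickParser is a hypothetical helper, not part of either package:

```javascript
// Hypothetical helper: choose a parser module name by file size, using
// the ~1KB break-even point reported in the benchmarks above.
const BREAK_EVEN_BYTES = 1024

function pickParser(sizeInBytes) {
  return sizeInBytes > BREAK_EVEN_BYTES ? 'fast-csv-parser' : 'csv-parser'
}

// Usage sketch:
// const csv = require(pickParser(require('fs').statSync('data.csv').size))
console.log(pickParser(5 * 1024 * 1024))  // 'fast-csv-parser'
```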

❓ FAQ

Q: Is this a drop-in replacement?

A: Yes! Just change your import from csv-parser to fast-csv-parser. All options, events, and behaviors are identical.

Q: When should I use this over the original?

A: Use fast-csv-parser when:

  • Processing files larger than 10KB
  • Performance is critical (ETL pipelines, data processing)
  • You're already using csv-parser and want better performance

Q: Are there any breaking changes?

A: No breaking changes. The API is 100% compatible. UTF-16 files are now properly supported with automatic encoding detection.

Q: Why is it slower on small files?

A: There's a ~0.1ms overhead from the Node.js ↔ Rust boundary. For tiny files, this overhead exceeds the parsing time. The crossover point is around 1KB.

Q: Can I use this in the browser?

A: Not currently. This is a native Node.js addon. For browser use, stick with the original csv-parser.

Q: How stable is this?

A: Very stable. It passes all csv-parser tests and maintains the same error handling. The Rust core uses battle-tested CSV parsing libraries.

🛠️ Development

Building from Source

Requirements:

  • Node.js 16+
  • Rust 1.60+
  • Supported platform (see Platform Support above)

# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Build the native addon
npm run build

# Run tests
npm test

# Run benchmarks
npm run bench

Project Structure

fast-csv-parser/
├── src/                 # Rust source code
│   ├── lib.rs          # N-API bindings
│   └── parser.rs       # Core CSV parsing logic
├── __test__/           # Test files and fixtures
├── examples/           # Usage examples
├── bin/                # CLI tools
├── main.js             # Main entry point with Stream API
└── index.js            # Auto-generated native binding loader (build-safe)

Architecture

  1. Rust Core (src/parser.rs) - High-performance CSV parsing
  2. N-API Bridge (src/lib.rs) - Node.js ↔ Rust interface
  3. Native Loader (index.js) - Auto-generated cross-platform binary loading
  4. Main Entry (main.js) - Stream API compatibility layer (build-safe)

🤝 Contributing

Contributions welcome! This project maintains:

  • 100% API compatibility with csv-parser
  • Comprehensive test coverage (23+ test files)
  • Performance benchmarks
  • Cross-platform CI/CD
  • Rust best practices

Testing

# Run all tests
npm test

# Run specific test
npm test headers.test.mjs

# Run benchmarks
./benchmark-comparison.js

📄 License

MIT License - same as the original csv-parser.

🙏 Acknowledgments

Built on the shoulders of giants:

Special thanks to the csv-parser community for creating such a robust and well-designed API that made this drop-in replacement possible.


🚀 Ready to speed up your CSV processing?

npm install fast-csv-parser

Just replace your import and enjoy the performance boost!

⭐ Star on GitHub · 📦 View on npm · 🐛 Report Issues