# fast-csv-parser

v0.0.4
- 🔧 100% API compatible with the original csv-parser
- 🦀 Rust-powered native performance
- 🌐 Cross-platform support (Windows, macOS, Linux, ARM)
- 📦 Zero dependencies in production
- 🔤 UTF-16 support - handles UTF-16 LE/BE with BOM detection
## Performance Comparison

Performance characteristics vary by file size. Here are real benchmark results:
### Large Files (Where It Matters)

| Dataset | Rows | Size | csv-parser | fast-csv-parser | Speedup |
|---------|------|------|------------|-----------------|---------|
| large-dataset.csv | 7,268 | 1.1MB | 59ms | 47ms | 🚀 1.26x faster |
| option-maxRowBytes.csv | 4,577 | 700KB | 36ms | 29ms | 🚀 1.24x faster |
### Small Files (Startup Overhead)

Small files show the Node.js ↔ Rust boundary overhead:
- Files <100 rows: ~0.6x performance (0.1ms → 0.2ms)
- The overhead is constant (~0.1ms) regardless of file size
### Performance Profile

```
Performance Ratio (higher = better)
  2x ┤
     │                    ╭─ Peak performance zone
1.5x ┤                  ╱
     │                ╱
  1x ┼─────────────────────── Break-even point (~1KB)
     │
0.5x ┤  Overhead zone
     └─────────────────────────────────────
     0KB    1KB    10KB    100KB    1MB+
                  File Size
```

💡 Recommendation: Use fast-csv-parser for files >10KB or high-throughput scenarios.
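A size-based cutoff like this can be encoded directly. The sketch below is illustrative only (`chooseParserModule` and `CROSSOVER_BYTES` are made-up names, not part of either package); the ~1KB threshold comes from the benchmarks in this section:

```js
// Illustrative helper: pick a parser module by file size, using the
// ~1KB break-even point measured above.
const CROSSOVER_BYTES = 1024

function chooseParserModule (fileSizeBytes) {
  // Below the crossover, the constant ~0.1ms native-call overhead
  // outweighs the faster parsing, so the pure-JS parser wins.
  return fileSizeBytes >= CROSSOVER_BYTES ? 'fast-csv-parser' : 'csv-parser'
}

console.log(chooseParserModule(10 * 1024)) // 'fast-csv-parser'
console.log(chooseParserModule(100)) // 'csv-parser'
```

In practice you would pair this with `fs.statSync(path).size` before opening the read stream.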
## 📦 Installation

Using npm:

```shell
npm install fast-csv-parser
```

Using yarn:

```shell
yarn add fast-csv-parser
```

Using pnpm:

```shell
pnpm add fast-csv-parser
```

## 🚀 Usage Examples
### Basic Usage

Simply replace your csv-parser import:

```js
// Before:
// const csv = require('csv-parser')

// After - that's it!
const csv = require('fast-csv-parser')

// Your existing code works unchanged
const fs = require('fs')
const results = []

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results)
    // [
    //   { NAME: 'Daffy Duck', AGE: '24' },
    //   { NAME: 'Bugs Bunny', AGE: '22' }
    // ]
  })
```

### Advanced Usage Examples
#### Custom Headers and Transformations

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({
    headers: ['name', 'age', 'city'],
    skipLines: 1,
    mapHeaders: ({ header }) => header.toUpperCase(),
    mapValues: ({ header, value }) => {
      // mapValues receives the header after mapHeaders has run
      if (header === 'AGE') return parseInt(value, 10)
      return value.trim()
    }
  }))
  .on('data', (row) => console.log(row))
```

#### Processing TSV Files
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({ separator: '\t' }))
  .on('data', (row) => console.log(row))
```

#### Error Handling
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({ strict: true }))
  .on('data', (row) => console.log(row))
  .on('headers', (headers) => console.log('Headers:', headers))
  .on('error', (err) => {
    console.error('Parse error:', err.message)
  })
  .on('end', () => console.log('Parsing complete'))
```

#### Streaming Transform Pipeline
```js
const csv = require('fast-csv-parser')
const fs = require('fs')
const { Transform } = require('stream')

// Object-mode transform that stamps each row, then serializes it as
// one NDJSON line so it can be piped into a plain file stream
const processor = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    row.processed_at = new Date().toISOString()
    callback(null, JSON.stringify(row) + '\n')
  }
})

fs.createReadStream('input.csv')
  .pipe(csv())
  .pipe(processor)
  .pipe(fs.createWriteStream('output.ndjson'))
```

## 📚 API Documentation
fast-csv-parser implements the exact same API as the original csv-parser. All options, events, and behaviors are identical.
### csv([options | headers])

Returns: `Transform` stream

#### Options

All original csv-parser options are supported:
- `separator` (String, default: `,`) - Column separator
- `quote` (String, default: `"`) - Quote character
- `escape` (String, default: `"`) - Escape character
- `newline` (String, default: `\n`) - Line ending
- `headers` (Array | Boolean) - Custom headers, or `false` to disable header parsing
- `mapHeaders` (Function) - Transform header names
- `mapValues` (Function) - Transform cell values
- `skipLines` (Number, default: `0`) - Skip initial lines
- `skipComments` (Boolean | String, default: `false`) - Skip comment lines
- `maxRowBytes` (Number) - Maximum bytes per row
- `strict` (Boolean, default: `false`) - Strict column count validation
- `raw` (Boolean, default: `false`) - Disable UTF-8 decoding
#### Example with Options

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({
    separator: '\t',
    mapHeaders: ({ header }) => header.toLowerCase(),
    mapValues: ({ value }) => value.trim()
  }))
  .on('data', (row) => console.log(row))
```

## 🎯 Events
### data

Emitted for each parsed row (excluding headers).

### headers

Emitted after the header row is parsed, with an `Array<string>` of header names.

### end

Emitted when parsing is complete.

### error

Emitted on parsing errors.
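A common pattern with the `headers` event is failing fast when a file's columns don't match what the downstream pipeline expects. The helper below is a sketch (`validateHeaders` is an illustrative name, not part of the API):

```js
// Illustrative helper: check that the parsed header row matches the
// expected columns, in order.
function validateHeaders (actual, expected) {
  return actual.length === expected.length &&
    expected.every((name, i) => actual[i] === name)
}

// Usage sketch with the parser (assumes a file named 'data.csv'):
// fs.createReadStream('data.csv')
//   .pipe(csv())
//   .on('headers', (headers) => {
//     if (!validateHeaders(headers, ['NAME', 'AGE'])) {
//       throw new Error('Unexpected columns: ' + headers.join(', '))
//     }
//   })
```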
## ⚡ Performance Tips

- Use on large files - the performance benefits are most apparent with files >1MB
- Enable `raw: true` for maximum speed if you don't need UTF-8 processing
- Avoid complex `mapValues` functions - they can negate performance gains
- Set an appropriate `maxRowBytes` to avoid memory issues with malformed data
## 🌐 Platform Support

Pre-built binaries are available for:

- macOS: x64, ARM64 (Apple Silicon)
- Linux: x64, ARM64, ARMv7 (GNU & musl)
- Windows: x64, x86, ARM64
- FreeBSD: x64
- Android: ARM64, ARMv7
## 🔄 Migration from csv-parser

Migration is seamless:

```js
// Old code:
// const csvParser = require('csv-parser')

// New code - just change the import!
const csvParser = require('fast-csv-parser')

// Everything else stays the same
```

## 🔤 Encoding Support
fast-csv-parser automatically detects and handles multiple text encodings:
- UTF-8 (default) - with automatic BOM stripping
- UTF-16 LE (Little Endian) - with BOM detection
- UTF-16 BE (Big Endian) - with BOM detection
No configuration is needed - the encoding is detected automatically from the Byte Order Mark (BOM):

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Works with UTF-8, UTF-16 LE, and UTF-16 BE files automatically
fs.createReadStream('data-utf16.csv') // UTF-16 file
  .pipe(csv())
  .on('data', (row) => console.log(row))
```

### Encoding Details
| Encoding | BOM | Detection | Status |
|----------|-----|-----------|--------|
| UTF-8 | EF BB BF | Auto-detected, BOM stripped | ✅ Supported |
| UTF-16 LE | FF FE | Auto-detected | ✅ Supported |
| UTF-16 BE | FE FF | Auto-detected | ✅ Supported |
| ASCII | None | Treated as UTF-8 | ✅ Supported |
| Other | - | Not supported | ❌ |
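The detection rule in the table can be sketched as a plain function over a file's first bytes. This is illustrative only (the library's actual detection happens inside the Rust core):

```js
// Sketch of the BOM-detection rule from the table above.
function detectEncoding (firstBytes) {
  const b = firstBytes
  if (b.length >= 3 && b[0] === 0xEF && b[1] === 0xBB && b[2] === 0xBF) return 'utf8'
  if (b.length >= 2 && b[0] === 0xFF && b[1] === 0xFE) return 'utf16le'
  if (b.length >= 2 && b[0] === 0xFE && b[1] === 0xFF) return 'utf16be'
  return 'utf8' // no BOM: treat as UTF-8 (covers plain ASCII)
}

console.log(detectEncoding(Buffer.from([0xFF, 0xFE, 0x41, 0x00]))) // 'utf16le'
```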
## 📊 Benchmarks

Run the benchmarks yourself:

```shell
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Run performance comparison against original csv-parser
./benchmark-comparison.js

# Run individual benchmarks
npm run bench
```

Sample output:
```
🏁 CSV Parser Performance Comparison
Comparing Rust implementation vs Original JavaScript csv-parser

📊 PERFORMANCE COMPARISON
========================================
File                     Original    Rust      Speedup
large-dataset.csv        59ms        47ms      1.26x ⚡
option-maxRowBytes.csv   36ms        29ms      1.24x ⚡
basic.csv                0.098ms     0.43ms    0.23x
```

## 🛠️ Error Handling
Errors are handled identically to the original csv-parser:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('error', (err) => {
    if (err instanceof RangeError) {
      console.log('Row length mismatch')
    } else {
      console.log('Parse error:', err.message)
    }
  })
```

## 📝 TypeScript Support
Full TypeScript definitions are included:

```ts
import fs from 'fs'
import csv from 'fast-csv-parser'

// Cell values arrive as strings unless transformed with mapValues
interface Row {
  name: string
  age: string
}

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row: Row) => {
    console.log(row.name, row.age)
  })
```

## 🖥️ CLI Usage
The CLI is fully compatible with csv-parser:
```shell
# Parse CSV to NDJSON
fast-csv-parser data.csv

# Parse TSV
fast-csv-parser -s $'\t' data.tsv

# Custom options
fast-csv-parser --separator=';' --quote='"' data.csv
```

## 🔤 Advanced Encoding Usage
For most cases, encoding is handled automatically. For advanced scenarios:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Automatic encoding detection (recommended)
fs.createReadStream('data.csv') // Any supported encoding
  .pipe(csv())
  .on('data', (row) => console.log(row))
```

For unsupported encodings, preprocess with iconv-lite:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')
const iconv = require('iconv-lite')

fs.createReadStream('latin1-data.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(csv())
  .on('data', (row) => console.log(row))
```

## 🤔 When to Use
Use fast-csv-parser when:
- Processing large CSV files (>1MB)
- Performance is critical
- You want a drop-in replacement for csv-parser
- Working with data pipelines or ETL processes
Stick with csv-parser when:
- Processing many small files (<1KB each) where startup overhead matters
- Bundle size is extremely critical (csv-parser is pure JS)
- You need cutting-edge features that haven't been ported yet
## ❓ FAQ
Q: Is this a drop-in replacement?
A: Yes! Just change your import from csv-parser to fast-csv-parser. All options, events, and behaviors are identical.
Q: When should I use this over the original?
A: Use fast-csv-parser when:
- Processing files larger than 10KB
- Performance is critical (ETL pipelines, data processing)
- You're already using csv-parser and want better performance
Q: Are there any breaking changes?
A: No breaking changes. The API is 100% compatible. UTF-16 files are now properly supported with automatic encoding detection.
Q: Why is it slower on small files?
A: There's a ~0.1ms overhead from the Node.js ↔ Rust boundary. For tiny files, this overhead exceeds the parsing time. The crossover point is around 1KB.
Q: Can I use this in the browser?
A: Not currently. This is a native Node.js addon. For browser use, stick with the original csv-parser.
Q: How stable is this?
A: Very stable. It passes all csv-parser tests and maintains the same error handling. The Rust core uses battle-tested CSV parsing libraries.
## 🛠️ Development

### Building from Source

Requirements:

- Node.js 16+
- Rust 1.60+
- Supported platform (see Platform Support above)

```shell
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Build the native addon
npm run build

# Run tests
npm test

# Run benchmarks
npm run bench
```

### Project Structure
```
fast-csv-parser/
├── src/            # Rust source code
│   ├── lib.rs      # N-API bindings
│   └── parser.rs   # Core CSV parsing logic
├── __test__/       # Test files and fixtures
├── examples/       # Usage examples
├── bin/            # CLI tools
├── main.js         # Main entry point with Stream API
└── index.js        # Auto-generated native binding loader (build-safe)
```

### Architecture
- Rust Core (`src/parser.rs`) - High-performance CSV parsing
- N-API Bridge (`src/lib.rs`) - Node.js ↔ Rust interface
- Native Loader (`index.js`) - Auto-generated cross-platform binary loading
- Main Entry (`main.js`) - Stream API compatibility layer (build-safe)
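The native loader resolves the right prebuilt binary for the host at require time. The sketch below illustrates the general napi-rs loader pattern; the real `index.js` is auto-generated, covers many more targets, and `bindingFileFor` is a made-up name:

```js
// Illustrative sketch of a napi-rs-style loader: map the host's
// platform/arch pair to a prebuilt .node binary file name.
function bindingFileFor (platform, arch) {
  const triples = {
    'darwin-arm64': 'fast-csv-parser.darwin-arm64.node',
    'darwin-x64': 'fast-csv-parser.darwin-x64.node',
    'linux-x64': 'fast-csv-parser.linux-x64-gnu.node',
    'win32-x64': 'fast-csv-parser.win32-x64-msvc.node'
  }
  const name = triples[`${platform}-${arch}`]
  if (!name) throw new Error(`Unsupported platform: ${platform}-${arch}`)
  return name // the real loader then does require('./' + name)
}

console.log(bindingFileFor('darwin', 'arm64')) // 'fast-csv-parser.darwin-arm64.node'
```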
## 🤝 Contributing
Contributions welcome! This project maintains:
- 100% API compatibility with csv-parser
- Comprehensive test coverage (23+ test files)
- Performance benchmarks
- Cross-platform CI/CD
- Rust best practices
### Testing

```shell
# Run all tests
npm test

# Run specific test
npm test headers.test.mjs

# Run benchmarks
./benchmark-comparison.js
```

## 📄 License
MIT License - same as the original csv-parser.
## 🙏 Acknowledgments
Built on the shoulders of giants:
- `csv-parser` by @mafintosh - the original, excellent CSV parser
- `napi-rs` - Rust N-API framework for Node.js addons
Special thanks to the csv-parser community for creating such a robust and well-designed API that made this drop-in replacement possible.
## 🚀 Ready to speed up your CSV processing?

```shell
npm install fast-csv-parser
```

Just replace your import and enjoy the performance boost!
