

Typed CSV Stream Parser

A strongly-typed CSV stream processor for Node.js that transforms CSV data into TypeScript classes with automatic type conversion and validation.

Features

  • 🔑 Type-safe: Transform CSV data into strongly-typed TypeScript classes
  • 🚀 Streaming: Process large CSV files efficiently using Node.js streams
  • 🎯 Automatic Type Conversion: Built-in conversion for common types (Number, Boolean, Date)
  • 🛠 Custom Parsers: Define custom parsing logic for complex data types
  • 📦 Batch Processing: Process records in configurable batch sizes for optimal performance
  • 🔌 Express Integration: Ready-to-use processor for handling CSV file uploads in Express applications

Installation

npm install typed-csv-stream
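Because column mapping is done with decorators, your TypeScript project will most likely need decorator support turned on. A minimal tsconfig sketch, assuming the package relies on experimental decorators and (for automatic type conversion) emitted decorator metadata; adjust to your own setup if it differs:

// tsconfig.json (sketch)
{
  "compilerOptions": {
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true
  }
}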

Usage

1. Define Your Data Model

Use decorators to map CSV columns to class properties:

import { CsvHeader } from 'typed-csv-stream';

class User {
  @CsvHeader('name')
  name: string;

  @CsvHeader('age')
  age: number;

  @CsvHeader('isActive')
  active: boolean;

  @CsvHeader({
    name: 'joinDate',
    parser: (value: string) => new Date(value),
  })
  joinDate: Date;
}

2. Create Express Endpoint with CSV Processing

import express from 'express';
import { ExpressCsvStreamProcessor } from 'typed-csv-stream';
import { Writable } from 'stream';

const app = express();

app.post('/upload', (req, res) => {
  // Create processor instance with your model and options
  const processor = new ExpressCsvStreamProcessor(User, {
    batchSize: 1000,
    delimiter: ',',
    encoding: 'utf-8',
  });

  // Create writable stream to handle processed batches
  const writable = new Writable({
    objectMode: true,
    write(batch: User[], encoding, callback) {
      try {
        // Handle each batch of processed users
        console.log(`Processing batch of ${batch.length} users`);
        // Example: Save to database
        // await saveUsers(batch);
        callback();
      } catch (error) {
        callback(error as Error);
      }
    },
  });

  // Process the uploaded file
  processor.processRequest(req, writable, error => {
    if (error) {
      res.status(400).json({ error: error.message });
    } else {
      res.json({ success: true });
    }
  });
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

3. Upload and Process CSV Files

// Example client-side code: csvFile is a File object,
// e.g. selected through an <input type="file"> element
const fileInput = document.querySelector('input[type="file"]') as HTMLInputElement;
const csvFile = fileInput.files![0];

const formData = new FormData();
formData.append('file', csvFile);

await fetch('http://localhost:3000/upload', {
  method: 'POST',
  body: formData,
});

Configuration Options

The processor accepts the following options:

interface CsvProcessorOptions {
  delimiter?: string; // CSV delimiter (default: ',')
  skipHeader?: boolean; // Skip header row (default: false)
  encoding?: string; // File encoding (default: 'utf-8')
  batchSize: number; // Number of records per batch
}
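
For instance, here is a sketch of how these options might be combined for a semicolon-delimited export (hypothetical values; only batchSize is required):

import { ExpressCsvStreamProcessor } from 'typed-csv-stream';

// Hypothetical configuration for a semicolon-delimited UTF-8 export
const processor = new ExpressCsvStreamProcessor(User, {
  delimiter: ';',   // the file uses semicolons instead of the default comma
  encoding: 'utf-8',
  batchSize: 500,   // records are pushed to the writable stream 500 at a time
});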

Decorator Options

The @CsvHeader decorator accepts either a string (column name) or an options object:

interface CsvHeaderOptions {
  name: string; // CSV column name
  parser?: (value: string) => any; // Custom parser function
}
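
The parser option is handy for columns that don't map to a built-in type. As an illustration (the Article class and its tags column are hypothetical, not part of this package), a pipe-separated column could be parsed into a string array:

import { CsvHeader } from 'typed-csv-stream';

class Article {
  // Hypothetical 'tags' column containing values such as "node|csv|stream"
  @CsvHeader({
    name: 'tags',
    parser: (value: string) => value.split('|').map(tag => tag.trim()),
  })
  tags: string[];
}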

Type Conversion

The library automatically converts CSV string values to appropriate types based on the property decorators:

  • string: No conversion (default)
  • number: Converted using Number()
  • boolean: Converted using string comparison ('true' -> true)
  • Date: Converted using new Date()
  • Custom types: Use the parser option in @CsvHeader decorator
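
To make this concrete, here is roughly the object a single data row from the example CSV below should produce for the User model, assuming the conversions listed above (illustrative, not captured library output):

// CSV row: John Doe,30,true,2023-01-01
const user: User = {
  name: 'John Doe',                   // string: left as-is
  age: 30,                            // number: Number('30')
  active: true,                       // boolean: 'true' -> true
  joinDate: new Date('2023-01-01'),   // Date: via the custom parser in @CsvHeader
};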

Error Handling

The processor handles various error cases:

  • Invalid file type (non-CSV files)
  • Missing file in request
  • CSV parsing errors
  • Type conversion failures
  • Custom parser errors

Errors are passed to the completion callback with appropriate error messages:

processor.processRequest(req, writable, error => {
  if (error) {
    console.error('Processing failed:', error.message);
    // Handle error appropriately
  }
});

Example CSV Format

For the User model defined above, your CSV file should look like:

name,age,isActive,joinDate
John Doe,30,true,2023-01-01
Jane Smith,25,false,2023-02-15

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.