@bakes/dastardly-csv

High-performance CSV/TSV/PSV parser and serializer for dASTardly, built with Tree-sitter.

Installation

npm install @bakes/dastardly-csv @bakes/dastardly-core
pnpm add @bakes/dastardly-csv @bakes/dastardly-core

Overview

@bakes/dastardly-csv provides a high-performance CSV parser and serializer that converts CSV into dASTardly's format-agnostic AST. It is built on Tree-sitter for real-time editor performance and tracks full source positions for precise error reporting.

Key Features:

  • High performance - Tree-sitter-based parsing for real-time editor feedback
  • Multiple delimiters - Support for CSV (,), TSV (\t), and PSV (|); see the sketch after this list
  • Position tracking - Every node tracks source location (line, column, offset)
  • Flexible headers - Auto-detected headers, custom header names, or no headers at all
  • Type inference - Optional automatic type detection for numbers and booleans
  • Quote strategies - Configurable quoting: needed, all, nonnumeric, or none
  • Type-safe - Full TypeScript support with strict mode
  • Format-agnostic AST - Convert to/from other formats (JSON, YAML, etc.)
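
As a quick illustration of how these options combine (a minimal sketch; it only uses the delimiter and inferTypes options documented below), pipe-separated input can be parsed with numeric type detection:

import { parse } from '@bakes/dastardly-csv';

// Parse pipe-separated values with numeric type detection
// (sketch: combines the documented `delimiter` and `inferTypes` options)
const doc = parse('name|score\nAlice|100\nBob|85', {
  delimiter: '|',
  inferTypes: true
});
// score cells become Number nodes instead of String nodes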

Quick Start

Parsing

import { parse } from '@bakes/dastardly-csv';

// Parse CSV with headers to DocumentNode
const doc = parse('name,age\nAlice,30\nBob,25');
console.log(doc.type); // 'Document'
console.log(doc.body.type); // 'Array'

// Access the data directly
const data = doc.body;
if (data.type === 'Array') {
  console.log(data.elements.length); // 2
  // Each element is an Object with properties from headers
}

Serializing

import { serialize } from '@bakes/dastardly-csv';

// Serialize with default options (comma delimiter, auto-headers)
const csv = serialize(doc);
// name,age
// Alice,30
// Bob,25

// Serialize as TSV (tab-separated)
const tsv = serialize(doc, { delimiter: '\t' });
// name	age
// Alice	30
// Bob	25

// Serialize with all fields quoted
const quoted = serialize(doc, { quoting: 'all' });
// "name","age"
// "Alice","30"
// "Bob","25"

Roundtrip

import { parse, serialize } from '@bakes/dastardly-csv';

const source = 'name,age\nAlice,30\nBob,25';
const doc = parse(source);
const output = serialize(doc);
// Preserves data structure, can reformat with different options
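// Or reformat on output, e.g. re-emit as a fully quoted TSV
// (sketch combining the documented delimiter and quoting options)
const reformatted = serialize(doc, { delimiter: '\t', quoting: 'all' });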

API Reference

Package Object

The package exports a csv object implementing the FormatPackage interface:

import { csv } from '@bakes/dastardly-csv';

const doc = csv.parse('name,age\nAlice,30', { inferTypes: true });
const output = csv.serialize(doc, { delimiter: '\t' });

Convenience Functions

For convenience, parse and serialize are also exported as standalone functions:

parse(source, options?)

Parse CSV string into a DocumentNode:

function parse(
  source: string,
  options?: CSVParseOptions
): DocumentNode;

Parameters:

  • source - CSV string to parse
  • options - Optional parse options:
    • delimiter?: string - Field delimiter (`,`, `\t`, or `|`). Default: `,`
    • headers?: boolean | string[] - Header handling:
      • true (default) - First row is headers
      • false - No headers (produces array of arrays)
      • string[] - Custom header names
    • inferTypes?: boolean - Auto-detect numbers and booleans. Default: false

Example:

import { parse } from '@bakes/dastardly-csv';

// Basic parsing with headers
const doc1 = parse('name,age\nAlice,30');
console.log(doc1.body.type); // 'Array'

// Parse with type inference
const doc2 = parse('name,score\nAlice,100', { inferTypes: true });
// score will be Number(100) instead of String("100")

// Parse without headers (array of arrays)
const doc3 = parse('Alice,30\nBob,25', { headers: false });
// Produces: [["Alice", "30"], ["Bob", "25"]]

// Parse TSV with custom headers
const doc4 = parse('Alice\t30\nBob\t25', {
  delimiter: '\t',
  headers: ['name', 'age']
});

Throws: ParseError if source is invalid CSV

serialize(node, options?)

Serialize AST to CSV string:

function serialize(
  node: DocumentNode | DataNode,
  options?: CSVSerializeOptions
): string;

Parameters:

  • node - DocumentNode or DataNode to serialize (must be Array of Objects or Array of Arrays)
  • options - Optional serialization options:
    • delimiter?: string - Field delimiter. Default: ,
    • quoting?: 'needed' | 'all' | 'nonnumeric' | 'none' - Quote strategy. Default: 'needed'
    • lineEnding?: '\n' | '\r\n' - Line ending style. Default: '\n'
    • headers?: boolean | string[] - Header handling:
      • true (default for Objects) - Auto-generate from object keys
      • false - No header row
      • string[] - Custom header names

Example:

import { serialize } from '@bakes/dastardly-csv';

// Default: comma-separated with auto-headers
serialize(doc);
// name,age
// Alice,30

// Tab-separated (TSV)
serialize(doc, { delimiter: '\t' });
// name	age
// Alice	30

// Pipe-separated (PSV)
serialize(doc, { delimiter: '|' });
// name|age
// Alice|30

// Quote all fields
serialize(doc, { quoting: 'all' });
// "name","age"
// "Alice","30"

// Quote only non-numeric fields
serialize(doc, { quoting: 'nonnumeric' });
// "name",age
// "Alice",30

// Custom line endings (Windows-style)
serialize(doc, { lineEnding: '\r\n' });

// No headers
serialize(doc, { headers: false });
// Alice,30
// Bob,25

Types

CSVParseOptions

Options for CSV parsing:

interface CSVParseOptions {
  /** Field delimiter (`,`, `\t`, `|`). Default: `,` */
  delimiter?: string;

  /**
   * Header handling:
   * - `true` (default): First row is headers
   * - `false`: No headers (array of arrays)
   * - `string[]`: Custom header names
   */
  headers?: boolean | string[];

  /** Auto-detect and convert numbers/booleans. Default: `false` */
  inferTypes?: boolean;
}

CSVSerializeOptions

Options for CSV serialization:

interface CSVSerializeOptions {
  /** Field delimiter. Default: `,` */
  delimiter?: string;

  /**
   * Quote strategy:
   * - `'needed'`: Quote fields that require it (contains delimiter, newline, quote)
   * - `'all'`: Quote all fields
   * - `'nonnumeric'`: Quote non-numeric fields
   * - `'none'`: Never quote (may produce invalid CSV)
   */
  quoting?: 'needed' | 'all' | 'nonnumeric' | 'none';

  /** Line ending style. Default: `'\n'` */
  lineEnding?: '\n' | '\r\n';

  /**
   * Header handling:
   * - `true` (default for Objects): Auto-generate from keys
   * - `false`: No header row
   * - `string[]`: Custom header names
   */
  headers?: boolean | string[];
}
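
Because the options are plain interfaces, a shared configuration can be typed once and reused across calls. A minimal sketch (assuming the option types are importable from the package root):

import { parse, serialize } from '@bakes/dastardly-csv';
import type { CSVSerializeOptions } from '@bakes/dastardly-csv';

// Shared, typed serialization settings reused for every document
// (illustrative values; the type import path is an assumption)
const tsvOptions: CSVSerializeOptions = {
  delimiter: '\t',
  quoting: 'needed',
  lineEnding: '\n'
};

const people = serialize(parse('name,age\nAlice,30'), tsvOptions);
const scores = serialize(parse('name,score\nBob,85'), tsvOptions);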

Position Tracking

Every node in the AST includes position information:

import { parse } from '@bakes/dastardly-csv';

const doc = parse('name,age\nAlice,30');

// Position info on every node
console.log(doc.loc);
// {
//   start: { line: 1, column: 0, offset: 0 },
//   end: { line: 2, column: 8, offset: 17 }
// }

// Even individual cells have positions
const data = doc.body;
if (data.type === 'Array' && data.elements[0]?.type === 'Object') {
  const firstRow = data.elements[0];
  console.log(firstRow.properties[0]?.value.loc);
  // Position of "Alice" in the CSV
}
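
Position data can also drive simple diagnostics. A minimal sketch that logs where each first-column cell starts, relying only on the node shapes shown above:

import { parse } from '@bakes/dastardly-csv';

const doc = parse('name,age\nAlice,30\nBob,25');

// Walk the rows and report where each first-column cell begins
// (sketch; assumes the Array-of-Objects shape and `loc` fields shown above)
const data = doc.body;
if (data.type === 'Array') {
  for (const row of data.elements) {
    if (row.type === 'Object') {
      const cell = row.properties[0]?.value;
      if (cell?.loc) {
        console.log(`cell starts at line ${cell.loc.start.line}, column ${cell.loc.start.column}`);
      }
    }
  }
}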

Common Patterns

Type Inference

import { parse } from '@bakes/dastardly-csv';

const doc = parse('name,age,active\nAlice,30,true', { inferTypes: true });

const data = doc.body;
if (data.type === 'Array' && data.elements[0]?.type === 'Object') {
  const row = data.elements[0];
  // age is Number, not String
  // active is Boolean, not String
}

Headerless CSV (Array of Arrays)

import { parse, serialize } from '@bakes/dastardly-csv';

// Parse without headers
const doc = parse('Alice,30\nBob,25', { headers: false });

// Produces array of arrays:
// [
//   ["Alice", "30"],
//   ["Bob", "25"]
// ]

// Serialize array of arrays
const output = serialize(doc, { headers: false });
// Alice,30
// Bob,25

Custom Headers

import { parse, serialize } from '@bakes/dastardly-csv';

// Parse with custom headers (source has no header row)
const doc = parse('Alice,30\nBob,25', {
  headers: ['name', 'age']
});

// Override headers when serializing
const output = serialize(doc, {
  headers: ['person', 'years']
});
// person,years
// Alice,30
// Bob,25

Cross-format Conversion

import { parse as parseCSV, serialize as serializeCSV } from '@bakes/dastardly-csv';
import { parse as parseJSON, serialize as serializeJSON } from '@bakes/dastardly-json';

// CSV → JSON
const csvDoc = parseCSV('name,age\nAlice,30\nBob,25');
const jsonOutput = serializeJSON(csvDoc, { indent: 2 });
// [
//   {
//     "name": "Alice",
//     "age": "30"
//   },
//   ...
// ]

// JSON → CSV
const jsonDoc = parseJSON('[{"name":"Alice","age":30}]');
const csvOutput = serializeCSV(jsonDoc);
// name,age
// Alice,30

Error Handling

import { parse } from '@bakes/dastardly-csv';
import { ParseError } from '@bakes/dastardly-core';

try {
  // Malformed CSV (unclosed quote)
  const doc = parse('name,age\n"Alice,30');
} catch (error) {
  if (error instanceof ParseError) {
    console.error(`Parse error at line ${error.line}, column ${error.column}`);
  }
}

Edge Cases

Empty Fields

import { parse } from '@bakes/dastardly-csv';

// Empty fields become empty strings
const doc = parse('name,age\nAlice,\n,30');
// Alice has empty age, second row has empty name

Special Characters

import { parse, serialize } from '@bakes/dastardly-csv';

// Fields with commas are quoted automatically
const doc = parse('name,location\n"Alice","New York, NY"');

// Quotes within fields are escaped with double quotes
const doc2 = parse('quote\n"She said ""hello"""');
// Value: She said "hello"

Large Numbers

import { parse } from '@bakes/dastardly-csv';

// With type inference, numbers are parsed as numbers
const doc = parse('id\n9007199254740991', { inferTypes: true });
// Gets Number.MAX_SAFE_INTEGER

// Without type inference, stays as string
const doc2 = parse('id\n9007199254740991');
// Gets string "9007199254740991"

Limitations

  • Variable field counts: Rows with different numbers of fields are not currently supported
  • Multi-line values: Only supported within quoted fields
  • Nested structures: CSV is flat; nested objects/arrays require special handling (see serializer options and the sketch below)
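
For the nested-structure limitation, one workaround is to flatten nested objects into dotted column names before converting to CSV. A rough sketch using the cross-format route shown earlier (the flatten helper is illustrative and not part of either package):

import { parse as parseJSON } from '@bakes/dastardly-json';
import { serialize as serializeCSV } from '@bakes/dastardly-csv';

// Illustrative helper: flattens { address: { city } } into { 'address.city': city }
function flatten(obj: Record<string, unknown>, prefix = ''): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(out, flatten(value as Record<string, unknown>, name));
    } else {
      out[name] = value;
    }
  }
  return out;
}

const rows = [{ name: 'Alice', address: { city: 'Oslo', zip: '0150' } }];
const flatDoc = parseJSON(JSON.stringify(rows.map((row) => flatten(row))));
const csv = serializeCSV(flatDoc);
// Roughly: name,address.city,address.zip
//          Alice,Oslo,0150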

Related Packages

  • @bakes/dastardly-core - Core AST node types and the ParseError class used by this package
  • @bakes/dastardly-json - JSON parser and serializer used in the cross-format conversion example above

License

MIT