
@toiroakr/lines-db

v0.6.1

A database implementation that treats JSONL files as tables using SQLite

Readme

lines-db

A data management library that treats JSONL (JSON Lines) files as tables. Perfect for managing application seed data and testing.

Features

  • 📝 Load JSONL files as database tables
  • CLI tools for validation and data migration
  • 🔄 Automatic schema inference
  • 📦 JSON column support with automatic serialization/deserialization
  • ✅ Built-in validation using StandardSchema (Valibot, Zod, etc.)
  • 🎯 Automatic type inference from table names
  • 🔄 Bidirectional schema transformations
  • 💾 Auto-sync to JSONL files
  • 🛡️ Type-safe with TypeScript
  • Node.js 22.5+ support

VS Code Extension

A VS Code extension is available that provides syntax highlighting and validation for JSONL files with schema support.

VS Code Marketplace

Install from VS Code Marketplace

Installation

npm install @toiroakr/lines-db
# or
pnpm add @toiroakr/lines-db

CLI Usage

Setting Up Schemas

Create schema files alongside your JSONL files:

Directory structure:

data/
  ├── users.jsonl
  ├── users.schema.ts
  ├── products.jsonl
  └── products.schema.ts

Example schema (users.schema.ts):

import * as v from 'valibot';
import { defineSchema } from '@toiroakr/lines-db';

export const schema = defineSchema(
  v.object({
    id: v.pipe(v.number(), v.integer(), v.minValue(1)),
    name: v.pipe(v.string(), v.minLength(1)),
    age: v.pipe(v.number(), v.integer(), v.minValue(0), v.maxValue(150)),
    email: v.pipe(v.string(), v.email()),
  }),
);
export default schema;

Supported validation libraries: any library implementing the StandardSchema spec, such as Valibot and Zod.
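
Since defineSchema accepts any StandardSchema-compatible validator, the users schema above could equally be written with Zod. This is an illustrative sketch, not verified against the package:

```typescript
import { z } from 'zod';
import { defineSchema } from '@toiroakr/lines-db';

// Hypothetical Zod equivalent of the Valibot schema above.
export const schema = defineSchema(
  z.object({
    id: z.number().int().min(1),
    name: z.string().min(1),
    age: z.number().int().min(0).max(150),
    email: z.string().email(),
  }),
);
export default schema;
```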

Validate JSONL Files

Validate your JSONL files against their schemas:

npx lines-db validate <path>

Example:

# Validate all JSONL files in ./data directory
npx lines-db validate ./data

# Validate a specific file
npx lines-db validate ./data/users.jsonl

# Verbose output
npx lines-db validate ./data --verbose

This command will:

  • For directories: Find all .jsonl files in the directory
  • For files: Validate the specified .jsonl file
  • Load corresponding .schema.ts files
  • Validate each record against the schema
  • Report validation errors with detailed messages

Migrate Data

Transform data in JSONL files with validation:

npx lines-db migrate <file> <transform> [options]

Example:

# Update all ages by adding 1
npx lines-db migrate ./data/users.jsonl "(row) => ({ ...row, age: row.age + 1 })"

# Migrate with filter
npx lines-db migrate ./data/users.jsonl "(row) => ({ ...row, active: true })" --filter "{ age: (age) => age > 18 }"

# Save transformed data on error
npx lines-db migrate ./data/users.jsonl "(row) => ({ ...row, age: row.age + 1 })" --errorOutput ./migrated.jsonl

Options:

  • --filter, -f <expr> - Filter expression to select rows
  • --errorOutput, -e <path> - Save transformed data to file if migration fails
  • --verbose, -v - Show detailed error messages

The migration runs in a transaction and validates all transformed rows before committing.
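
The <transform> argument is simply the source of an arrow function that receives each parsed row and returns its replacement. Its effect can be sketched in plain TypeScript (the User shape here is taken from the earlier schema example):

```typescript
type User = { id: number; name: string; age: number; email: string };

// What the CLI's transform string evaluates to:
const transform = (row: User): User => ({ ...row, age: row.age + 1 });

const before: User = { id: 1, name: 'Alice', age: 30, email: '[email protected]' };
const after = transform(before);

console.log(after.age); // 31
console.log(before.age); // 30 (the spread copies the row; the original is untouched)
```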

TypeScript Usage

Generate Types

Generate TypeScript types from your schemas for type-safe database access:

npx lines-db generate <dataDir>

Example:

# Generate types (creates ./data/db.ts by default)
npx lines-db generate ./data

Add to package.json:

"scripts": {
  "db:validate": "lines-db validate ./data",
  "db:generate": "lines-db generate ./data"
}

Quick Start

1. Create a JSONL file (./data/users.jsonl):

{"id":1,"name":"Alice","age":30,"email":"[email protected]"}
{"id":2,"name":"Bob","age":25,"email":"[email protected]"}
{"id":3,"name":"Charlie","age":35,"email":"[email protected]"}

2. Use in TypeScript:

import { LinesDB } from '@toiroakr/lines-db';

const db = LinesDB.create({ dataDir: './data' });
await db.initialize();

// Find all users
const users = db.find('users');
console.log(users); // [{ id: 1, name: "Alice", ... }, ...]

// Find a specific user
const user = db.findOne('users', { id: 1 });
console.log(user); // { id: 1, name: "Alice", age: 30, ... }

// Find with conditions
const adults = db.find('users', { age: (age) => age >= 30 });

await db.close();

Using Generated Types

After running npx lines-db generate ./data:

import { LinesDB } from '@toiroakr/lines-db';
import { config } from './data/db.js';

const db = LinesDB.create(config);
await db.initialize();

// ✨ Type is automatically inferred!
const users = db.find('users');

// ✨ Type-safe operations
db.insert('users', {
  id: 10,
  name: 'Alice',
  age: 30,
  email: '[email protected]',
});

await db.close();

Core API

Query Operations:

  • find(table, where?) - Find all matching records
  • findOne(table, where?) - Find a single record
  • query(sql, params?) - Execute raw SQL query

Modify Operations:

  • insert(table, data) - Insert a single record
  • update(table, data, where) - Update matching records
  • delete(table, where) - Delete matching records

Batch Operations:

  • batchInsert(table, data[]) - Insert multiple records
  • batchUpdate(table, updates[]) - Update multiple records
  • batchDelete(table, where) - Delete multiple records

Transaction & Schema:

  • transaction(fn) - Execute operations in a transaction
  • getSchema(table) - Get table schema
  • getTableNames() - Get all table names

Where Conditions:

// Simple equality
db.find('users', { age: 30 });

// Multiple conditions (AND)
db.find('users', { age: 30, name: 'Alice' });

// Advanced conditions
db.find('users', {
  age: (age) => age > 25,
  name: (name) => name.startsWith('A'),
});
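
Conceptually, a where object is matched per key: a plain value means strict equality, a function is called as a predicate, and every key must pass (AND). A minimal sketch of that behavior (not the library's actual implementation):

```typescript
type Where<T> = { [K in keyof T]?: T[K] | ((value: T[K]) => boolean) };

// Returns true when every condition in `where` holds for `row`.
function matches<T extends Record<string, any>>(row: T, where: Where<T>): boolean {
  return Object.entries(where).every(([key, cond]) =>
    typeof cond === 'function' ? cond(row[key]) : row[key] === cond,
  );
}

const alice = { id: 1, name: 'Alice', age: 30 };
console.log(matches(alice, { age: 30 })); // true
console.log(matches(alice, { age: (a: number) => a > 25, name: (n: string) => n.startsWith('A') })); // true
console.log(matches(alice, { age: 30, name: 'Bob' })); // false
```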

JSON Columns

Objects and arrays are automatically handled as JSON columns:

db.insert('orders', {
  id: 1,
  items: [{ name: 'Laptop', quantity: 1 }],
  metadata: { source: 'web' },
});

const order = db.findOne('orders', { id: 1 });
console.log(order.items[0].name); // "Laptop"

Schema Transformations

When your schema transforms data types (e.g., parsing date strings into Date objects), you need to provide a backward transformation to save data back to JSONL files.

Why? JSONL files store strings like "2024-01-01", but your app works with Date objects. You need to convert both ways.

Example:

import * as v from 'valibot';
import { defineSchema } from '@toiroakr/lines-db';

const eventSchema = v.pipe(
  v.object({
    id: v.number(),
    // Transform: string → Date (when reading)
    date: v.pipe(
      v.string(),
      v.isoDate(),
      v.transform((str) => new Date(str)),
    ),
  }),
);

// Provide backward transformation: Date → string (when writing)
export const schema = defineSchema(eventSchema, (output) => ({
  ...output,
  date: output.date.toISOString(), // Convert Date back to string
}));

In your JSONL file (events.jsonl):

{
  "id": 1,
  "date": "2024-01-01T00:00:00.000Z"
}

In your TypeScript code:

const event = db.findOne('events', { id: 1 });
console.log(event.date instanceof Date); // true
console.log(event.date.getFullYear()); // 2024
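
Stripped of the library, the forward/backward pair is just an inverse round trip over the date field:

```typescript
// forward: what the schema's transform does when reading a JSONL record
const forward = (stored: string): Date => new Date(stored);
// backward: what defineSchema's second argument does when writing back
const backward = (value: Date): string => value.toISOString();

const stored = '2024-01-01T00:00:00.000Z';
const roundTripped = backward(forward(stored));
console.log(roundTripped === stored); // true
```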

Transactions

Operations outside transactions are auto-synced:

db.insert('users', { id: 10, name: 'Alice', age: 30 });
// ↑ Automatically synced to users.jsonl

Batch operations with transactions:

await db.transaction(async (tx) => {
  tx.insert('users', { id: 10, name: 'Alice', age: 30 });
  tx.update('users', { age: 31 }, { id: 1 });
  // All changes synced atomically on commit
});

Configuration

interface DatabaseConfig {
  dataDir: string; // Directory containing JSONL files
}

const db = LinesDB.create({ dataDir: './data' });

Type Mapping

| JSON Type        | Column Type | SQLite Storage |
| ---------------- | ----------- | -------------- |
| number (integer) | INTEGER     | INTEGER        |
| number (float)   | REAL        | REAL           |
| string           | TEXT        | TEXT           |
| boolean          | INTEGER     | INTEGER        |
| object           | JSON        | TEXT           |
| array            | JSON        | TEXT           |
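
The mapping above can be expressed as a small classifier. This columnType helper is hypothetical, written only to restate the table, and is not part of the package's API:

```typescript
type ColumnType = 'INTEGER' | 'REAL' | 'TEXT' | 'JSON';

// Infer the column type for a JSON value, per the table above.
// Booleans are stored as INTEGER; objects and arrays as JSON (TEXT in SQLite).
function columnType(value: unknown): ColumnType {
  switch (typeof value) {
    case 'number':
      return Number.isInteger(value) ? 'INTEGER' : 'REAL';
    case 'string':
      return 'TEXT';
    case 'boolean':
      return 'INTEGER';
    default:
      return 'JSON';
  }
}

console.log(columnType(42)); // "INTEGER"
console.log(columnType(3.14)); // "REAL"
console.log(columnType([1, 2])); // "JSON"
```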

License

MIT