SQL to JSON Converter

🔄 Powerful SQL to JSON converter with support for large files and multiple output formats. Converts SQL database dumps to structured JSON files.

✨ Key Features

  • 🚀 Large file processing: Streams SQL files up to gigabytes in size
  • 📁 Multiple output modes:
    • Separate files: Each table becomes a separate JSON file (default)
    • Combined file: All tables in one JSON file
  • 💾 Smart output: Automatically creates json-output directory with summary file
  • ⚡ High performance: Batch processing and memory optimization
  • 🛡️ Error resilient: Skip unparsable statements and continue processing
  • 📊 Progress tracking: Real-time progress and memory usage
  • 🎯 CLI & Library: Can be used as both CLI tool and JavaScript library

📦 Installation

Use with npx (recommended)

npx sql-to-json-converter database.sql

Global installation

npm install -g sql-to-json-converter
sql-to-json database.sql

Local installation

npm install sql-to-json-converter

🚀 CLI Usage

Separate files mode (default)

# Export each table as separate file in json-output/ directory
npx sql-to-json-converter database.sql

# Specify different output directory
npx sql-to-json-converter database.sql --output-dir my-tables

# With additional options
npx sql-to-json-converter database.sql --memory --batch-size 1000

Combined file mode

# Export everything to a single JSON file
npx sql-to-json-converter database.sql --combined --output result.json

# Export to stdout
npx sql-to-json-converter database.sql --combined

Advanced options

# Process large files with memory monitoring
npx sql-to-json-converter large-db.sql --memory --limit 100000

# Skip unparsable statements (faster processing)
npx sql-to-json-converter database.sql --skip-unparsable

# Custom batch size for performance tuning
npx sql-to-json-converter database.sql --batch-size 2000

📚 Library Usage

Basic usage

const fs = require('fs');
const { convertSQLToJSONFiles, convertSQLToJSON } = require('sql-to-json-converter');

// Read SQL file and convert to separate files
const sqlContent = fs.readFileSync('database.sql', 'utf8');
const result = convertSQLToJSONFiles(sqlContent, 'output-folder');
console.log(`Converted ${result.metadata.totalTables} tables`);

// Or convert to combined JSON
const combined = convertSQLToJSON(sqlContent);
console.log(combined.tables);

Advanced usage

const { SQLToJSONConverter } = require('sql-to-json-converter');

const converter = new SQLToJSONConverter({
  batchSize: 1000,
  showMemory: true,
  outputMode: 'separate',
  outputDir: 'my-json-data'
});

// Process large file with streaming
converter.processLargeSQL('huge-database.sql').then(() => {
  console.log('Conversion completed!');
});

API Reference

// High-level functions
convertSQLToJSON(content, options)           // -> Combined JSON object
convertSQLToJSONFiles(content, outputDir)    // -> Separate files + summary  
processLargeSQLFile(inputFile, outputFile)   // -> Stream processing

// Advanced usage
createConverter(options)                     // -> SQLToJSONConverter instance
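
The two helpers listed above but not shown in the examples, processLargeSQLFile and createConverter, might be wired up as follows. This is a sketch: the call signatures come from the reference above, but the return values (a Promise from the stream processor, a configured converter from the factory) are assumptions.

const { processLargeSQLFile, createConverter } = require('sql-to-json-converter');

// Stream a large dump from disk to disk without holding it in memory.
// Assumed to resolve once the output file has been written.
processLargeSQLFile('huge-database.sql', 'huge-output.json');

// createConverter is assumed to be a thin factory over SQLToJSONConverter,
// accepting the same options object as the constructor.
const converter = createConverter({ batchSize: 2000, skipUnparsable: true });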

📝 Output Examples

Input SQL

CREATE TABLE users (
    id INT PRIMARY KEY AUTO_INCREMENT,
    name VARCHAR(255) NOT NULL,
    email VARCHAR(255) UNIQUE
);

INSERT INTO users VALUES (1, 'John Doe', 'john@example.com');
INSERT INTO users VALUES (2, 'Jane Smith', 'jane@example.com');

CREATE TABLE products (
    id INT PRIMARY KEY,
    name VARCHAR(100),
    price DECIMAL(10,2)
);

INSERT INTO products VALUES (1, 'Laptop', 999.99);
INSERT INTO products VALUES (2, 'Mouse', 25.50);

Separate Files Output (default)

json-output/
├── _summary.json       # Overview of all tables
├── users.json          # User table data
└── products.json       # Product table data

users.json:

{
  "tableName": "users",
  "columns": [
    {"name": "id", "type": "INT PRIMARY KEY AUTO_INCREMENT"},
    {"name": "name", "type": "VARCHAR(255) NOT NULL"},
    {"name": "email", "type": "VARCHAR(255) UNIQUE"}
  ],
  "recordCount": 2,
  "generatedAt": "2024-01-20T10:30:00.000Z",
  "data": [
    {"id": 1, "name": "John Doe", "email": "[email protected]"},
    {"id": 2, "name": "Jane Smith", "email": "[email protected]"}
  ]
}

_summary.json:

{
  "generatedAt": "2024-01-20T10:30:00.000Z",
  "totalTables": 2,
  "totalRecords": 4,
  "tables": [
    {"name": "users", "recordCount": 2, "fileName": "users.json"},
    {"name": "products", "recordCount": 2, "fileName": "products.json"}
  ]
}
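
Combined mode produces a single document instead. The README does not include a verbatim sample, so the shape below is a sketch inferred from the combined.tables access in the library example above:

{
  "generatedAt": "2024-01-20T10:30:00.000Z",
  "tables": {
    "users": { "columns": [ ... ], "recordCount": 2, "data": [ ... ] },
    "products": { "columns": [ ... ], "recordCount": 2, "data": [ ... ] }
  }
}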

🎯 CLI Options

| Option | Description | Default |
|--------|-------------|---------|
| --help, -h | Show help | |
| --version, -v | Show version | |
| --separate | Export separate files (default) | ✅ |
| --combined | Export combined file | |
| --output [file] | Output file for combined mode | |
| --output-dir [dir] | Output directory for separate mode | json-output |
| --memory, -m | Show memory usage | |
| --batch-size [num] | Batch size for processing | 500 |
| --limit [num] | Limit number of statements | |
| --skip-unparsable | Skip unparsable statements | |

🚀 Performance

File Size Guidelines

  • < 10MB: In-memory processing
  • > 10MB: Automatic stream processing
  • > 100MB: Recommended to use --memory flag
  • > 1GB: Recommended to increase --batch-size to 2000+

Memory Optimization

# For very large files (> 1GB)
npx sql-to-json-converter huge-db.sql \
  --memory \
  --batch-size 5000 \
  --skip-unparsable \
  --output-dir large-output

📊 Supported SQL Statements

| Statement | Support | Description |
|-----------|---------|-------------|
| CREATE TABLE | ✅ Full | Table structure, columns, constraints |
| INSERT INTO | ✅ Full | Single and multiple value sets |
| VALUES | ✅ Full | Quoted strings, numbers, NULL |
| DROP TABLE | ✅ Skip | Ignored during processing |
| Comments | ✅ Full | -- line comments |
| Transactions | ✅ Basic | START TRANSACTION, COMMIT |
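
As a concrete check, a line comment, a transaction wrapper, and a multi-row INSERT should all parse in one pass. A minimal sketch using the library API from above (the exact parsed output shape is an assumption):

const { convertSQLToJSON } = require('sql-to-json-converter');

// One string exercising several supported statement types:
// a -- line comment, a transaction wrapper, and a multi-row INSERT.
const sql = `
-- seed data
START TRANSACTION;
CREATE TABLE tags (id INT PRIMARY KEY, label VARCHAR(50));
INSERT INTO tags VALUES (1, 'red'), (2, 'blue');
COMMIT;
`;

const result = convertSQLToJSON(sql);
console.log(result.tables); // expect a single "tags" table with two records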

🛠 Development

Setup

git clone <repo-url>
cd sql-to-json-converter
npm install

Testing

# Create test SQL file
echo "CREATE TABLE test (id INT); INSERT INTO test VALUES (1);" > test.sql

# Test CLI
npm start test.sql

# Test library
node -e "
const {convertSQLToJSONFiles} = require('./index');
const fs = require('fs');
const sql = fs.readFileSync('test.sql', 'utf8');
console.log(convertSQLToJSONFiles(sql));
"

Publishing

# Update version
npm version patch|minor|major

# Publish to npm
npm publish

⚙️ Configuration Options

const options = {
  batchSize: 1000,        // Processing batch size
  showMemory: true,       // Show memory usage
  limit: 50000,           // Max statements to process
  skipUnparsable: true,   // Skip invalid statements
  outputMode: 'separate', // 'separate' or 'combined'
  outputDir: 'my-output'  // Output directory name
};
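
These are the same options accepted by the SQLToJSONConverter constructor from the Library Usage section, so the object above can be passed straight through; for example:

const { SQLToJSONConverter } = require('sql-to-json-converter');

const converter = new SQLToJSONConverter(options);

// processLargeSQL streams the input; with outputMode 'separate' the
// results should land in my-output/ alongside _summary.json.
converter.processLargeSQL('database.sql').then(() => {
  console.log('Conversion completed!');
});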

🐛 Troubleshooting

Common Issues

1. Memory errors with large files

# Reduce batch size and enable memory monitoring
npx sql-to-json-converter large-file.sql --batch-size 200 --memory

2. Unparsable statements

# Skip invalid statements
npx sql-to-json-converter problematic.sql --skip-unparsable

3. Too slow with very large files

# Increase batch size and skip unparsable
npx sql-to-json-converter huge.sql --batch-size 2000 --skip-unparsable

📄 License

MIT License

🤝 Contributing

  1. Fork the repository
  2. Create feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open Pull Request

📞 Support