# @arraypress/csv
CSV generation for edge runtimes. String output for small datasets, streaming for large ones, and a ready-made Response helper for Cloudflare Workers and Hono routes.
Zero dependencies. Uses Web Streams API — works in Cloudflare Workers, Node.js 18+, Deno, Bun, and browsers.
## Installation

```bash
npm install @arraypress/csv
```

## Usage
### Quick Export (small datasets)

```typescript
import { csvResponse } from '@arraypress/csv';

app.get('/api/export/customers', async (c) => {
  const data = await db.prepare('SELECT * FROM customers').all();

  return csvResponse({
    filename: 'customers.csv',
    columns: ['ID', 'Email', 'Name', 'Country', 'Orders'],
    rows: data.results.map((r) => [r.id, r.email, r.name, r.country, r.order_count]),
  });
});
```

### Streaming Export (large datasets)
```typescript
import { csvResponse } from '@arraypress/csv';

app.get('/api/export/orders', async (c) => {
  return csvResponse({
    filename: 'orders.csv',
    columns: ['Order #', 'Email', 'Amount', 'Currency', 'Status', 'Date'],
    stream: async function* () {
      let offset = 0;
      while (true) {
        const batch = await db.prepare('SELECT * FROM orders LIMIT 500 OFFSET ?')
          .bind(offset).all();
        if (!batch.results.length) break;
        for (const o of batch.results) {
          yield [o.order_number, o.email, o.amount, o.currency, o.status, o.created_at];
        }
        offset += 500;
      }
    },
  });
});
```

### Build CSV String
```typescript
import { createCSV } from '@arraypress/csv';

const csv = createCSV({
  columns: ['Name', 'Email', 'Amount'],
  rows: [
    ['Alice', 'alice@example.com', 1999],
    ['Bob', 'bob@example.com', 2500],
  ],
});
// "Name,Email,Amount\nAlice,alice@example.com,1999\nBob,bob@example.com,2500\n"
```

## API
### csvResponse({ filename, columns, rows?, stream?, includeHeader? })

Creates a complete Response with the CSV data, Content-Type, and Content-Disposition headers, ready to return from any Worker/Hono route.

- `filename` — download filename (e.g. `'orders.csv'`)
- `columns` — header row
- `rows` — array of row arrays (for small datasets)
- `stream` — async generator yielding row arrays (for large datasets)
- `includeHeader` — include the header row (default `true`)

If both `rows` and `stream` are provided, `stream` takes precedence.
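Because `csvResponse` returns a plain Response, an export route is easy to unit-test by reading the response back. A sketch of that pattern, using a hand-built Response as a stand-in for the library's output (the header values shown are the conventional CSV-download headers, assumed here for illustration, not verified against the library):

```typescript
// Stand-in for the Response that csvResponse produces. The exact header
// values are assumptions: conventional headers for a CSV download.
const res = new Response('ID,Email\n1,alice@example.com\n', {
  headers: {
    'Content-Type': 'text/csv; charset=utf-8',
    'Content-Disposition': 'attachment; filename="customers.csv"',
  },
});

// Read the body back as text, as a test (or any caller) would.
const body = await res.text();
// body === 'ID,Email\n1,alice@example.com\n'

// Inspect the download filename advertised to the browser.
const disposition = res.headers.get('Content-Disposition');
// 'attachment; filename="customers.csv"'
```

This works anywhere the Fetch API's `Response` is available, which matches the runtimes listed above (Workers, Node.js 18+, Deno, Bun, browsers).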
### createCSV({ columns, rows })

Returns a CSV string. Best for small datasets, or when you need the string for something other than a Response (an email attachment, a file write, etc.).
### createCSVStream({ columns, stream })

Returns a ReadableStream. Best for large datasets where you need the stream directly (piping it to another stream, etc.).
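Since `createCSVStream` returns a standard Web ReadableStream, it can be consumed with the usual Streams API. A sketch of draining such a stream to a string, using a hand-built stream of CSV chunks as a stand-in for the library's output:

```typescript
// Stand-in for createCSVStream's output: a ReadableStream of CSV text
// chunks, built manually here purely for illustration.
function streamFromChunks(chunks: string[]): ReadableStream<string> {
  let i = 0;
  return new ReadableStream({
    pull(controller) {
      if (i < chunks.length) controller.enqueue(chunks[i++]);
      else controller.close();
    },
  });
}

// Drain any ReadableStream of strings into one string,
// e.g. for tests or for writing to a file.
async function streamToString(stream: ReadableStream<string>): Promise<string> {
  let out = '';
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += value;
  }
  return out;
}

const csv = await streamToString(
  streamFromChunks(['Name,Email\n', 'Alice,alice@example.com\n'])
);
// csv === 'Name,Email\nAlice,alice@example.com\n'
```

In a Worker you would more often hand the stream straight to a Response body rather than buffer it, which is what `csvResponse` does for you.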
## Escaping

All values are automatically escaped per RFC 4180:

- Values containing commas are wrapped in quotes: `"Smith, John"`
- Values containing quotes have the quotes doubled: `"Say ""hello"""`
- Values containing newlines are wrapped in quotes
- `null` and `undefined` become empty strings
- Numbers and booleans are converted to strings
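The rules above can be sketched as a small helper. This is illustrative only, not the library's internal implementation:

```typescript
// Illustrative RFC 4180 escaping, mirroring the rules listed above.
function escapeValue(value: unknown): string {
  if (value === null || value === undefined) return ''; // null/undefined -> ''
  const s = String(value); // numbers and booleans become strings
  // Quote when the value contains a comma, quote, or newline;
  // embedded quotes are doubled.
  if (/[",\n\r]/.test(s)) {
    return `"${s.replace(/"/g, '""')}"`;
  }
  return s;
}

escapeValue('Smith, John');  // '"Smith, John"'
escapeValue('Say "hello"');  // '"Say ""hello"""'
escapeValue(null);           // ''
escapeValue(42);             // '42'
```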
## License
MIT
