n8n-nodes-postgresql-copy
A custom n8n community node by Firman Hanafi that runs PostgreSQL COPY TO / COPY FROM with streaming (CSV/TSV/custom delimiter) for bulk export and import.
Features
- COPY TO: export SELECT results to CSV/TSV/custom delimiter; header, quote/null, encoding options.
- Chunked Export: split a single COPY stream into multiple files by max rows per file.
- COPY FROM: import CSV/TSV/custom delimiter from a binary property; header, column mapping, dry-run, skip errors.
- Streaming: avoids loading the whole file into memory (see the export sketch after this list).
- Uses n8n's built-in Postgres credentials.
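The node's internals aren't spelled out in this README, but the streaming behaviour can be pictured with node-postgres and pg-copy-streams (assumed dependencies here, chosen purely for illustration). A minimal sketch of a streaming CSV export:
```typescript
// Minimal streaming COPY TO sketch. Assumes the `pg` and `pg-copy-streams`
// packages; an illustration of the technique, not this node's actual code.
import { Client } from 'pg';
import { to as copyTo } from 'pg-copy-streams';
import { createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

async function exportCsv(connectionString: string, query: string, file: string): Promise<void> {
  const client = new Client({ connectionString });
  await client.connect();
  try {
    // COPY (...) TO STDOUT hands back a readable stream: rows flow through
    // in chunks, so the full result set is never held in memory at once.
    const source = client.query(
      copyTo(`COPY (${query}) TO STDOUT WITH (FORMAT csv, HEADER true)`),
    );
    await pipeline(source, createWriteStream(file));
  } finally {
    await client.end();
  }
}

// Hypothetical usage:
// exportCsv(process.env.DATABASE_URL!, 'SELECT * FROM users', 'export-1.csv');
```
Piping the COPY stream straight into a file (or into an n8n binary property) is what keeps memory flat regardless of row count.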
Installation
- n8n UI: Settings → Community Nodes → install n8n-nodes-postgresql-copy.
- CLI (in your n8n directory):
npm install n8n-nodes-postgresql-copy
Node Configuration
- Credentials: select Postgres (host, port, database, user, password, ssl).
- Operation:
- Copy To (Export): set Query, Output Format (csv/tsv/custom), File Name, Binary Property Name, Max Rows Per File (0 = unlimited), plus header/quote/null/encoding options.
- Copy From (Import): set Table Name, Input Binary Field, Input Format (csv/tsv/custom), Has Header, Column Mapping, plus quote/null/skipErrors/dryRun options (see the import sketch below).
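For the import direction the same streaming idea applies. A minimal sketch under the same assumed pg / pg-copy-streams dependencies, with the columns array standing in for Column Mapping:
```typescript
// Minimal streaming COPY FROM sketch (same assumed `pg` / `pg-copy-streams`
// dependencies as above); `columns` plays the role of Column Mapping.
import { Client } from 'pg';
import { from as copyFrom } from 'pg-copy-streams';
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

async function importCsv(
  client: Client,
  table: string,
  columns: string[],
  data: Buffer, // e.g. the contents of the Input Binary Field
): Promise<void> {
  const target = client.query(
    copyFrom(`COPY ${table} (${columns.join(', ')}) FROM STDIN WITH (FORMAT csv, HEADER true)`),
  );
  // Wrap the buffer in a one-element array so it is emitted as a single
  // binary chunk; CSV parsing then happens server-side inside PostgreSQL.
  await pipeline(Readable.from([data]), target);
}
```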
Output
- Copy To: JSON {rowCount, fileSize, executionTimeMs, fileName, format} plus a binary file (with mimeType, fileName, fileExtension).
- Chunked Export: JSON additionally includes {totalChunks, chunkNumber, totalRows} per file (computed after the stream completes); returns multiple output items.
- Copy From: JSON {success, table, rowsImported, rowsSkipped, errors, executionTimeMs, dryRun}.
Chunked Export (Copy To)
When Max Rows Per File is set to a value > 0, the export is split into multiple files:
- Single COPY stream: No LIMIT/OFFSET pagination and no pre-COUNT(*) query.
- No duplicates: Rows are read once from the stream and split by row count (sketched after the example below).
- Naming: Files are named sequentially: export-1.csv, export-2.csv, etc.
- Headers: Each chunk file includes headers (if enabled).
- Ordering: Output order follows the query result. Add ORDER BY if you need deterministic ordering.
- Atomicity: If the stream fails, the operation fails (no partial results).
Example Use Case:
- Export 800,000 rows with Max Rows Per File = 100000
- Result: 8 output items, each with a CSV file (export-1.csv through export-8.csv)
- Each JSON includes: {chunkNumber: 1, totalChunks: 8, totalRows: 800000, rowCount: 100000, ...}
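The splitting step can be pictured as a line-based re-chunker over that single COPY stream. A hypothetical sketch of the idea (not the node's actual code), assuming no embedded newlines inside quoted CSV fields, which a real splitter would also have to handle:
```typescript
// Hypothetical chunk-splitting sketch: read one COPY TO stream line by line,
// cut after every `maxRows` data rows, and re-emit the header at the top of
// each chunk. Assumes no newlines inside quoted fields.
import { createInterface } from 'node:readline';
import type { Readable } from 'node:stream';

interface Chunk {
  fileName: string;
  content: string;
  chunkNumber: number;
}

async function* splitByRows(
  source: Readable,
  maxRows: number,
  hasHeader: boolean,
): AsyncGenerator<Chunk> {
  let header: string | undefined;
  let rows: string[] = [];
  let chunkNumber = 0;

  const flush = (): Chunk => {
    chunkNumber += 1;
    return {
      fileName: `export-${chunkNumber}.csv`,
      content: (header !== undefined ? header + '\n' : '') + rows.join('\n') + '\n',
      chunkNumber,
    };
  };

  for await (const line of createInterface({ input: source, crlfDelay: Infinity })) {
    if (hasHeader && header === undefined) {
      header = line; // captured once, re-emitted per chunk
      continue;
    }
    rows.push(line);
    if (rows.length >= maxRows) {
      yield flush();
      rows = []; // only one chunk is ever buffered at a time
    }
  }
  if (rows.length > 0) yield flush(); // final partial chunk
}
```
Because the stream is consumed exactly once, totalChunks and totalRows are only known after it ends, which is why the chunk metadata above is computed after the stream completes.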
Notes
- For a custom delimiter, fill Custom Delimiter.
- Dry Run rolls back the transaction (no data is written; see the sketch below).
- Ensure resources are sufficient for large files; COPY uses streaming but still needs memory headroom.
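The dry-run behaviour can be sketched as an explicit transaction that is always rolled back; an illustration only, reusing the assumed pg client from the sketches above:
```typescript
// Dry-run sketch: run the import inside a transaction, then roll back so the
// data is parsed and constraint-checked by PostgreSQL but never committed.
import type { Client } from 'pg';

async function dryRunImport(
  client: Client,
  runCopy: (c: Client) => Promise<void>, // e.g. the importCsv() sketch above
): Promise<void> {
  await client.query('BEGIN');
  try {
    await runCopy(client);
    await client.query('ROLLBACK'); // dry run: nothing is persisted
  } catch (err) {
    await client.query('ROLLBACK'); // failed mid-stream: discard partial rows
    throw err;
  }
}
```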
Docs
- Sequence diagram for chunked export: docs/sequence-copyto-chunked.puml
Development
npm install
npm run build
npm test # requires devDependencies
License
MIT
