# n8n-nodes-motherduck
This is an n8n community node that lets you use MotherDuck (DuckDB in the cloud) in your n8n workflows.
MotherDuck is a serverless analytics platform built on DuckDB. It allows you to run SQL queries on your data without managing infrastructure.
n8n is a fair-code licensed workflow automation platform.
**Important:** Only `n8n:latest-debian` is supported.

This node uses the `@duckdb/node-api` package, which includes native bindings compiled for glibc-based systems. Alpine Linux uses musl libc instead, which is incompatible with these bindings. If you're running n8n on Alpine Linux (common in Docker images), use a glibc-based distribution such as Debian or Ubuntu instead.
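For Docker users this usually just means pinning the Debian-based tag. A minimal `docker-compose.yml` sketch (the `docker.n8n.io/n8nio/n8n` image name is n8n's standard Docker distribution; verify the tag against n8n's docs):

```yaml
# Use the Debian-based tag; the default n8n image may be Alpine-based (musl).
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n:latest-debian
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
volumes:
  n8n_data:
```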
## Operations

### Row Operations
- Get: Query rows from a table with optional filters
- Insert: Insert rows into a table
- Upsert: Insert or update rows based on matching columns (uses MERGE)
- Delete: Delete rows matching specified conditions
### Table Operations
- List: List all tables in a database/schema
- Import from File: Create or load data from CSV, JSON, or Parquet files
- Insert from Data: Batch insert an array of objects into a table (create or append)
- Upsert from Data: Batch upsert an array of objects into a table
## Credentials
You need a MotherDuck access token to use this node. You can create one at app.motherduck.com/settings/tokens.
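Programmatically, a MotherDuck connection is usually opened with an `md:` connection string that carries the token. A minimal sketch of the string format (per MotherDuck's documented `md:` scheme and `motherduck_token` parameter; verify against current docs before relying on it):

```python
def build_connection_string(database: str, token: str) -> str:
    # "md:<database>?motherduck_token=<token>" is MotherDuck's documented
    # connection-string shape; treat this as illustrative, not authoritative.
    return f"md:{database}?motherduck_token={token}"

print(build_connection_string("my_db", "<YOUR_TOKEN>"))
# → md:my_db?motherduck_token=<YOUR_TOKEN>
```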
## Installation
Follow the installation guide in the n8n community nodes documentation.
### npm

```bash
npm install n8n-nodes-motherduck
```

### Manual Installation

- Clone this repository
- Run `npm install`
- Run `npm run build`
- Copy the `dist` folder to your n8n custom nodes directory
## Usage

### Get Rows
Query data from a MotherDuck table:
- Select Row resource and Get operation
- Choose the database, schema, and table
- Optionally add filter conditions
- Set limit, order by, or return all rows
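Conceptually, these options compose into a single SELECT statement. A hypothetical sketch of that translation (not the node's actual implementation, which lives in TypeScript):

```python
def build_select(database, schema, table, filters=None, order_by=None, limit=None):
    """Sketch of the SELECT a Get operation might issue (illustrative only)."""
    sql = f'SELECT * FROM "{database}"."{schema}"."{table}"'
    if filters:
        # Parameter placeholders keep filter values out of the SQL text.
        sql += " WHERE " + " AND ".join(f'"{col}" = ?' for col in filters)
    if order_by:
        sql += f' ORDER BY "{order_by}"'
    if limit is not None:
        sql += f" LIMIT {int(limit)}"
    return sql

print(build_select("my_db", "main", "users", {"status": "active"}, "created_at", 100))
# → SELECT * FROM "my_db"."main"."users" WHERE "status" = ? ORDER BY "created_at" LIMIT 100
```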
### Insert Rows
Insert rows from your workflow into a table:
Using an existing table:
- Select Row resource and Insert operation
- Choose the database, schema, and table
- Optionally specify an Input Data Field to map from a specific field in your input (e.g., `data` or `user.profile`); leave empty to use the entire input
- Map columns using the resource mapper (auto-map or define manually)
Creating a new table:
- Select Row resource and Insert operation
- Set Table Mode to "Create New Table"
- Enter the new table name
- Optionally specify an Input Data Field to map from a specific field
- Optionally define column types in Column Definitions (defaults to VARCHAR for all columns)
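The default-to-VARCHAR behavior for new tables can be pictured like this (hypothetical sketch; in the real node, column handling is done by the resource mapper):

```python
def build_create_and_insert(table, rows, column_types=None):
    """Sketch: CREATE TABLE (VARCHAR by default) plus a parameterized INSERT."""
    column_types = column_types or {}
    columns = list(rows[0].keys())
    col_defs = ", ".join(f'"{c}" {column_types.get(c, "VARCHAR")}' for c in columns)
    col_list = ", ".join(f'"{c}"' for c in columns)
    placeholders = ", ".join("?" for _ in columns)
    create = f'CREATE TABLE "{table}" ({col_defs})'
    insert = f'INSERT INTO "{table}" ({col_list}) VALUES ({placeholders})'
    return create, insert

create, insert = build_create_and_insert(
    "movies", [{"title": "Up", "year": 2009}], column_types={"year": "INTEGER"}
)
print(create)  # → CREATE TABLE "movies" ("title" VARCHAR, "year" INTEGER)
print(insert)  # → INSERT INTO "movies" ("title", "year") VALUES (?, ?)
```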
### Upsert Rows
Insert new rows or update existing ones based on matching columns:
Using an existing table:
- Select Row resource and Upsert operation
- Choose the database, schema, and table
- Optionally specify an Input Data Field to map from a specific field in your input
- Select which columns to match on using the resource mapper (no unique constraint required - uses MERGE)
- Map the values to insert/update
Creating a new table:
- Select Row resource and Upsert operation
- Set Table Mode to "Create New Table"
- Enter the new table name
- Specify Match Columns as a comma-separated list (e.g., `id, email`)
- Optionally specify an Input Data Field and Column Definitions
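Because the upsert is MERGE-based, no unique constraint is needed on the match columns. A rough sketch of the statement shape (simplified DuckDB-style MERGE; `staging` is a hypothetical source relation, and the node's actual SQL may differ):

```python
def build_merge(table, match_columns, columns):
    """Sketch of a MERGE-based upsert (simplified, illustrative only)."""
    on = " AND ".join(f"t.{c} = s.{c}" for c in match_columns)
    updates = ", ".join(f"{c} = s.{c}" for c in columns if c not in match_columns)
    insert_cols = ", ".join(columns)
    insert_vals = ", ".join(f"s.{c}" for c in columns)
    return (
        f"MERGE INTO {table} AS t USING staging AS s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

print(build_merge("users", ["id"], ["id", "email", "name"]))
```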
### Delete Rows
Delete rows matching specified conditions:
- Select Row resource and Delete operation
- Choose the database, schema, and table
- Add filter conditions (at least one required)
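The at-least-one-filter requirement guards against accidentally deleting every row. A hypothetical sketch of the statement and the guard:

```python
def build_delete(database, schema, table, filters):
    """Sketch of the DELETE a Delete operation might issue (illustrative only)."""
    if not filters:
        # Refusing an empty filter set prevents an accidental full-table delete.
        raise ValueError("At least one filter condition is required")
    where = " AND ".join(f'"{col}" = ?' for col in filters)
    return f'DELETE FROM "{database}"."{schema}"."{table}" WHERE {where}'

print(build_delete("my_db", "main", "users", {"status": "inactive"}))
# → DELETE FROM "my_db"."main"."users" WHERE "status" = ?
```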
### List Tables
List all tables in a database:
- Select Table resource and List operation
- Choose the database
- Optionally filter by schema
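One standard way to enumerate tables in DuckDB-compatible engines is the `information_schema` catalog; a hypothetical sketch (the node's actual query may differ):

```python
def build_list_tables(database, schema=None):
    """Sketch of a table-listing query via information_schema (illustrative)."""
    sql = f'SELECT table_schema, table_name FROM "{database}".information_schema.tables'
    if schema:
        sql += " WHERE table_schema = ?"
    return sql

print(build_list_tables("my_db", schema="main"))
# → SELECT table_schema, table_name FROM "my_db".information_schema.tables WHERE table_schema = ?
```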
### Import from File
Create or append to a table from external files:
- Select Table resource and Import from File operation
- Choose the database and schema
- Enter the target table name
- Enter the file URL(s):
  - Single file: `https://example.com/data.csv`
  - Multiple files: `https://example.com/file1.csv, https://example.com/file2.csv`
  - Glob pattern: `https://example.com/data/*.parquet`
  - S3: `s3://bucket/path/file.json`
- Select format (auto-detect, CSV, JSON, or Parquet)
- Choose import mode:
  - Create Table: Create new table (fails if exists)
  - Create or Replace: Drop and recreate table
  - Create If Not Exists: Only create if table doesn't exist
  - Append: Insert into existing table
- For CSV files, configure delimiter, header, quote character, etc.
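In DuckDB terms, a file import of this kind maps naturally onto the `read_csv` / `read_json` / `read_parquet` table functions combined with CTAS or INSERT. A simplified sketch (option rendering is naive; the node's real SQL may differ):

```python
def build_import(table, url, fmt="csv", mode="create", **options):
    """Sketch of a file-import statement (simplified, illustrative only)."""
    readers = {"csv": "read_csv", "json": "read_json", "parquet": "read_parquet"}
    opts = "".join(f", {key}={value!r}" for key, value in options.items())
    source = f"SELECT * FROM {readers[fmt]}('{url}'{opts})"
    prefixes = {
        "create": f'CREATE TABLE "{table}" AS ',
        "replace": f'CREATE OR REPLACE TABLE "{table}" AS ',
        "if_not_exists": f'CREATE TABLE IF NOT EXISTS "{table}" AS ',
        "append": f'INSERT INTO "{table}" ',
    }
    return prefixes[mode] + source

print(build_import("sales", "https://example.com/data.csv", mode="replace", delim=","))
# → CREATE OR REPLACE TABLE "sales" AS SELECT * FROM read_csv('https://example.com/data.csv', delim=',')
```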
### Insert from Data (Batch Insert)
Batch insert an array of objects into a table. Perfect for processing aggregated data from previous nodes:
- Select Table resource and Insert from Data operation
- Choose the database and schema
- Enter the target table name
- Specify the Data Field containing your array (e.g., `movies` or `data.items`), or leave empty to use input items directly
- Choose mode:
  - Create Table: Create new table (fails if exists)
  - Create or Replace: Drop and recreate table
  - Create If Not Exists: Only create if table doesn't exist
  - Append: Insert into existing table
- Optionally define column types in Column Definitions
Example: a workflow fetches 200 movies from an API, aggregates them into a single item with a `movies` array, then uses Insert from Data to bulk-insert all 200 movies in one operation.
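The Data Field option accepts dot paths like `data.items`. Resolving such a path against an input item can be pictured as (hypothetical resolver, not the node's actual code):

```python
def resolve_data_field(item, path):
    """Resolve a dot-separated path such as 'data.items' against an input item."""
    if not path:
        # Empty path means: use the entire input item.
        return item
    value = item
    for key in path.split("."):
        value = value[key]
    return value

item = {"data": {"items": [{"id": 1}, {"id": 2}]}}
print(resolve_data_field(item, "data.items"))  # → [{'id': 1}, {'id': 2}]
```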
### Upsert from Data (Batch Upsert)
Batch upsert an array of objects into a table:
- Select Table resource and Upsert from Data operation
- Choose the database and schema
- Enter the target table name
- Specify the Data Field containing your array
- Enter Match Columns (comma-separated, e.g., `id, email`)
- Choose mode:
  - Create or Replace Then Upsert: Drop and recreate table, then upsert
  - Create If Not Exists Then Upsert: Create table if needed, then upsert
  - Upsert Only: Table must exist
- Optionally define column types
## Development

### Quick Start with Docker
The easiest way to test the node locally is with Docker:
```bash
# Install dependencies and build
make install
make build

# Start n8n with the custom node
make start

# View logs
make logs

# Stop n8n
make stop
```

Then open http://localhost:5678 in your browser. The MotherDuck node will be available in the nodes panel.
### Development Workflow
```bash
# Terminal 1: Watch for TypeScript changes
make dev

# Terminal 2: Run n8n (restart after changes)
make start
```

After making changes to the node code:

- The TypeScript watcher will recompile automatically
- Restart n8n to pick up changes: `make restart`
### Manual Setup
```bash
# Install dependencies
npm install

# Build the node
npm run build

# Run in development mode (watch)
npm run dev

# Lint
npm run lint
```

### Makefile Commands
| Command | Description |
|---------|-------------|
| `make install` | Install npm dependencies |
| `make build` | Build TypeScript to `dist/` |
| `make dev` | Watch mode for development |
| `make start` | Build and start the n8n container |
| `make stop` | Stop the n8n container |
| `make restart` | Full restart (stop, build, start) |
| `make logs` | View n8n container logs |
| `make clean` | Remove containers and `dist/` |
| `make rebuild` | Rebuild the Docker image and start |
