PostgreSQL Import 🚀 - Transfer data between PostgreSQL databases easily.
Description
Node script to transfer data between PostgreSQL databases easily. 🛠️
How to use
Step 1: Install PostgreSQL v12 or higher 🐘
This is required for the --rows-per-insert option of pg_dump to work.
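If you already have PostgreSQL installed, a quick way to confirm that your client tools are version 12 or newer is:

pg_dump --version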
Step 2: Add the environment variables 🌐
Add the path to your PostgreSQL "bin" folder to the "Path" system environment variable.
Note: in my case, it is located in "C:\Program Files\PostgreSQL\12\bin".
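After updating Path, open a new terminal and confirm that the PostgreSQL tools can be found (on Linux/macOS, use which instead of where):

where pg_dump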
Step 3: Install Node.js (if you don't have it) 🟢
Download it from the official Node.js website (https://nodejs.org).
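You can confirm that Node.js and npm are available by running:

node --version
npm --version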
Step 4: Install libraries 📦
Run the command below (only if you haven't initialized npm in your project yet):

npm init

Then run:

npm install @rafaelcecchin/pg-import

Step 5: Configure databases 🗄️
Create a file called pg-import.config.js in the root of your project.
Below is an example:
module.exports = {
  'db': {
    'erp-prod': {
      name: process.env.PROD_ERP_NAME,
      port: process.env.PROD_ERP_PORT,
      host: process.env.PROD_ERP_HOST,
      user: process.env.PROD_ERP_USER,
      pass: process.env.PROD_ERP_PASS
    },
    'erp-dev': {
      name: process.env.DEV_ERP_NAME,
      port: process.env.DEV_ERP_PORT,
      host: process.env.DEV_ERP_HOST,
      user: process.env.DEV_ERP_USER,
      pass: process.env.DEV_ERP_PASS
    }
  },
  'import': {
    'import1': {
      'source': 'erp-prod',
      'destination': 'erp-dev',
      'tables': [
        'rotas'
      ],
      'create-db': true,
      'encode': 'LATIN1',
      'rows-per-insert': 5000,
      'template': 'template0',
      'lc-collate': 'C',
      'lc-ctype': 'C'
    },
    'import2': {
      'source': 'erp-prod',
      'destination': 'erp-dev',
      'tables': [
        'clientes'
      ],
      'create-db': false,
      'encode': 'LATIN1',
      'rows-per-insert': 5000
    }
  }
}
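The configuration above reads connection details from process.env, so those variables must be defined in the environment where the script runs. For local development, one option is to keep them in a .env file and load it at the top of pg-import.config.js; this is only a sketch of that approach and assumes the dotenv package (npm install dotenv), since pg-import itself is not documented to load .env files:

// Optional, at the very top of pg-import.config.js:
// loads variables from a local .env file into process.env (assumes dotenv is installed).
require('dotenv').config()

Alternatively, set the variables directly in your shell, CI environment, or system settings before running the import.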
Import args 📋

source: Source database
destination: Destination database
tables: Tables to transfer
create-db: Create the destination database
encode: Set the database encoding
template: Set the database template
lc-collate: Set the database collation
lc-ctype: Set the database ctype
ignore: Tables to be ignored during the export
before-schema: Scripts to run before the schema import
after-schema: Scripts to run after the schema import
before-data: Scripts to run before the data import
after-data: Scripts to run after the data import
rows-per-insert: Number of rows per INSERT statement
only-restore: Only restore, without dumping
only-dump: Only dump, without restoring
rm: Automatically remove the backup files
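As a rough illustration of some of the remaining arguments, here is a sketch of an additional import entry. The keys are the ones listed above, but the table names are placeholders and the value shapes for ignore, before-data, after-data, and rm are assumptions (for instance, that the script arguments take paths to SQL files and that rm is a boolean); check the package documentation or source for the exact formats it expects:

'import3': {
  'source': 'erp-prod',
  'destination': 'erp-dev',
  'tables': [
    'pedidos'
  ],
  // Assumption: tables to skip during the export.
  'ignore': [
    'pedidos_log'
  ],
  // Assumption: SQL scripts executed before/after the data import.
  'before-data': [
    './sql/before-data.sql'
  ],
  'after-data': [
    './sql/after-data.sql'
  ],
  'create-db': false,
  'encode': 'LATIN1',
  'rows-per-insert': 5000,
  // Assumption: automatically remove the generated backup files when done.
  'rm': true
}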
Step 6: Make the transfers 🔄
Now that everything is configured, you can run the script from your terminal to transfer data between databases.
node node_modules/@rafaelcecchin/pg-import/pg-import.js

Alternatively, you can streamline the process by adding the following script to your package.json:
"scripts": {
"import": "node node_modules/@rafaelcecchin/pg-import/pg-import.js"
}Then, you can run the transfer command with:
npm run import

Final considerations 💡
Although this script makes data transfer between PostgreSQL databases easier, there is still room for improvement. Contributions to this small project are welcome; feel free to open a pull request. 🤝
