
sftp-push-sync

v2.1.5

Published

SFTP sync tool for Hugo projects (local to remote, with hash cache)

Readme

SFTP Synchronisation Tool

Implements a push synchronisation with dry-run support. It performs the following tasks:

  1. Upload new files
  2. Delete remote files that no longer exist locally
  3. Identify changes based on size or altered content and upload them

Features

  • multiple connections in sync.config.json
  • dry-run mode
  • mirrors local → remote
  • adds, updates, deletes files
  • text diff detection for text files
  • SHA-256 hash comparison for binary files (images, video, audio, PDF, etc.)
  • hashes cached in .sync-cache.*.json
  • parallel uploads/deletions via worker pool
  • include/exclude patterns
  • sidecar uploads/downloads that bypass the sync process
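
The hash-based change detection for binary files can be sketched roughly like this (a minimal illustration, assuming a simple path→digest cache object; this is not the package's actual code):

```javascript
// Sketch: a binary file is re-uploaded only when its SHA-256 digest differs
// from the cached one. Assumption: `cache` maps file path → last-known digest.
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

export async function fileChanged(path, cache) {
  const data = await readFile(path);
  const digest = createHash("sha256").update(data).digest("hex");
  const changed = cache[path] !== digest; // new file or altered content
  cache[path] = digest;                   // remember for the next run
  return changed;
}
```

On the first run every file counts as changed because the cache is empty, which is why an initial sync takes noticeably longer than later ones.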

The file sftp-push-sync.mjs is pure JavaScript (ESM), not TypeScript. Node.js can execute it directly as long as "type": "module" is specified in package.json or the file has the extension .mjs.

Breaking changes in 2.0.0

  • The flags --upload-list / --download-list have been replaced by --sidecar-upload / --sidecar-download.
  • The settings for sidecars are now located in the sidecar block of the connection.

Install

npm i -D sftp-push-sync
# or
npm install --save-dev sftp-push-sync
# or
yarn add --dev sftp-push-sync
# or
pnpm add -D sftp-push-sync

Setup

Create a sync.config.json in the root folder of your project:

{
  "connections": {
    "prod": {
      "host": "your.host.net",
      "port": 23,
      "user": "ftpuser",
      "password": "mypassword",
      "syncCache": ".sync-cache.prod.json",
      "worker": 3,
      "sync": {
        "localRoot": "public",
        "remoteRoot": "/folder/"
      },
      "sidecar": {
        "localRoot": "sidecar-local",
        "remoteRoot": "/sidecar-remote/",
        "uploadList": [],
        "downloadList": []
      }
    },
    "staging": {
      "host": "ftpserver02",
      "port": 22,
      "user": "ftp_user",
      "password": "total_secret",
      "syncCache": ".sync-cache.staging.json",
      "worker": 1,
      "sync": {
        "localRoot": "public",
        "remoteRoot": "/web/my-page/"
      },
      "sidecar": {
        "localRoot": "sidecar-local",
        "remoteRoot": "/sidecar-remote/",
        "uploadList": [],
        "downloadList": []
      }
    }
  },
  "parallelScan": true,
  "cleanupEmptyDirs": true,
  "include": [],
  "exclude": ["**/.DS_Store", "**/.git/**", "**/node_modules/**"],
  "textExtensions": [".shtml",".xml",".txt",".json",".js",".css",".md",".svg"],
  "mediaExtensions": [".jpg",".jpeg",".png",".webp",".gif",".avif",".tif",".tiff",".mp4",".mov",".m4v","mp3",".wav",".flac"],
  "progress": {
    "scanChunk": 10,
    "analyzeChunk": 1
  },
  "logLevel": "normal",
  "logFile": ".sftp-push-sync.{target}.log"
}
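
To illustrate how the target name given on the command line selects one of these connection blocks, here is a rough sketch (the loadConnection helper is hypothetical, not part of the package):

```javascript
// Hypothetical helper: pick one connection block from sync.config.json.
// Assumption: top-level keys such as include/exclude act as shared defaults,
// while per-connection settings take precedence.
export function loadConnection(configText, target) {
  const config = JSON.parse(configText);
  const conn = config.connections[target];
  if (!conn) {
    const known = Object.keys(config.connections).join(", ");
    throw new Error(`Unknown target "${target}" (known targets: ${known})`);
  }
  return { include: config.include, exclude: config.exclude, ...conn };
}
```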

CLI Usage

# Normal synchronisation
node bin/sftp-push-sync.mjs staging

# Normal synchronisation + sidecar upload list
node bin/sftp-push-sync.mjs staging --sidecar-upload

# Normal synchronisation + sidecar download list
node bin/sftp-push-sync.mjs staging --sidecar-download

# Only sidecar lists, no standard synchronisation
node bin/sftp-push-sync.mjs staging --skip-sync --sidecar-upload
node bin/sftp-push-sync.mjs staging --skip-sync --sidecar-download

# Optional: dry-run the sidecar lists only
node bin/sftp-push-sync.mjs staging --skip-sync --sidecar-upload --dry-run

These commands can be conveniently started via the scripts in package.json:

# For example
npm run sync:staging
# or, using the short alias
npm run ss

If you have stored the scripts in package.json as follows:

"scripts": {
  "sync:staging": "sftp-push-sync staging",
  "sync:staging:dry": "sftp-push-sync staging --dry-run",
  "ss": "npm run sync:staging",
  "ssd": "npm run sync:staging:dry",

  "sync:prod": "sftp-push-sync prod",
  "sync:prod:dry": "sftp-push-sync prod --dry-run",
  "sp": "npm run sync:prod",
  "spd": "npm run sync:prod:dry"
}
The dry run is a great way to compare files and fill the cache.

How it works

The synchronisation runs in seven phases:

  • Phase 1: Scan local files
  • Phase 2: Scan remote files
  • Phase 3: Compare & decide
  • Phase 4: Remove orphaned remote files
  • Phase 5: Prepare remote directories
  • Phase 6: Apply changes
  • Phase 7: Clean up empty remote directories

Phases 1 and 2 can optionally run in parallel. Phase 6 always runs in parallel, with as many workers as the SFTP server allows (see the worker setting).
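
A bounded worker pool of the kind described here can be sketched in a few lines (an illustration of the pattern, not the package's implementation; `workers` corresponds to the worker setting in the config):

```javascript
// Run async tasks with at most `workers` of them in flight at once.
export async function runPool(tasks, workers) {
  const results = [];
  let next = 0;
  async function drain() {
    while (next < tasks.length) {
      const i = next++;            // claim the next task index
      results[i] = await tasks[i]();
    }
  }
  // Start `workers` drain loops; each keeps picking up tasks until none are left.
  await Promise.all(Array.from({ length: Math.max(1, workers) }, drain));
  return results;
}
```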

Sidecar uploads / downloads

A list of files that are excluded from the sync comparison and can be downloaded or uploaded separately.

  • sidecar.uploadList
    • Relative to sidecar.localRoot, e.g. "downloads.json" or "data/downloads.json"
  • sidecar.downloadList
    • Relative to sidecar.remoteRoot, e.g. "download-counter.json" or "logs/download-counter.json"

# Normal synchronisation
sftp-push-sync staging

# Normal synchronisation + explicitly transfer the sidecar upload list
sftp-push-sync staging --sidecar-upload

# Fetch the sidecar download list from the server,
# combined with normal synchronisation
sftp-push-sync prod --sidecar-download --dry-run   # view first
sftp-push-sync prod --sidecar-download             # then do

  • With --sidecar-upload or --sidecar-download, the sidecar transfer always runs together with the sync.
  • With --skip-sync, you can skip the sync process and only process the sidecar:

sftp-push-sync prod --sidecar-download --skip-sync

Logging Progress

Logging can also be configured.

  • logLevel - normal, verbose or laconic
  • logFile - an optional log file
  • scanChunk - emit a log line after this many elements during scanning
  • analyzeChunk - emit a log line after this many elements during analysis

For >100k files, use analyzeChunk = 10 or 50; otherwise the TTY output itself becomes a relevant cost factor.
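
Chunked progress reporting of this kind could look like the following sketch (the makeProgress helper is hypothetical; the real tool's output format may differ):

```javascript
// Emit one log line per `chunk` processed items instead of one per item,
// so scanning 100k+ files does not flood the terminal.
export function makeProgress(label, chunk, log = console.log) {
  let count = 0;
  return () => {
    count += 1;
    if (count % chunk === 0) log(`${label}: ${count} items`);
    return count;
  };
}
```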

Wildcards

Wildcard examples for include, exclude, uploadList and downloadList:

  • "content/**" - everything below content/
  • ".html", ".htm", ".md", ".txt", ".json" - only certain file extensions
  • "**/*.html" - all HTML files
  • "**/*.md" - all Markdown files
  • "content/**/*.md" - only Markdown in content/
  • "static/images/**/*.jpg"
  • "**/thumb-*.*" - thumb images everywhere
  • "**/*-draft.*" - files with -draft before the extension
  • "config/**" - the complete configuration
  • "static/images/covers/**" - cover images only
  • "logs/**/*.log" - all logs from logs/
  • "reports/**/*.xlsx"

Practical excludes:

"exclude": [
  ".git/**",           // the complete .git folder
  ".idea/**",          // JetBrains
  "node_modules/**",   // Node dependencies
  "dist/**",           // build output
  "**/*.map",          // source maps
  "**/~*",             // Emacs/editor backups (~ files)
  "**/#*#",            // more editor backups
  "**/.DS_Store"       // macOS Finder metadata
]

Folder handling

The sync itself only handles files and creates missing directories during upload. Directories are additionally managed as follows:

  • They are (optionally) removed if:
    • a directory is empty because all files have been deleted from it, or
    • a directory no longer exists locally.

Which files are needed?

  • sync.config.json - The configuration file (with passwords in plain text, so please leave it out of the git repository)

Which files are created?

  • The cache files: .sync-cache.*.json
  • The log file: .sftp-push-sync.{target}.log (optional; overwritten on each run)

You can safely delete the local cache at any time. The first analysis will then take longer, because remote hashes will be streamed again. After that, everything will run fast.

Note: The first run always takes a while, especially with lots of media – so be patient! Once the cache is full, it will be faster.

Example Output

A console output example

Links