
@maydotinc/s3-syncer

v0.1.5


Full-featured CLI to sync local directories to S3-compatible buckets via GitHub Actions or locally. You may also pull files from a remote target.

Readme

@maydotinc/s3-syncer

Sync local folders to S3-compatible storage (AWS S3, Cloudflare R2, MinIO, etc.) with minimal setup.

Most teams can start immediately with one command:

npx @maydotinc/s3-syncer sync
  • If s3-syncer.json is missing, the CLI offers to create it or run a one-time sync (no config written).
  • If credentials are missing, the CLI can prompt you, or you can run in non-interactive env mode (--env / --env-file).

Requirements

  • Node.js 20+ (generated GitHub Actions workflow uses Node 22)
  • S3 credentials with permissions for ListObjectsV2, PutObject, GetObject (for pull), and optionally DeleteObject (when delete: true)

Guides

Quick start

  1. Run sync (interactive):
npx @maydotinc/s3-syncer sync
  2. If prompted, choose:
  • one-time sync (no config file), or
  • create reusable s3-syncer.json
  3. Provide credentials (interactive or via env):
  • from .env, or
  • enter once in prompt

Common examples

Create config only (multi-target, interactive):

npx @maydotinc/s3-syncer init --config-only

Create config + GitHub Actions workflow:

npx @maydotinc/s3-syncer init --full-setup

Run sync in non-interactive mode (fails if creds are missing from env):

npx @maydotinc/s3-syncer sync --env

Pull a subpath to a specific output directory (skip confirmation):

npx @maydotinc/s3-syncer pull path/to/remote -o ./pulled --yes

Environment variables

Credentials (required for sync/pull)

  • AWS_S3_ACCESS_KEY_ID
  • AWS_S3_SECRET_ACCESS_KEY

Optional

  • AWS_S3_ENDPOINT
  • SLACK_WEBHOOK_URL
  • DISCORD_WEBHOOK_URL

AWS_S3_ENDPOINT notes:

  • pull: used as a global endpoint fallback when present.
  • sync: used as a global endpoint fallback only when none of your targets specify an endpoint. If you have mixed providers/endpoints, set endpoint per target (often via ${...} placeholders).
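The fallback rules above can be sketched as a small helper. This is an illustrative sketch, not the package's actual code; the function name and the target dict shape (mirroring entries of `targets` in s3-syncer.json) are assumptions:

```python
import os

def resolve_endpoint(target: dict, all_targets: list[dict], command: str):
    """Pick the endpoint for one target, following the fallback rules above.

    Illustrative sketch -- not the package's real implementation.
    """
    env_endpoint = os.environ.get("AWS_S3_ENDPOINT") or None
    if target.get("endpoint"):
        # A target-level endpoint always wins.
        return target["endpoint"]
    if command == "pull":
        # pull: the global env var is a straightforward fallback.
        return env_endpoint
    # sync: fall back to the env var only when NO target sets an endpoint.
    if any(t.get("endpoint") for t in all_targets):
        return None
    return env_endpoint
```

With mixed providers, a target without its own endpoint gets no fallback during sync, which is why per-target endpoints (often via `${...}` placeholders) are recommended in that case.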

Example .env:

AWS_S3_ACCESS_KEY_ID=your_access_key
AWS_S3_SECRET_ACCESS_KEY=your_secret_key
AWS_S3_ENDPOINT=

Commands

  • npx @maydotinc/s3-syncer init
    • interactive initializer (config-only or full setup)
  • npx @maydotinc/s3-syncer init --config-only
    • generate s3-syncer.json (no workflow)
  • npx @maydotinc/s3-syncer init --full-setup
    • generate s3-syncer.json + .github/workflows/s3-syncer.yml
  • npx @maydotinc/s3-syncer setup <directory>
    • configure one target and (by default) generate/update the workflow
  • npx @maydotinc/s3-syncer setup <directory> --env-file <path>
    • store an env file path on that target for local runs (no sync --env-file needed)
  • npx @maydotinc/s3-syncer sync
    • run sync now (interactive if config/credentials are missing)
  • npx @maydotinc/s3-syncer sync --env
    • load .env from the current working directory (non-interactive for credentials)
  • npx @maydotinc/s3-syncer sync --env-file ../.env.local
    • load a specific env file (implies env mode; non-interactive for credentials)
  • npx @maydotinc/s3-syncer pull [remotePath]
    • pull files from S3 to a local directory (supports config mode or “direct pull” flags)

init --full-setup vs setup

The two are not identical, but they are very close in outcome.

  • init --full-setup

    • guided multi-step flow
    • can add multiple targets in one run
    • meant for onboarding and multi-target repos
  • setup <directory>

    • direct command for one target at a time
    • better for scripted/power usage with flags

Both end up producing the same core artifacts:

  • s3-syncer.json
  • .github/workflows/s3-syncer.yml

So they are functionally aligned, but the interaction model differs.

Config file

Primary config file:

  • s3-syncer.json

Example:

{
  "targets": [
    {
      "directory": "cdn",
      "bucket": "my-assets",
      "region": "auto",
      "endpoint": "https://<account>.r2.cloudflarestorage.com",
      "prefix": "assets",
      "delete": true,
      "envFile": "./.env.cdn",
      "accessKeyId": "${AWS_S3_ACCESS_KEY_ID}",
      "secretAccessKey": "${AWS_S3_SECRET_ACCESS_KEY}"
    }
  ],
  "branch": "main",
  "notifications": {
    "slack": false,
    "discord": false
  }
}

envFile (per target, optional)

If targets[].envFile is set, sync / pull will automatically load that env file for that target (using dotenv), so you can run:

npx @maydotinc/s3-syncer sync

without passing --env / --env-file.

Notes:

  • envFile is for local runs. The generated GitHub Actions workflow uses repo secrets (not env files).
  • Env files are loaded per target and isolated so variables from one target don’t leak into the next.
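One way to picture the per-target isolation described above is a load-then-restore wrapper. This is a sketch under assumptions: the real CLI uses dotenv, while this example uses a plain KEY=VALUE parser, and `isolated_env_file` is a hypothetical name:

```python
import os
from contextlib import contextmanager

@contextmanager
def isolated_env_file(path: str):
    """Load KEY=VALUE lines from `path` into os.environ, then restore the
    previous environment on exit, so one target's variables never leak
    into the next target's run. Illustrative sketch only.
    """
    snapshot = dict(os.environ)
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ[key.strip()] = value.strip()
        yield
    finally:
        os.environ.clear()
        os.environ.update(snapshot)
```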

${ENV_VAR} placeholders

You can reference environment variables in these target fields using ${VAR_NAME}:

  • bucket
  • endpoint
  • accessKeyId
  • secretAccessKey

If a placeholder is present and the env var is missing/empty, the run fails with a clear error pointing to the target + field.

Example:

{
  "targets": [
    {
      "directory": "cdn",
      "bucket": "${SYNC_BUCKET}",
      "region": "auto",
      "endpoint": "${SYNC_ENDPOINT}",
      "prefix": "assets",
      "delete": true,
      "accessKeyId": "${SYNC_ACCESS_KEY}",
      "secretAccessKey": "${SYNC_SECRET_KEY}"
    }
  ]
}

When target-level accessKeyId/secretAccessKey are set, they are preferred over global AWS_S3_ACCESS_KEY_ID/AWS_S3_SECRET_ACCESS_KEY.

Use this pattern when you have multiple providers/targets.
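The resolution rule above can be sketched as follows. The helper name `resolve_field` is hypothetical and the error message format is an assumption; only the placeholder syntax and the fail-on-missing behavior come from the docs:

```python
import os
import re

# A field is treated as a placeholder only when it is exactly ${VAR_NAME}.
PLACEHOLDER = re.compile(r"^\$\{([A-Za-z_][A-Za-z0-9_]*)\}$")

def resolve_field(target: dict, field: str) -> str:
    """Expand a ${VAR_NAME} placeholder in one target field, failing loudly
    (with target + field named) when the variable is missing or empty.
    Illustrative sketch, not the package's real code.
    """
    raw = target.get(field, "")
    match = PLACEHOLDER.match(raw)
    if not match:
        return raw  # literal value, no placeholder
    value = os.environ.get(match.group(1), "")
    if not value:
        raise ValueError(
            f"target '{target.get('directory')}', field '{field}': "
            f"env var {match.group(1)} is missing or empty"
        )
    return value
```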

Workflow behavior

Generated file:

  • .github/workflows/s3-syncer.yml

Behavior:

  • triggers on configured branch
  • triggers only when target paths change, or s3-syncer.json / the workflow file itself changes
  • supports manual run (workflow_dispatch)
  • runs a pinned package version (the version you used when generating the workflow)

GitHub secrets for Actions:

  • AWS_S3_ACCESS_KEY_ID
  • AWS_S3_SECRET_ACCESS_KEY
  • AWS_S3_ENDPOINT (optional)
  • optional: SLACK_WEBHOOK_URL, DISCORD_WEBHOOK_URL
  • plus any ${VAR_NAME} referenced in target bucket, endpoint, accessKeyId, secretAccessKey

Pull behavior

pull lists remote files first, shows total file count/size, and asks for confirmation before downloading (unless --yes is passed).

  • If the output path is not provided via -o/--output, the CLI always prompts for it (even with --yes).
  • pull <remotePath> pulls only that subpath under the selected target prefix.
  • If you have multiple targets, you can pick one with --target <directory>.
  • If s3-syncer.json is missing, pull can still run by prompting for bucket/region/endpoint/prefix.
  • Power users can skip prompts with direct flags: --bucket, --region, --endpoint, --prefix, --access-key-id, --secret-access-key.

How s3-syncer works

Per target:

  1. fingerprint local files (MD5)
  2. list remote objects in bucket/prefix
  3. compare local MD5 vs remote ETag
  4. upload changed/new files
  5. optionally delete stale remote files (delete: true)

If the configured prefix does not exist yet, s3-syncer treats it as empty and starts uploading (no manual prefix creation needed).

No GitHub cache is required for correctness.

Notes:

  • Local dotfiles are ignored during fingerprinting (for example .well-known/... will not be uploaded).
  • ETag matching is used for change detection. If objects were uploaded outside of s3-syncer (for example multipart uploads), ETags may not match the local MD5 and those objects may be re-uploaded.
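The five steps above amount to a plain diff between local MD5s and remote ETags. A minimal sketch, assuming simple dict shapes (path → digest) rather than the package's real internals:

```python
import hashlib
import os

def fingerprint_local(root: str) -> dict:
    """Step 1: map relative path -> MD5 hex digest, skipping dotfiles
    and dot-directories (so e.g. .well-known/ is never uploaded)."""
    out = {}
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if not d.startswith(".")]
        for name in filenames:
            if name.startswith("."):
                continue
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root).replace(os.sep, "/")
            with open(full, "rb") as fh:
                out[rel] = hashlib.md5(fh.read()).hexdigest()
    return out

def plan_sync(local: dict, remote: dict, delete: bool):
    """Steps 3-5: compare local MD5s to remote ETags and return
    (paths to upload, paths to delete). An object whose ETag differs
    from the local MD5 (e.g. multipart uploads) is re-uploaded."""
    uploads = [p for p, md5 in local.items() if remote.get(p) != md5]
    deletions = [p for p in remote if p not in local] if delete else []
    return uploads, deletions
```

Listing remote objects (step 2) would normally come from ListObjectsV2; here `remote` stands in for that result.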

Notifications (optional)

Slack and Discord are supported; summary stats are posted on successful runs.

R2 tip

For Cloudflare R2:

  • region: auto
  • endpoint: https://<account_id>.r2.cloudflarestorage.com

Troubleshooting

  • Missing credentials:
    • set AWS_S3_ACCESS_KEY_ID and AWS_S3_SECRET_ACCESS_KEY
  • Invalid config:
    • fix fields in s3-syncer.json
  • Missing placeholder env var:
    • ensure the referenced ${VAR_NAME} exists and is non-empty in your shell / .env
  • Missing directory:
    • ensure build output exists before running sync