
s3-forklift v1.2.0

s3-forklift

aws-sdk based S3 uploader with support for AWS S3, DigitalOcean Spaces, and Cloudflare R2. It wraps the upload method of aws-sdk. For file uploads, the ContentType is derived from the file name and added automatically. It can also remove the uploaded file from the local file system after the upload completes successfully.
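The automatic ContentType detection described above can be pictured as a simple extension-to-MIME lookup. The sketch below is illustrative only; s3-forklift's actual implementation may differ (for example, by using a dedicated MIME library):

```javascript
// Illustrative only: derive a ContentType from a file name's extension.
// Not s3-forklift's actual implementation.
const MIME_TYPES = {
  ".jpg": "image/jpeg",
  ".jpeg": "image/jpeg",
  ".png": "image/png",
  ".json": "application/json",
  ".txt": "text/plain"
};

function guessContentType(filePath) {
  const dot = filePath.lastIndexOf(".");
  const ext = dot === -1 ? "" : filePath.slice(dot).toLowerCase();
  // Fall back to a generic binary type when the extension is unknown
  return MIME_TYPES[ext] || "application/octet-stream";
}

console.log(guessContentType("test/test.jpg")); // "image/jpeg"
```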

Install

npm install s3-forklift --save

Test

Rename secret.example.json to secret.json and fill in valid credentials.

Then run the tests with npm run test

Requirements

  • Node.js 10+

Initialize

AWS S3

const Forklift = require("s3-forklift");

const forklift = new Forklift({
  provider: "aws",
  accessKey: "<YOUR_ACCESS_KEY>",
  secretKey: "<YOUR_SECRET_KEY>",
  bucket: "<BUCKET_NAME>",
  region: "<REGION>", // required for aws
  s3params: {ACL: "bucket-owner-read"} // optional, default: {ACL: "public-read"}
});

DigitalOcean Spaces

const Forklift = require("s3-forklift");

const forklift = new Forklift({
  provider: "do",
  accessKey: "<YOUR_ACCESS_KEY>",
  secretKey: "<YOUR_SECRET_KEY>",
  bucket: "<BUCKET_NAME>",
  region: "fra1", // required for do (e.g., nyc3, sfo2, fra1)
  s3params: {ACL: "public-read"}
});

Cloudflare R2

const Forklift = require("s3-forklift");

const forklift = new Forklift({
  provider: "r2",
  accessKey: "<YOUR_R2_ACCESS_KEY_ID>",
  secretKey: "<YOUR_R2_SECRET_ACCESS_KEY>",
  bucket: "<BUCKET_NAME>",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com", // required for r2
  // region is optional for r2, defaults to "auto"
  s3params: {ACL: "public-read"}
});

Custom S3-Compatible Endpoint

You can also use any S3-compatible storage by providing a custom endpoint:

const forklift = new Forklift({
  provider: "aws", // or "do"
  accessKey: "<YOUR_ACCESS_KEY>",
  secretKey: "<YOUR_SECRET_KEY>",
  bucket: "<BUCKET_NAME>",
  region: "us-east-1",
  endpoint: "https://your-custom-s3-endpoint.com" // optional custom endpoint
});

Upload

  • source a file path (string) or a readable stream.
  • remotePath the destination path in the bucket.
  • options besides all the options of the original S3.upload, you can pass
    • {remove: true} to remove the source file after the upload completes successfully
    • {timestamp: true} to add a timestamp at the end of the returned url

const url = await forklift.upload({source, remotePath, options});
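The effect of the timestamp option can be sketched as appending the upload time to the returned URL as a cache-busting query string. This is an assumption about the behavior for illustration; the exact format s3-forklift produces may differ:

```javascript
// Hypothetical sketch of {timestamp: true}: append the current time to the
// returned URL so stale cached copies of an overwritten object are bypassed.
// The exact format s3-forklift uses may differ.
function withTimestamp(url, now = Date.now()) {
  return url + "?" + now;
}

withTimestamp("https://bucket.s3.amazonaws.com/test/test.jpg", 1700000000000);
// "https://bucket.s3.amazonaws.com/test/test.jpg?1700000000000"
```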

Example

const fs = require("fs");

// To upload a file.
// ContentType is automatically derived from the file name and passed to S3.upload;
// to override it, pass {ContentType: "<CONTENT_TYPE>"} in options.
const url = await forklift.upload({
  source: "test.jpg",
  remotePath: "test/test.jpg"
});

// To upload and then remove the local file after a successful upload
const url = await forklift.upload({
  source: "test.jpg",
  remotePath: "test/test.jpg",
  options: {remove: true}
});

// To upload a stream without a ContentType
const url = await forklift.upload({
  source: fs.createReadStream("test.jpg"),
  remotePath: "test/test.jpg"
});

// To upload a stream with an explicit ContentType
const url = await forklift.upload({
  source: fs.createReadStream("test.jpg"),
  remotePath: "test/test.jpg",
  options: {ContentType: "image/jpeg"}
});

Changelog

1.2.0

  • Added Cloudflare R2 support (provider: "r2")
  • Added custom endpoint support for any S3-compatible storage
  • Increased accessKey max length to 128 characters (for R2 compatibility)
  • Region is now optional for R2 (defaults to "auto")

1.1.0

  • Added DigitalOcean Spaces support

1.0.0

  • Initial release with AWS S3 support