uns3

Tiny, runtime-agnostic S3 client.

A lightweight, dependency-free S3 client that works across Node, Deno, Bun and modern browsers. Compatible with AWS S3 and S3-compatible providers (Cloudflare R2, Hetzner, Backblaze B2, Garage, etc.). Focused on a small, ergonomic API for streaming downloads, uploads, multipart uploads, presigned URLs and common object operations.

Key features:

  • Runtime agnostic: same API in Node, Deno, Bun and browsers
  • Works with AWS S3 and S3-compatible endpoints (R2, Hetzner, Backblaze…)
  • Streamable responses (standard Response object)
  • Multipart upload helpers and presigned URL generation
  • Zero native dependencies, minimal bundle size

[!WARNING] This package is in active development. It is not recommended for production use yet unless you are willing to help with testing and feedback. Expect breaking changes, as I prioritize usability and correctness over stability at this stage.

Usage

Install the package:

# ✨ Auto-detect (supports npm, yarn, pnpm, deno and bun)
npx nypm install uns3

Import:

ESM (Node.js, Bun, Deno)

import { S3Client, S3Error } from "uns3";

CDN (Deno, Bun and Browsers)

import { S3Client, S3Error } from "https://esm.sh/uns3";

Initialization

First, create an instance of the S3Client. You need to provide your S3-compatible service's region, endpoint, and your credentials.

import { S3Client } from "uns3";

const client = new S3Client({
  // e.g. "us-east-1" or "auto" for R2
  region: "auto",
  // e.g. "https://s3.amazonaws.com" or your custom endpoint
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: "<ACCESS_KEY_ID>",
    secretAccessKey: "<SECRET_ACCESS_KEY>",
  },
  // Optional default bucket
  defaultBucket: "my-bucket",
});

Methods

All methods return a Promise.
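
Failed operations reject the returned Promise. A minimal error-handling sketch, assuming the exported S3Error is thrown for S3-level failures (only the standard Error fields are used here):

try {
  const response = await client.get({ key: "may-not-exist.txt" });
  console.log(await response.text());
} catch (error) {
  if (error instanceof S3Error) {
    // S3-level failure (e.g. NoSuchKey, AccessDenied)
    console.error("S3 error:", error.message);
  } else {
    throw error;
  }
}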

get()

Retrieves an object from an S3 bucket. It returns a standard Response object, allowing you to stream the body.

// Get a full object
const response = await client.get({ key: "my-file.txt" });
const text = await response.text();
console.log(text);

// Get a partial object (range request)
const partialResponse = await client.get({
  key: "my-large-file.zip",
  range: { start: 0, end: 1023 }, // first 1KB
});
const chunk = await partialResponse.arrayBuffer();

Conditional Requests & Caching

The get() and head() methods support conditional request headers (ifMatch, ifNoneMatch, ifModifiedSince, ifUnmodifiedSince). When the object hasn't changed, S3 returns a 304 Not Modified response, which is treated as a success.

// Conditional GET using ETag
const response = await client.get({
  key: "cached-file.txt",
  ifNoneMatch: '"abc123"', // ETag from previous request
});

if (response.status === 304) {
  console.log("Content hasn't changed, use cached version");
} else {
  // Status is 200, process new content
  const content = await response.text();
}

This is especially useful when serving S3 responses through a server framework (e.g., Nitro, Nuxt) to browsers, as the library correctly handles browser cache validation.
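
For illustration, a minimal pass-through sketch (the handler signature is hypothetical; the point is forwarding the browser's If-None-Match header into get()):

async function serveFromS3(request) {
  const response = await client.get({
    key: "cached-file.txt",
    ifNoneMatch: request.headers.get("if-none-match") ?? undefined,
  });
  // A 304 passes through with no body; a 200 streams the fresh object
  return response;
}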

head()

Retrieves metadata from an object without returning the object itself.

const response = await client.head({ key: "my-file.txt" });
console.log("Content-Type:", response.headers.get("content-type"));
console.log("ETag:", response.headers.get("etag"));
console.log("Size:", response.headers.get("content-length"));

put()

Uploads an object to an S3 bucket. The body can be a string, Blob, ArrayBuffer, Uint8Array, or a ReadableStream.

// Upload from a string
await client.put({
  key: "hello.txt",
  body: "Hello, World!",
  contentType: "text/plain", // also inferred from key extension
});

// Upload from a plain object (automatically stringified)
await client.put({
  key: "hello.json",
  body: {
    message: "Hello, World!",
  },
  // contentType is automatically set to application/json
});

// Upload from a Blob
const blob = new Blob(["<h1>Hello</h1>"], { type: "text/html" });
await client.put({
  key: "index.html",
  body: blob,
});
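
Since the body can also be a ReadableStream, data can be streamed through without buffering it in memory. A sketch that pipes a fetched response straight into put() (the URL is a placeholder, and some providers may require a known content length for streamed uploads):

// Upload from a ReadableStream (e.g. a fetch response body)
const source = await fetch("https://example.com/some-file.bin");
await client.put({
  key: "some-file.bin",
  body: source.body,
  contentType: source.headers.get("content-type") ?? undefined,
});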

Conditional Overwrites (Advanced)

The put() method supports optional conditional headers (ifMatch, ifNoneMatch) for preventing accidental overwrites. Note that not all S3-compatible providers support these headers.

// Only overwrite if the current ETag matches
const response = await client.put({
  key: "document.txt",
  body: "Updated content",
  ifMatch: '"abc123"', // Current object's ETag
});

if (response.status === 412) {
  console.log("Precondition failed - object was modified by someone else");
} else {
  console.log("Upload successful");
}

When conditional headers are used and the condition fails, S3 returns 412 Precondition Failed (not 304 Not Modified like GET/HEAD operations).

del()

Deletes an object from a bucket. Note: DELETE operations do not support conditional headers.

await client.del({ key: "my-file-to-delete.txt" });

list()

Lists objects in a bucket.

const result = await client.list({
  prefix: "documents/",
  delimiter: "/", // To group objects by folder
});

console.log("Files:", result.contents);
// [ { key: 'documents/file1.txt', ... }, ... ]

console.log("Subdirectories:", result.commonPrefixes);
// [ 'documents/images/', ... ]
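
Combining list() and del() gives a simple prefix cleanup. A sketch using only the fields shown above (note it ignores any pagination of large result sets):

// Delete every object under a prefix
const { contents } = await client.list({ prefix: "tmp/" });
for (const object of contents) {
  await client.del({ key: object.key });
}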

getSignedUrl()

Generates a presigned URL that can be used to grant temporary access to an S3 object.

// Get a presigned URL for downloading an object (expires in 1 hour)
const downloadUrl = await client.getSignedUrl({
  method: "GET",
  key: "private-document.pdf",
  expiresInSeconds: 3600,
});
console.log("Download URL:", downloadUrl);

// Get a presigned URL for uploading an object
const uploadUrl = await client.getSignedUrl({
  method: "PUT",
  key: "new-upload.zip",
  expiresInSeconds: 600, // 10 minutes
});
console.log("Upload URL:", uploadUrl);

Multipart Upload

For large files, you can use multipart uploads.

1. initiateMultipart()

Start a new multipart upload and get an uploadId.

const { uploadId } = await client.initiateMultipart({
  key: "large-video.mp4",
  contentType: "video/mp4",
});

2. uploadPart()

Upload a part of the file. You need to provide the uploadId and a partNumber (from 1 to 10,000). Note that on AWS S3, every part except the last must be at least 5 MB.

const parts = [];
const file = new Blob([
  /* ... large content ... */
]);
const chunkSize = 5 * 1024 * 1024; // 5MB

for (let i = 0; i * chunkSize < file.size; i++) {
  const partNumber = i + 1;
  const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize);

  const { etag } = await client.uploadPart({
    uploadId,
    key: "large-video.mp4",
    partNumber,
    body: chunk,
  });

  parts.push({ partNumber, etag });
}

3. completeMultipart()

Finish the multipart upload after all parts have been uploaded.

await client.completeMultipart({
  uploadId,
  key: "large-video.mp4",
  parts: parts,
});

Conditional Overwrites (Advanced)

The completeMultipart() method supports optional conditional headers (ifMatch, ifNoneMatch) for preventing accidental overwrites. Note that not all S3-compatible providers support these headers.

// Only overwrite if the current ETag matches
const response = await client.completeMultipart({
  uploadId,
  key: "large-video.mp4",
  parts: parts,
  ifMatch: '"abc123"', // Current object's ETag
});

if (response.status === 412) {
  console.log("Precondition failed - object was modified by someone else");
} else {
  console.log("Upload successful");
}

When conditional headers are used and the condition fails, S3 returns 412 Precondition Failed (not 304 Not Modified like GET/HEAD operations).

abortMultipart()

If something goes wrong, you can abort the multipart upload to clean up the parts that have already been uploaded.

await client.abortMultipart({
  uploadId,
  key: "large-video.mp4",
});
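
Putting the pieces together: a sketch of a complete multipart upload that aborts on failure, so no orphaned parts are left behind (file is a Blob, as in the uploadPart() example):

const key = "large-video.mp4";
const { uploadId } = await client.initiateMultipart({ key });

try {
  const parts = [];
  const chunkSize = 5 * 1024 * 1024; // 5MB minimum part size on AWS S3

  for (let i = 0; i * chunkSize < file.size; i++) {
    const partNumber = i + 1;
    const body = file.slice(i * chunkSize, (i + 1) * chunkSize);
    const { etag } = await client.uploadPart({ uploadId, key, partNumber, body });
    parts.push({ partNumber, etag });
  }

  await client.completeMultipart({ uploadId, key, parts });
} catch (error) {
  // Clean up already-uploaded parts before rethrowing
  await client.abortMultipart({ uploadId, key });
  throw error;
}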

Development

  • Clone this repository
  • Install the latest LTS version of Node.js
  • Enable Corepack using corepack enable
  • Install dependencies using pnpm install
  • Run interactive tests using pnpm test

Credits

License

Published under the MIT license. Made by community 💛


🤖 auto updated with automd