
s3bridge

v1.0.1

Published

A unified SDK for managing files across AWS S3-compatible storage platforms (S3, Cloudflare R2, DigitalOcean Spaces)

Downloads

110


s3bridge

🧩 Problem Statement

Modern developers often rely on a single cloud storage provider, typically AWS S3. But this creates major problems:

  • Vendor lock-in: once your application depends heavily on one storage provider, switching becomes painful and expensive.

  • Inconsistent APIs: even though many services claim "S3 compatibility," each provider (DigitalOcean, Cloudflare R2, Wasabi, etc.) behaves slightly differently: different errors, configs, URLs, and permissions.

  • Scattered configurations: developers must manage different SDKs, write adapters, and maintain duplicate code.

  • No simple way to use multiple S3-compatible providers at the same time: you might want Cloudflare R2 for public assets, DigitalOcean Spaces for backups, and AWS S3 for logs, but no package easily allows plug-and-play multi-provider usage.

Because of this, migrations are hard, flexibility is low, and application complexity increases.

s3bridge addresses this with a unified SDK for managing files across S3-compatible storage platforms, including AWS S3, Cloudflare R2, and DigitalOcean Spaces.

Features

  • Unified API for multiple S3-compatible storage providers
  • Support for AWS S3, Cloudflare R2, and DigitalOcean Spaces
  • Easy provider switching without code changes
  • TypeScript support with full type definitions
  • Stream-based uploads and downloads
  • Presigned URL generation
  • Public and signed URL access
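
The "provider switching without code changes" point boils down to keeping every provider config in one place and selecting one at startup. The sketch below illustrates that pattern; it is a self-contained, hypothetical illustration (the `ProviderName` type and `pickStorageConfig` helper are stand-ins, not part of s3bridge, whose real enum is `StorageProviderType`):

```typescript
// Stand-in for s3bridge's StorageProviderType values, so this sketch
// runs on its own; with the real SDK you would import the enum instead.
type ProviderName = "S3" | "R2" | "DO";

interface ProviderConfig {
  provider: ProviderName;
  bucketName: string;
  accessKeyId: string;
  secretAccessKey: string;
  region?: string;
  accountId?: string;
}

// All provider configs live together; which one is active is a pure
// configuration decision (e.g. a STORAGE_PROVIDER environment variable).
const configs: Record<string, ProviderConfig> = {
  s3: {
    provider: "S3", bucketName: "my-bucket", region: "us-west-2",
    accessKeyId: "AWS_ACCESS_KEY", secretAccessKey: "AWS_SECRET_KEY",
  },
  r2: {
    provider: "R2", bucketName: "my-bucket", accountId: "YOUR_ACCOUNT_ID",
    accessKeyId: "R2_ACCESS_KEY", secretAccessKey: "R2_SECRET_KEY",
  },
};

// Resolve the requested provider's config, falling back to S3.
function pickStorageConfig(name: string | undefined): ProviderConfig {
  return configs[name ?? "s3"] ?? configs["s3"];
}

// With the real SDK, application code would then do:
//   const storage = new StorageClient(pickStorageConfig(process.env.STORAGE_PROVIDER));
// and never change again when the provider does.
```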

Installation

npm install s3bridge

Peer Dependencies

This SDK requires the following peer dependencies:

npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner

Usage

Instantiation for JavaScript or TypeScript Projects

TypeScript Example

import { StorageClient, StorageProviderType } from 's3bridge';

// Single provider configuration
const storage = new StorageClient({
  provider: StorageProviderType.S3,
  bucketName: 'my-bucket',
  region: 'us-west-2',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY'
});

JavaScript Example

const { StorageClient, StorageProviderType } = require('s3bridge');

// Single provider configuration
const storage = new StorageClient({
  provider: StorageProviderType.S3,
  bucketName: 'my-bucket',
  region: 'us-west-2',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY'
});

Using the Default Storage Provider

When you instantiate with a single configuration, that becomes the default provider. All operations will use it unless specified otherwise.

// Upload using default provider
await storage.upload('path/to/file.txt', fileBuffer, {
  contentType: 'text/plain'
});

// Download using default provider
const stream = await storage.getObjectReadStream('path/to/file.txt');

Using a Provider Other Than the Default

When the client is instantiated with multiple configurations, you can specify which provider to use for each operation.

// Multiple providers configuration
const storage = new StorageClient([
  {
    provider: StorageProviderType.S3,
    bucketName: 'my-s3-bucket',
    region: 'us-west-2',
    accessKeyId: 'AWS_ACCESS_KEY',
    secretAccessKey: 'AWS_SECRET_KEY'
  },
  {
    provider: StorageProviderType.R2,
    bucketName: 'my-r2-bucket',
    accountId: 'YOUR_ACCOUNT_ID',
    accessKeyId: 'R2_ACCESS_KEY',
    secretAccessKey: 'R2_SECRET_KEY'
  }
]);

// Upload to specific provider (R2 in this case)
await storage.upload('path/to/file.txt', fileBuffer, {
  contentType: 'text/plain'
}, {
  provider: StorageProviderType.R2  // Specify different provider
});

// Get signed URL from specific provider
const signedUrl = await storage.getSignedUrl('path/to/file.txt', 3600, {
  provider: StorageProviderType.S3
});

Upload a File

const fileBuffer = Buffer.from('Hello World');

await storage.upload('path/to/file.txt', fileBuffer, {
  contentType: 'text/plain',
  acl: 'public-read'
});

// Upload to a specific provider
await storage.upload('path/to/file.txt', fileBuffer, {
  contentType: 'text/plain'
}, {
  provider: StorageProviderType.R2
});

Upload from Stream

import { createReadStream } from 'fs';

const stream = createReadStream('./local-file.txt');

await storage.uploadFromStream(stream, 'path/to/file.txt', {
  contentType: 'text/plain'
});

Download a File

// Download to local file
await storage.downloadFile('path/to/file.txt', './local-file.txt');

// Get read stream
const stream = await storage.getObjectReadStream('path/to/file.txt');

Get URLs

// Get public URL
const publicUrl = storage.getPublicUrl('path/to/file.txt');

// Get signed URL (expires after the given number of seconds; 1 hour here)
const signedUrl = await storage.getSignedUrl('path/to/file.txt', 3600);

// Get presigned upload URL
const uploadUrl = await storage.getUploadUrl('path/to/file.txt', {
  contentType: 'image/png',
  expiresIn: 3600
});

Delete a File

await storage.delete('path/to/file.txt');

Provider-Specific Operations

// Get driver for specific provider
const r2Driver = storage.getDriver(StorageProviderType.R2);
const bucketUrl = r2Driver.getBucketUrl();

Configuration Options

AWS S3

{
  provider: StorageProviderType.S3,
  bucketName: string,
  region?: string,           // Default: 'us-west-2'
  accessKeyId: string,
  secretAccessKey: string
}

Cloudflare R2

{
  provider: StorageProviderType.R2,
  bucketName: string,
  accountId: string,
  publicId?: string,
  customDomain?: string,
  accessKeyId: string,
  secretAccessKey: string
}

DigitalOcean Spaces

{
  provider: StorageProviderType.DO,
  bucketName: string,
  region?: string,           // Default: 'nyc3'
  cdnEndpoint?: string,
  accessKeyId: string,
  secretAccessKey: string
}
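
Since every config requires credentials, it is easy to ship a client that is silently missing one. A small fail-fast helper can guard against that; this is a hypothetical utility (not part of s3bridge), and the environment variable names shown are assumptions you would adapt to your own deployment:

```typescript
// Hypothetical helper: read a required environment variable or fail fast.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Build a Cloudflare R2 config object from the environment. With the real
// SDK, `provider` would be StorageProviderType.R2 rather than a string.
function r2ConfigFromEnv() {
  return {
    provider: "R2",
    bucketName: requireEnv("R2_BUCKET"),
    accountId: requireEnv("R2_ACCOUNT_ID"),
    accessKeyId: requireEnv("R2_ACCESS_KEY_ID"),
    secretAccessKey: requireEnv("R2_SECRET_ACCESS_KEY"),
  };
}
```

Failing at startup with a named variable is usually easier to debug than an authentication error surfacing on the first upload.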

Important: Database Entity Recommendation

⚠️ IMPORTANT DISCLAIMER: When using this SDK in production applications, we strongly recommend creating a database entity to track which storage provider was used for each file.

Why This Matters

Storage providers can change over time due to:

  • Cost optimization
  • Performance requirements
  • Geographic distribution needs
  • Provider availability issues

If you don't track which provider stored each file, you may lose access to your files when switching providers.

Recommended Database Schema

interface FileEntity {
  id: string;
  key: string;                    // File path/key in storage
  provider: StorageProviderType;  // Which provider stores this file
  bucketName: string;             // Which bucket stores this file
  contentType: string;
  size: number;
  uploadedAt: Date;
  // ... other metadata
}

Example Implementation

// When uploading
const fileKey = 'uploads/user-avatar.png';
const provider = StorageProviderType.S3;

await storage.upload(fileKey, buffer, { contentType: 'image/png' }, { provider });

// Save to database
await db.files.create({
  key: fileKey,
  provider: provider,
  bucketName: 'my-bucket',
  contentType: 'image/png',
  size: buffer.length,
  uploadedAt: new Date()
});

// When retrieving
const fileRecord = await db.files.findOne({ id: fileId });
const url = await storage.getSignedUrl(fileRecord.key, 3600, {
  provider: fileRecord.provider
});

This approach ensures you can:

  • Migrate files between providers gradually
  • Support multiple providers simultaneously
  • Maintain access to all files regardless of provider changes
  • Track file metadata and usage patterns
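
The tracking pattern above can be sketched end-to-end without any particular database. The snippet below is a minimal in-memory illustration (a real application would persist `FileRecord` rows in its database); `FileRecord`, `recordUpload`, and `providerForFile` are hypothetical names mirroring the `FileEntity` shape recommended above:

```typescript
// In-memory stand-in for the recommended database entity.
interface FileRecord {
  key: string;        // File path/key in storage
  provider: string;   // StorageProviderType value with the real SDK
  bucketName: string;
  contentType: string;
  size: number;
  uploadedAt: Date;
}

const filesById = new Map<string, FileRecord>();

// Record which provider received the file, at upload time.
function recordUpload(id: string, record: FileRecord): void {
  filesById.set(id, record);
}

// Look up which provider holds a file before generating a URL for it.
function providerForFile(id: string): string {
  const record = filesById.get(id);
  if (!record) {
    throw new Error(`No file record for id: ${id}`);
  }
  return record.provider;
}

recordUpload("avatar-1", {
  key: "uploads/user-avatar.png",
  provider: "S3",
  bucketName: "my-bucket",
  contentType: "image/png",
  size: 1024,
  uploadedAt: new Date(),
});
```

With the real SDK, the value returned by `providerForFile` is what you would pass as the `provider` option to `getSignedUrl` or `upload`, so every retrieval goes to the provider that actually holds the file.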

License

MIT