s3bridge
v1.0.1
A unified SDK for managing files across AWS S3-compatible storage platforms (S3, Cloudflare R2, DigitalOcean Spaces)
s3bridge
🧩 Problem Statement
Modern developers often rely on a single cloud storage provider, typically AWS S3. But this creates major problems:
- Vendor lock-in: Once your application depends heavily on one storage provider, switching becomes painful and expensive.
- Inconsistent APIs: Even though many services claim "S3 compatibility," each provider (DigitalOcean, Cloudflare R2, Wasabi, etc.) behaves slightly differently — different errors, configs, URLs, permissions.
- Scattered configurations: Developers must manage different SDKs, write adapters, and maintain duplicate code.
- No simple way to use multiple S3-compatible providers at the same time: You might want
  - Cloudflare R2 for public assets
  - DigitalOcean Spaces for backups
  - AWS S3 for logging

  But no package easily allows plug-and-play multi-provider usage.

Because of this, migrations are hard, flexibility is low, and application complexity increases.
A unified SDK for managing files across AWS S3-compatible storage platforms including AWS S3, Cloudflare R2, and DigitalOcean Spaces.
Features
- Unified API for multiple S3-compatible storage providers (a configuration sketch follows this list)
- Support for AWS S3, Cloudflare R2, and DigitalOcean Spaces
- Easy provider switching without code changes
- TypeScript support with full type definitions
- Stream-based uploads and downloads
- Presigned URL generation
- Public and signed URL access
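For instance, the multi-provider scenario from the problem statement (R2 for public assets, Spaces for backups, S3 for logs) can be expressed as a single client. The following is only a sketch built from the option shapes documented under Configuration Options below; every bucket name, region, and credential is a placeholder.
import { StorageClient, StorageProviderType } from 's3bridge';

// One client, three providers (all values are placeholders)
const storage = new StorageClient([
  {
    provider: StorageProviderType.R2, // public assets
    bucketName: 'public-assets',
    accountId: 'YOUR_CLOUDFLARE_ACCOUNT_ID',
    accessKeyId: 'R2_ACCESS_KEY',
    secretAccessKey: 'R2_SECRET_KEY'
  },
  {
    provider: StorageProviderType.DO, // backups
    bucketName: 'backups',
    region: 'nyc3',
    accessKeyId: 'DO_ACCESS_KEY',
    secretAccessKey: 'DO_SECRET_KEY'
  },
  {
    provider: StorageProviderType.S3, // logs
    bucketName: 'app-logs',
    region: 'us-west-2',
    accessKeyId: 'AWS_ACCESS_KEY',
    secretAccessKey: 'AWS_SECRET_KEY'
  }
]);
// Per-operation routing between these providers is shown under Usage below.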
Installation
npm install s3bridge
Peer Dependencies
This SDK requires the following peer dependencies:
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
Usage
Instantiation for JavaScript or TypeScript Projects
TypeScript Example
import { StorageClient, StorageProviderType } from 's3bridge';
// Single provider configuration
const storage = new StorageClient({
provider: StorageProviderType.S3,
bucketName: 'my-bucket',
region: 'us-west-2',
accessKeyId: 'YOUR_ACCESS_KEY',
secretAccessKey: 'YOUR_SECRET_KEY'
});
JavaScript Example
const { StorageClient, StorageProviderType } = require('s3bridge');
// Single provider configuration
const storage = new StorageClient({
provider: StorageProviderType.S3,
bucketName: 'my-bucket',
region: 'us-west-2',
accessKeyId: 'YOUR_ACCESS_KEY',
secretAccessKey: 'YOUR_SECRET_KEY'
});
Using the Default Storage Provider
When you instantiate with a single configuration, that becomes the default provider. All operations will use it unless specified otherwise.
// Upload using default provider
await storage.upload('path/to/file.txt', fileBuffer, {
contentType: 'text/plain'
});
// Download using default provider
const stream = await storage.getObjectReadStream('path/to/file.txt');
Specifying a Provider Other than the Default
When instantiated with multiple configurations, you can specify which provider to use for each operation.
// Multiple providers configuration
const storage = new StorageClient([
{
provider: StorageProviderType.S3,
bucketName: 'my-s3-bucket',
region: 'us-west-2',
accessKeyId: 'AWS_ACCESS_KEY',
secretAccessKey: 'AWS_SECRET_KEY'
},
{
provider: StorageProviderType.R2,
bucketName: 'my-r2-bucket',
accountId: 'YOUR_ACCOUNT_ID',
accessKeyId: 'R2_ACCESS_KEY',
secretAccessKey: 'R2_SECRET_KEY'
}
]);
// Upload to specific provider (R2 in this case)
await storage.upload('path/to/file.txt', fileBuffer, {
contentType: 'text/plain'
}, {
provider: StorageProviderType.R2 // Specify different provider
});
// Get signed URL from specific provider
const signedUrl = await storage.getSignedUrl('path/to/file.txt', 3600, {
provider: StorageProviderType.S3
});
Upload a File
const fileBuffer = Buffer.from('Hello World');
await storage.upload('path/to/file.txt', fileBuffer, {
contentType: 'text/plain',
acl: 'public-read'
});
// Upload to a specific provider
await storage.upload('path/to/file.txt', fileBuffer, {
contentType: 'text/plain'
}, {
provider: StorageProviderType.R2
});
Upload from Stream
import { createReadStream } from 'fs';
const stream = createReadStream('./local-file.txt');
await storage.uploadFromStream(stream, 'path/to/file.txt', {
contentType: 'text/plain'
});
Download a File
// Download to local file
await storage.downloadFile('path/to/file.txt', './local-file.txt');
// Get read stream
const stream = await storage.getObjectReadStream('path/to/file.txt');
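A common use of the read stream is serving a stored file over HTTP without buffering it in memory. The sketch below uses Node's built-in http module and assumes the stream returned by getObjectReadStream is a standard Node Readable and that storage is the client configured above; the key and content type are placeholders, and the handler ignores req.url for brevity.
import { createServer } from 'node:http';
import { pipeline } from 'node:stream/promises';

const server = createServer(async (req, res) => {
  try {
    // Fetch the object as a stream and pipe it straight to the response.
    const stream = await storage.getObjectReadStream('path/to/file.txt');
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    await pipeline(stream, res);
  } catch (err) {
    if (!res.headersSent) res.writeHead(404);
    res.end('Not found');
  }
});

server.listen(3000);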
Get URLs
// Get public URL
const publicUrl = storage.getPublicUrl('path/to/file.txt');
// Get signed URL (expires in 1 hour by default)
const signedUrl = await storage.getSignedUrl('path/to/file.txt', 3600);
// Get presigned upload URL
const uploadUrl = await storage.getUploadUrl('path/to/file.txt', {
contentType: 'image/png',
expiresIn: 3600
});
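The presigned upload URL can be handed to a client so the file is sent directly to storage instead of through your server. The sketch below assumes getUploadUrl returns a standard presigned PUT URL (s3bridge lists @aws-sdk/s3-request-presigner as a peer dependency); the PUT request's Content-Type must match the contentType the URL was signed with, and fileData stands in for the bytes being uploaded.
// Issue a short-lived upload URL on the server...
const uploadUrl = await storage.getUploadUrl('uploads/photo.png', {
  contentType: 'image/png',
  expiresIn: 600
});

// ...then PUT the file straight to storage (works in browsers and Node 18+).
const fileData = Buffer.from('placeholder image bytes'); // placeholder payload
const response = await fetch(uploadUrl, {
  method: 'PUT',
  headers: { 'Content-Type': 'image/png' }, // must match the signed contentType
  body: fileData
});
if (!response.ok) {
  throw new Error(`Upload failed with status ${response.status}`);
}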
Delete a File
await storage.delete('path/to/file.txt');
Provider-Specific Operations
// Get driver for specific provider
const r2Driver = storage.getDriver(StorageProviderType.R2);
const bucketUrl = r2Driver.getBucketUrl();
Configuration Options
AWS S3
{
provider: StorageProviderType.S3,
bucketName: string,
region?: string, // Default: 'us-west-2'
accessKeyId: string,
secretAccessKey: string
}
Cloudflare R2
{
provider: StorageProviderType.R2,
bucketName: string,
accountId: string,
publicId?: string,
customDomain?: string,
accessKeyId: string,
secretAccessKey: string
}
DigitalOcean Spaces
{
provider: StorageProviderType.DO,
bucketName: string,
region?: string, // Default: 'nyc3'
cdnEndpoint?: string,
accessKeyId: string,
secretAccessKey: string
}
Important: Database Entity Recommendation
⚠️ IMPORTANT: When using this SDK in production applications, we strongly recommend creating a database entity to track which storage provider was used for each file.
Why This Matters
Storage providers can change over time due to:
- Cost optimization
- Performance requirements
- Geographic distribution needs
- Provider availability issues
If you don't track which provider stored each file, you may lose access to your files when switching providers.
Recommended Database Schema
interface FileEntity {
id: string;
key: string; // File path/key in storage
provider: StorageProviderType; // Which provider stores this file
bucketName: string; // Which bucket stores this file
contentType: string;
size: number;
uploadedAt: Date;
// ... other metadata
}
Example Implementation
// When uploading
const fileKey = 'uploads/user-avatar.png';
const provider = StorageProviderType.S3;
await storage.upload(fileKey, buffer, { contentType: 'image/png' }, { provider });
// Save to database
await db.files.create({
key: fileKey,
provider: provider,
bucketName: 'my-bucket',
contentType: 'image/png',
size: buffer.length,
uploadedAt: new Date()
});
// When retrieving
const fileRecord = await db.files.findOne({ id: fileId });
const url = await storage.getSignedUrl(fileRecord.key, 3600, {
provider: fileRecord.provider
});
This approach ensures you can:
- Migrate files between providers gradually (see the migration sketch after this list)
- Support multiple providers simultaneously
- Maintain access to all files regardless of provider changes
- Track file metadata and usage patterns
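As an illustration of the first point, a gradual migration can be driven entirely by the tracked records. The sketch below sticks to calls shown in this README (getSignedUrl and upload with an explicit provider) and buffers each object, which is only reasonable for modestly sized files. storage and db are the client and hypothetical database layer from the examples above, db.files.update is assumed to exist alongside create and findOne, and delete() is shown here without a provider option, so removing the old copy is left to verify against your installed version.
import { StorageProviderType } from 's3bridge';

// Move one tracked file from whichever provider currently stores it to Cloudflare R2.
async function migrateFileToR2(fileId: string): Promise<void> {
  const record = await db.files.findOne({ id: fileId });

  // 1. Read the object from the provider recorded in the database via a signed URL.
  const sourceUrl = await storage.getSignedUrl(record.key, 3600, {
    provider: record.provider
  });
  const res = await fetch(sourceUrl);
  if (!res.ok) throw new Error(`Download failed: ${res.status}`);
  const buffer = Buffer.from(await res.arrayBuffer());

  // 2. Write the same key to the target provider.
  await storage.upload(record.key, buffer, {
    contentType: record.contentType
  }, {
    provider: StorageProviderType.R2
  });

  // 3. Point the tracking record at the new provider so future reads use it.
  await db.files.update({ id: record.id }, { provider: StorageProviderType.R2 });

  // 4. Optionally remove the old copy once verified; check whether your version
  //    of delete() accepts a { provider } option before doing so.
}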
License
MIT
