@d3oxy/s3-pilot
v4.0.0
A TypeScript wrapper for AWS S3 and S3-compatible services (R2, MinIO, DigitalOcean Spaces) with a simplified single-client, single-bucket architecture.
S3Pilot
S3Pilot is a TypeScript library that abstracts AWS S3 and S3-compatible service operations, making it easier to interact with buckets and objects. It provides a cleaner API to manage file uploads, deletions, signed URL generation, and secure file downloads.
Features
- Simple single-client, single-bucket architecture - one instance per bucket.
- S3-Compatible Services: Works with Cloudflare R2, DigitalOcean Spaces, MinIO, and other S3-compatible storage providers.
- Easily upload, delete, rename, and manage files in S3 buckets.
- Bulk delete support for deleting multiple files efficiently.
- Generate signed URLs for private access to objects.
- Secure file download methods to avoid CORS issues.
- Enhanced signed URLs with download-specific headers.
- Streaming support for large files.
- Optional bucket validation on initialization.
- Supports custom key prefixes and folders.
Installation
# Using pnpm
pnpm install @d3oxy/s3-pilot
# Using npm
npm install @d3oxy/s3-pilot
# Using bun
bun add @d3oxy/s3-pilot
# Using yarn
yarn add @d3oxy/s3-pilot
Usage
Initialize S3Pilot
First, import the S3Pilot class into your TypeScript project:
import { S3Pilot } from "@d3oxy/s3-pilot";
Configuration
Create a new instance of S3Pilot for each bucket you want to work with:
// One instance per bucket
const bucket1 = new S3Pilot({
region: "us-east-1",
accessKeyId: "AWS_ACCESS_KEY_ID",
secretAccessKey: "AWS_SECRET_ACCESS_KEY",
bucket: "my-bucket-1",
keyPrefix: process.env.NODE_ENV === "development" ? "dev" : undefined,
});
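How the optional keyPrefix (and the folder option shown later for uploads) combines with a filename into an object key is not spelled out here. One plausible scheme, where buildKey is a hypothetical helper rather than part of S3Pilot, is to join the non-empty parts with slashes:

```typescript
// Hypothetical sketch: join keyPrefix, folder, and filename into an object key.
// Not S3Pilot's actual implementation -- an illustration of one plausible scheme.
function buildKey(filename: string, folder?: string, keyPrefix?: string): string {
    return [keyPrefix, folder, filename]
        .filter((part): part is string => Boolean(part)) // drop unset parts
        .join("/");
}
```

Under this scheme, uploading example.jpg to an images folder with the dev prefix would produce the key dev/images/example.jpg.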
// Create another instance for a different bucket
const bucket2 = new S3Pilot({
region: "us-west-2",
accessKeyId: "AWS_ACCESS_KEY_ID",
secretAccessKey: "AWS_SECRET_ACCESS_KEY",
bucket: "my-bucket-2",
});
S3-Compatible Services
S3Pilot works with any S3-compatible storage service. Use the endpoint option to specify the service URL and publicBaseUrl for generating public URLs.
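Exactly how getUrl builds a public URL is not documented here, but a reasonable mental model (publicUrl below is illustrative, not the library's code) is to join publicBaseUrl and the object key with a single slash:

```typescript
// Illustrative only: compose a public object URL from publicBaseUrl and a key,
// normalizing slashes so the join never doubles or drops a "/".
function publicUrl(publicBaseUrl: string, key: string): string {
    const base = publicBaseUrl.replace(/\/+$/, ""); // strip trailing slashes
    const cleanKey = key.replace(/^\/+/, "");       // strip leading slashes
    return `${base}/${cleanKey}`;
}
```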
Cloudflare R2
const r2 = new S3Pilot({
region: "auto", // R2 uses "auto" for region
accessKeyId: "R2_ACCESS_KEY_ID",
secretAccessKey: "R2_SECRET_ACCESS_KEY",
bucket: "my-r2-bucket",
endpoint: "https://ACCOUNT_ID.r2.cloudflarestorage.com",
publicBaseUrl: "https://pub-xxx.r2.dev", // Your R2 public bucket URL or custom domain
});
// All methods work the same!
await r2.uploadFile({ filename: "test.txt", file: "Hello R2!", contentType: "text/plain" });
const url = r2.getUrl("test.txt"); // Returns: https://pub-xxx.r2.dev/test.txt
DigitalOcean Spaces
const spaces = new S3Pilot({
region: "nyc3",
accessKeyId: "DO_SPACES_KEY",
secretAccessKey: "DO_SPACES_SECRET",
bucket: "my-space",
endpoint: "https://nyc3.digitaloceanspaces.com",
publicBaseUrl: "https://my-space.nyc3.digitaloceanspaces.com",
});
MinIO
const minio = new S3Pilot({
region: "us-east-1", // MinIO default
accessKeyId: "minio-access-key",
secretAccessKey: "minio-secret-key",
bucket: "my-bucket",
endpoint: "http://localhost:9000",
publicBaseUrl: "http://localhost:9000/my-bucket",
});
Upload Files
To upload files to an S3 bucket, use the uploadFile method:
(async () => {
const response = await bucket1.uploadFile({
filename: "example.jpg",
file: Buffer.from("Your file data"),
contentType: "image/jpeg",
});
console.log("Uploaded File URL:", response.url);
console.log("File Key:", response.key);
})();
Upload to a specific folder:
const response = await bucket1.uploadFile({
filename: "example.jpg",
folder: "images",
file: Buffer.from("Your file data"),
contentType: "image/jpeg",
});
Download Files
Direct File Download (Recommended for Server-Side)
Use the getFile method to download file content as a Buffer. This is ideal for server-side processing, or for serving files through your own API endpoint to avoid CORS issues:
(async () => {
const fileResponse = await bucket1.getFile({
key: "example.jpg",
});
console.log("File content:", fileResponse.buffer);
console.log("Content type:", fileResponse.contentType);
console.log("File size:", fileResponse.contentLength);
// Use in your API response
// res.setHeader('Content-Type', fileResponse.contentType);
// res.setHeader('Content-Disposition', 'attachment; filename="example.jpg"');
// res.send(fileResponse.buffer);
})();
Streaming Large Files
For large files, use the getFileStream method to stream content efficiently:
(async () => {
const streamResponse = await bucket1.getFileStream({
key: "large-file.zip",
});
console.log("Content type:", streamResponse.contentType);
console.log("File size:", streamResponse.contentLength);
// Pipe the stream to your response
// streamResponse.stream.pipe(res);
})();
Generate Signed URLs
Basic Signed URL
To generate signed URLs for private access to S3 objects:
(async () => {
const signedUrl = await bucket1.generateSignedUrl({
key: "example.jpg",
expiresIn: 3600, // URL valid for 1 hour
});
console.log("Signed URL:", signedUrl);
})();
Enhanced Signed URL with Download Headers
Generate signed URLs with download-specific headers to force browser download behavior and avoid CORS issues:
(async () => {
const signedUrl = await bucket1.generateSignedUrl({
key: "document.pdf",
expiresIn: 3600, // 1 hour
responseContentDisposition: 'attachment; filename="document.pdf"',
responseContentType: "application/pdf",
responseCacheControl: "no-cache",
});
console.log("Download URL:", signedUrl);
})();
Delete Files
Delete Single File
To delete a single file from an S3 bucket, use the deleteFile method:
(async () => {
await bucket1.deleteFile({
key: "example.jpg",
});
console.log("File deleted successfully.");
})();
Bulk Delete Files
To delete multiple files efficiently, use the deleteFiles method. This automatically batches requests (S3 allows up to 1000 objects per request):
(async () => {
const result = await bucket1.deleteFiles({
keys: ["file1.jpg", "file2.jpg", "file3.jpg", "file4.jpg"],
});
console.log("Deleted files:", result.deleted);
if (result.errors.length > 0) {
console.log("Errors:", result.errors);
}
})();
The deleteFiles method returns an object with:
- deleted: Array of successfully deleted keys
- errors: Array of errors (if any) with key, code, and message properties
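The per-request batching described above can be sketched as a plain chunking step; chunkKeys is an illustration of the idea, not S3Pilot's internal code:

```typescript
// Split a key list into batches of at most 1000, the limit S3's
// DeleteObjects API accepts per request.
function chunkKeys(keys: string[], batchSize = 1000): string[][] {
    const batches: string[][] = [];
    for (let i = 0; i < keys.length; i += batchSize) {
        batches.push(keys.slice(i, i + batchSize));
    }
    return batches;
}
```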
Move File
To move a file within the same bucket (change its key/path):
(async () => {
const result = await bucket1.moveFile({
sourceKey: "uploads/temp/photo.jpg",
destinationKey: "uploads/processed/photo.jpg",
});
console.log("Moved to:", result.url);
})();
Move File to Another Bucket
To move a file from one bucket to another:
const sourceBucket = new S3Pilot({ region: "us-east-1", bucket: "source-bucket", ... });
const destBucket = new S3Pilot({ region: "us-east-1", bucket: "dest-bucket", ... });
(async () => {
const result = await sourceBucket.moveToBucket({
sourceKey: "uploads/photo.jpg",
destinationKey: "archive/photo.jpg",
destination: destBucket,
});
console.log("Moved to destination bucket:", result.url);
})();
This method:
- First attempts a direct S3 copy (fastest, works if same credentials have access to both buckets)
- Falls back to download + upload if direct copy fails (works across different AWS accounts)
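The copy-then-fallback strategy above can be sketched generically; moveWithFallback below illustrates the pattern with the S3 calls abstracted into callbacks, and is not the library's actual implementation:

```typescript
// Illustration of the strategy: try a direct server-side copy first,
// and only on failure fall back to downloading and re-uploading the bytes.
async function moveWithFallback(
    directCopy: () => Promise<void>,
    download: () => Promise<Uint8Array>,
    upload: (data: Uint8Array) => Promise<void>,
): Promise<"copied" | "fallback"> {
    try {
        await directCopy(); // fastest path: same credentials reach both buckets
        return "copied";
    } catch {
        const data = await download(); // cross-account path: pull, then push
        await upload(data);
        return "fallback";
    }
}
```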
Delete Folder
To delete a folder and all its contents:
(async () => {
await bucket1.deleteFolder({
folder: "images",
});
console.log("Folder and all contents deleted successfully.");
})();
File Download Use Cases
1. Server-Side Download (Recommended)
When you need to control access permissions or avoid CORS issues:
// In your API endpoint
app.get("/download/:fileKey", async (req, res) => {
try {
const fileResponse = await bucket1.getFile({
key: req.params.fileKey,
});
res.setHeader("Content-Type", fileResponse.contentType || "application/octet-stream");
res.setHeader("Content-Disposition", `attachment; filename="${req.params.fileKey}"`);
res.setHeader("Content-Length", fileResponse.contentLength?.toString() || "0");
res.send(fileResponse.buffer);
} catch (error) {
res.status(404).json({ error: "File not found" });
}
});
2. Enhanced Signed URL for Direct Download
When you want the browser to download directly from S3:
// Generate a signed URL that forces download
const downloadUrl = await bucket1.generateSignedUrl({
key: fileKey,
expiresIn: 3600, // 1 hour
responseContentDisposition: `attachment; filename="${fileName}"`,
responseContentType: fileType,
});
// Redirect user to download URL
res.redirect(downloadUrl);
3. Streaming Large Files
For files larger than 100MB:
app.get("/stream/:fileKey", async (req, res) => {
try {
const streamResponse = await bucket1.getFileStream({
key: req.params.fileKey,
});
res.setHeader("Content-Type", streamResponse.contentType || "application/octet-stream");
res.setHeader("Content-Disposition", `attachment; filename="${req.params.fileKey}"`);
res.setHeader("Content-Length", streamResponse.contentLength?.toString() || "0");
streamResponse.stream.pipe(res);
} catch (error) {
res.status(404).json({ error: "File not found" });
}
});
Security Considerations
- Private Buckets: Always use private S3 buckets for sensitive files
- Signed URLs: Use short expiration times (1 hour max) for signed URLs
- Access Control: Implement proper authentication before generating download URLs
- CORS: Configure S3 bucket CORS settings if using direct signed URL downloads
- File Validation: Implement file type and size validation in your application layer before uploading files
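As the last point suggests, validation belongs in your application code before calling uploadFile. A minimal sketch, where the allowed types and the 10 MB cap are example values rather than S3Pilot defaults:

```typescript
// Example application-layer gate before uploading: allow only certain
// content types and cap the size. Both limits are illustrative values.
const ALLOWED_TYPES = new Set(["image/jpeg", "image/png", "application/pdf"]);
const MAX_BYTES = 10 * 1024 * 1024; // 10 MB

function validateUpload(contentType: string, sizeBytes: number): boolean {
    return ALLOWED_TYPES.has(contentType) && sizeBytes <= MAX_BYTES;
}
```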
License
This project is licensed under the MIT License - see the LICENSE file for details.
