# mm-s3-file-upload-sdk
A robust and flexible Node.js SDK for simplified file uploads to AWS S3 buckets. This library streamlines the process of uploading single or multiple files, providing detailed results, automatic content-type detection, and extensive error handling.
## Table of Contents
- Features
- Installation
- Configuration
- Usage
- Error Handling
- Contributing
- Code of Conduct
- Changelog
- License
- Author
## Features
- **Simplified API:** Easy-to-use `S3Uploader` class for common S3 upload tasks.
- **Single & Batch Uploads:** Seamlessly upload one file or an array of files.
- **Robust Error Handling:** Uses `Promise.allSettled` for batch operations, so you get a result for every file even if some fail, plus a custom `S3UploadError` for clear diagnostics.
- **Detailed Upload Results:** Returns a `FileUploadResult` object for each upload, including success status, S3 ETag, and a specific error message on failure.
- **Automatic Content-Type Detection:** Determines the correct MIME type for each file using `mime-types`.
- **Flexible S3 PutObject Parameters:** Supports passing any standard `PutObjectCommandInput` options (e.g., `ACL`, `Metadata`, `CacheControl`) for fine-grained control over your S3 objects.
- **Configurable S3 Client:** Option to provide your own `S3Client` instance for advanced setups (see the example under Basic Setup below).
- **Input Validation:** Ensures valid configuration and file paths.
- **Informative Logging:** Provides clear console messages for upload progress and errors.
## Installation
You can install `mm-s3-file-upload-sdk` using npm or Bun:

```bash
# Using npm
npm install mm-s3-file-upload-sdk @aws-sdk/client-s3 mime-types

# Using Bun
bun add mm-s3-file-upload-sdk @aws-sdk/client-s3 mime-types
```

Note: `@aws-sdk/client-s3` and `mime-types` are peer dependencies and should be installed alongside this SDK.
## Configuration
The `S3Uploader` requires your AWS credentials and S3 bucket details. It's highly recommended to use environment variables for sensitive values such as `accessKeyId` and `secretAccessKey`.

For local development, you can place them in a `.env` file in your project root and load it with a package like `dotenv`.
```bash
# .env example
AWS_ACCESS_KEY_ID=YOUR_AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_AWS_SECRET_ACCESS_KEY
AWS_REGION=your-aws-region # e.g., us-east-1
S3_BUCKET_NAME=your-s3-bucket-name
```

## Usage
### Basic Setup
First, import the `S3Uploader` and configure it.
```typescript
import { S3Uploader, S3UploaderConfig, FileUploadResult } from 'mm-s3-file-upload-sdk';
import 'dotenv/config'; // For loading environment variables

const config: S3UploaderConfig = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  region: process.env.AWS_REGION as string,
  bucket: process.env.S3_BUCKET_NAME as string,
};

const uploader = new S3Uploader(config);
console.log("S3Uploader initialized successfully.");
```
### Uploading a Single File

```typescript
import { S3Uploader, S3UploaderConfig, FileUploadResult } from 'mm-s3-file-upload-sdk';
import 'dotenv/config'; // For loading environment variables
import * as path from 'path';
import * as fs from 'fs'; // For creating a dummy file for the example

// --- Setup (as above) ---
const config: S3UploaderConfig = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  region: process.env.AWS_REGION as string,
  bucket: process.env.S3_BUCKET_NAME as string,
};
const uploader = new S3Uploader(config);
// --- End Setup ---

async function uploadExample() {
  // Create a dummy file for demonstration
  const dummyFilePath = path.join(__dirname, 'dummy-file.txt');
  fs.writeFileSync(dummyFilePath, 'Hello, S3! This is a test file.');

  const localPath = dummyFilePath;
  const remotePath = 'uploads/my-first-file.txt'; // The desired path in your S3 bucket

  console.log(`Attempting to upload single file: ${localPath} to s3://${config.bucket}/${remotePath}`);

  try {
    const results = await uploader.uploadFiles({ localPath, remotePath });
    const result = results[0]; // For single uploads, results will have one item

    if (result.success) {
      console.log(`Single file upload successful! ETag: ${result.etag}`);
      console.log(`File available at: s3://${config.bucket}/${result.remotePath}`);
    } else {
      console.error(`Single file upload failed: ${result.error}`);
    }
  } catch (error) {
    console.error("An unexpected error occurred during single upload:", error);
  } finally {
    // Clean up dummy file
    fs.unlinkSync(dummyFilePath);
  }
}

uploadExample();
```

### Uploading Multiple Files (Batch Upload)
```typescript
import { S3Uploader, S3UploaderConfig, FileUploadResult, FileUploadPath } from 'mm-s3-file-upload-sdk';
import 'dotenv/config'; // For loading environment variables
import * as path from 'path';
import * as fs from 'fs';

// --- Setup (as above) ---
const config: S3UploaderConfig = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  region: process.env.AWS_REGION as string,
  bucket: process.env.S3_BUCKET_NAME as string,
};
const uploader = new S3Uploader(config);
// --- End Setup ---

async function uploadBatchExample() {
  // Create dummy files for demonstration
  const dummyFile1Path = path.join(__dirname, 'batch-file-1.json');
  const dummyFile2Path = path.join(__dirname, 'batch-file-2.jpeg'); // Will be automatically detected as image/jpeg
  const dummyFile3Path = path.join(__dirname, 'non-existent-file.txt'); // Will fail gracefully

  fs.writeFileSync(dummyFile1Path, JSON.stringify({ data: "batch test 1" }));
  // In a real scenario, you'd have an actual JPEG file here.
  // For this example, we'll just write some dummy content.
  fs.writeFileSync(dummyFile2Path, 'This is a dummy JPEG content placeholder.', { encoding: 'binary' });

  const filesToUpload: FileUploadPath[] = [
    { localPath: dummyFile1Path, remotePath: 'batch/data/config.json' },
    { localPath: dummyFile2Path, remotePath: 'batch/images/photo.jpeg' },
    { localPath: dummyFile3Path, remotePath: 'batch/missing/file.txt' }, // This one will fail
  ];

  console.log(`Attempting to upload a batch of ${filesToUpload.length} files.`);

  try {
    const results: FileUploadResult[] = await uploader.uploadFiles(filesToUpload);

    results.forEach(result => {
      if (result.success) {
        console.log(`✅ SUCCESS: Uploaded '${result.localPath}' to '${result.remotePath}'. ETag: ${result.etag}`);
      } else {
        console.error(`❌ FAILED: Failed to upload '${result.localPath}' to '${result.remotePath}'. Error: ${result.error}`);
      }
    });
  } catch (error) {
    console.error("An unexpected error occurred during batch upload:", error);
  } finally {
    // Clean up dummy files
    fs.unlinkSync(dummyFile1Path);
    fs.unlinkSync(dummyFile2Path);
    // fs.unlinkSync(dummyFile3Path) would throw an error if it doesn't exist, so skip it.
  }
}

uploadBatchExample();
```
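As the feature list notes, batch uploads rely on `Promise.allSettled`, which is why the missing file above still produces a failed result instead of aborting the whole batch. The sketch below only illustrates that general pattern; it is not the SDK's actual implementation, and `putSingleObject` plus the local types are hypothetical stand-ins.

```typescript
// Conceptual sketch of the Promise.allSettled pattern described above.
// This is NOT the SDK's source; the names below are illustrative only.
type UploadJob = { localPath: string; remotePath: string };
type UploadOutcome = UploadJob & { success: boolean; etag?: string; error?: string };

// Hypothetical helper standing in for "upload one file to S3".
declare function putSingleObject(job: UploadJob): Promise<{ etag?: string }>;

async function uploadAllSettled(jobs: UploadJob[]): Promise<UploadOutcome[]> {
  // allSettled never rejects: every job yields either a fulfilled or a rejected entry.
  const settled = await Promise.allSettled(jobs.map(job => putSingleObject(job)));

  return settled.map((outcome, i) =>
    outcome.status === 'fulfilled'
      ? { ...jobs[i], success: true, etag: outcome.value.etag }
      : { ...jobs[i], success: false, error: String(outcome.reason) }
  );
}
```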
### Handling Upload Results

The `uploadFiles` method (for both single and batch uploads) returns an array of `FileUploadResult` objects, so you can programmatically handle success or failure for each file.
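The exact `FileUploadResult` definition lives in the package's type declarations; the sketch below is only inferred from the fields used in this README's examples, so treat it as illustrative rather than authoritative.

```typescript
// Inferred from the fields used in this README; check the exported
// FileUploadResult type in mm-s3-file-upload-sdk for the authoritative shape.
interface FileUploadResultSketch {
  localPath: string;   // local file that was (or should have been) uploaded
  remotePath: string;  // destination key in the S3 bucket
  success: boolean;    // true when the PutObject call succeeded
  etag?: string;       // S3 ETag, present on success
  error?: string;      // error message, present on failure
}
```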
```typescript
// Example from batch upload, showing how to process results
// (assumes `myFiles: FileUploadPath[]` is defined and this runs inside an async function):
const results: FileUploadResult[] = await uploader.uploadFiles(myFiles);

const successfulUploads = results.filter(r => r.success);
const failedUploads = results.filter(r => !r.success);

console.log(`\n--- Upload Summary ---`);
console.log(`Total files attempted: ${results.length}`);
console.log(`Successful uploads: ${successfulUploads.length}`);
console.log(`Failed uploads: ${failedUploads.length}`);

if (failedUploads.length > 0) {
  console.log(`\nDetails of failures:`);
  failedUploads.forEach(fail => {
    console.error(`  - ${fail.localPath} -> ${fail.remotePath}: ${fail.error}`);
  });
}
```

### Using Advanced S3 PutObject Parameters
You can pass additional parameters directly to the underlying `PutObjectCommandInput` to control aspects like public access, metadata, or caching.
```typescript
import { S3Uploader, S3UploaderConfig } from 'mm-s3-file-upload-sdk';
import { ObjectCannedACL } from '@aws-sdk/client-s3'; // Import specific types from the AWS SDK
import 'dotenv/config';
import * as path from 'path';
import * as fs from 'fs';

// --- Setup ---
const config: S3UploaderConfig = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  region: process.env.AWS_REGION as string,
  bucket: process.env.S3_BUCKET_NAME as string,
};
const uploader = new S3Uploader(config);
// --- End Setup ---

async function uploadWithCustomParams() {
  const dummyFilePath = path.join(__dirname, 'public-image.png');
  fs.writeFileSync(dummyFilePath, 'Dummy image content.', { encoding: 'binary' });

  const localPath = dummyFilePath;
  const remotePath = 'public/images/logo.png';

  // Parameters to make the object publicly readable and add custom metadata
  const customParams = {
    ACL: ObjectCannedACL.public_read, // Make the uploaded object publicly readable
    Metadata: {
      'x-amz-meta-uploader': 'mm-s3-sdk',
      'x-amz-meta-project': 'example',
    },
    CacheControl: 'max-age=31536000' // Cache for 1 year
  };

  console.log(`Attempting to upload file with custom S3 parameters.`);

  try {
    // For a single file, pass customParams directly to uploadFiles
    const results = await uploader.uploadFiles(
      { localPath, remotePath },
      customParams // Global parameters for this batch (or single file)
    );
    const result = results[0];

    if (result.success) {
      console.log(`File uploaded with custom params. ETag: ${result.etag}`);
      console.log(`You can now access it publicly via S3 URL if bucket policy allows.`);
    } else {
      console.error(`Upload with custom params failed: ${result.error}`);
    }
  } catch (error) {
    console.error("An unexpected error occurred during custom params upload:", error);
  } finally {
    fs.unlinkSync(dummyFilePath);
  }
}

uploadWithCustomParams();
```

## Error Handling
The `S3Uploader` throws an `S3UploadError` for issues directly related to the upload process (e.g., file not found, S3 service errors). This custom error class wraps the original AWS SDK error (`S3ServiceException`) when available, allowing for detailed inspection.

Always wrap calls to `uploadFiles` in `try...catch` blocks to handle potential errors gracefully.
```typescript
import { S3Uploader, S3UploadError } from 'mm-s3-file-upload-sdk';

// ... uploader initialization (inside an async function) ...

try {
  // This will throw an S3UploadError if localPath is invalid
  await uploader.uploadFiles({ localPath: '/path/to/non-existent-file.txt', remotePath: 's3/path.txt' });
} catch (error) {
  if (error instanceof S3UploadError) {
    console.error(`S3 Upload Specific Error: ${error.message}`);
    if (error.originalError) {
      console.error(`Original Error Details:`, error.originalError);
    }
  } else {
    console.error(`General Application Error: ${(error as Error).message}`);
  }
}
```
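If you need more detail than the message, you can check whether the wrapped `originalError` is an AWS `S3ServiceException` and read its metadata. The helper below is a small sketch of that pattern; the helper name is ours, not part of the SDK.

```typescript
import { S3ServiceException } from '@aws-sdk/client-s3';
import { S3UploadError } from 'mm-s3-file-upload-sdk';

// Sketch: inspect the wrapped AWS SDK error for richer diagnostics.
function logS3Failure(error: unknown): void {
  if (error instanceof S3UploadError && error.originalError instanceof S3ServiceException) {
    const awsError = error.originalError;
    console.error(`AWS error name: ${awsError.name}`);                  // e.g. "NoSuchBucket", "AccessDenied"
    console.error(`HTTP status: ${awsError.$metadata.httpStatusCode}`); // e.g. 403, 404
    console.error(`Request ID: ${awsError.$metadata.requestId}`);
    console.error(`Fault type: ${awsError.$fault}`);                    // "client" or "server"
  } else {
    console.error('Non-S3 failure:', error);
  }
}
```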
## Contributing

We welcome contributions to `mm-s3-file-upload-sdk`! Please see our CONTRIBUTING.md file for guidelines on how to set up your development environment, submit bug reports, and propose new features.
## Code of Conduct
This project adheres to the Code of Conduct. By participating, you are expected to uphold this code.
## Changelog
For a detailed list of changes in each version, please refer to the CHANGELOG.md file.
## License
This project is licensed under the MIT License.
## Author
Mahd-Mehn
**Initial Contributions:**
