@bernierllc/file-handler-storage-backends
Additional storage backends for the @bernierllc/file-handler package, providing multi-cloud storage support across AWS S3, Azure Blob Storage, Google Cloud Storage, and the local file system.
Features
- AWS S3 Backend - Amazon S3 storage with multipart upload support
- Azure Blob Storage - Microsoft Azure storage with managed identity support
- Google Cloud Storage - GCP storage with service account authentication
- Local Storage - File system storage with metadata files
- Type-safe - Full TypeScript support with strict mode
- Optional Dependencies - Install only the cloud SDKs you need
Installation
```shell
npm install @bernierllc/file-handler-storage-backends
```
Cloud Provider SDKs (Optional)
Install only the SDKs for the backends you'll use:
```shell
# AWS S3
npm install @aws-sdk/client-s3 @aws-sdk/lib-storage @aws-sdk/s3-request-presigner

# Azure Blob Storage
npm install @azure/storage-blob @azure/identity

# Google Cloud Storage
npm install @google-cloud/storage

# Local File Storage
npm install fs-extra mime-types
```
Usage
AWS S3 Storage
```typescript
import { S3StorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// AWS_ACCESS_KEY_ID=your_key
// AWS_SECRET_ACCESS_KEY=your_secret
// AWS_REGION=us-east-1
// AWS_S3_BUCKET=my-bucket
const s3Backend = new S3StorageBackend();

// Upload a file
const result = await s3Backend.upload({
  buffer: fileBuffer,
  filename: 'document.pdf',
  contentType: 'application/pdf',
  size: fileBuffer.length
}, {
  path: 'documents/2024',
  public: true,
  cacheControl: '3600'
});

console.log(result.url);    // Public S3 URL
console.log(result.fileId); // S3 object key

// Download a file
const file = await s3Backend.download(result.fileId);

// Delete a file
await s3Backend.delete(result.fileId);

// List files
const files = await s3Backend.list({ path: 'documents/', limit: 100 });

// Get a signed URL
const signedUrl = await s3Backend.getUrl(result.fileId, 7200); // 2-hour expiry
```
Azure Blob Storage
```typescript
import { AzureStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// AZURE_STORAGE_ACCOUNT_NAME=myaccount
// AZURE_STORAGE_ACCOUNT_KEY=your_key (for shared key auth)
// OR, for managed identity / service principal:
// AZURE_CLIENT_ID=your_client_id
// AZURE_CLIENT_SECRET=your_secret
// AZURE_TENANT_ID=your_tenant
// AZURE_CONTAINER_NAME=uploads
const azureBackend = new AzureStorageBackend();

// Upload a file
const result = await azureBackend.upload({
  buffer: imageBuffer,
  filename: 'photo.jpg',
  contentType: 'image/jpeg',
  size: imageBuffer.length
}, {
  public: true
});

// Get a SAS URL
const sasUrl = await azureBackend.getUrl(result.fileId, 3600); // 1-hour expiry
```
Google Cloud Storage
```typescript
import { GCPStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
// OR:
// GOOGLE_CLOUD_SERVICE_ACCOUNT_JSON={"type":"service_account",...}
// GOOGLE_CLOUD_PROJECT_ID=my-project
// GOOGLE_CLOUD_STORAGE_BUCKET=my-bucket
const gcpBackend = new GCPStorageBackend();

// Upload a file
const result = await gcpBackend.upload({
  buffer: videoBuffer,
  filename: 'video.mp4',
  contentType: 'video/mp4',
  size: videoBuffer.length
}, {
  path: 'videos',
  public: false
});

// Get a signed URL
const signedUrl = await gcpBackend.getUrl(result.fileId, 86400); // 24-hour expiry
```
Local File Storage
```typescript
import { LocalStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// LOCAL_STORAGE_PATH=/app/uploads (default: ./uploads)
// LOCAL_STORAGE_URL_BASE=http://localhost:3000/uploads (optional)
const localBackend = new LocalStorageBackend();

// Upload a file
const result = await localBackend.upload({
  buffer: dataBuffer,
  filename: 'data.csv',
  contentType: 'text/csv',
  size: dataBuffer.length
}, {
  path: 'exports/2024',
  public: true
});

// Files are stored alongside .meta.json metadata files:
// /app/uploads/exports/2024/1234567890-data.csv
// /app/uploads/exports/2024/1234567890-data.csv.meta.json
```
Backend Interface
All backends implement the FileStorageBackend interface from @bernierllc/file-handler:
```typescript
interface FileStorageBackend {
  upload(file: FileData, options?: UploadOptions): Promise<UploadResult>;
  download(fileId: string, options?: DownloadOptions): Promise<FileData>;
  delete(fileId: string): Promise<DeleteResult>;
  list(options?: ListOptions): Promise<FileList>;
  getMetadata(fileId: string): Promise<FileMetadata>;
  getUrl(fileId: string, expiresIn?: number): Promise<string>;
  testConnection(): Promise<boolean>;
  getBackendInfo(): BackendInfo;
}
```
Environment Variables
AWS S3
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| AWS_ACCESS_KEY_ID | Yes | - | AWS access key |
| AWS_SECRET_ACCESS_KEY | Yes | - | AWS secret key |
| AWS_REGION | No | us-east-1 | AWS region |
| AWS_S3_BUCKET | No | uploads | S3 bucket name |
Azure Blob Storage
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| AZURE_STORAGE_ACCOUNT_NAME | Yes | - | Storage account name |
| AZURE_STORAGE_ACCOUNT_KEY | No* | - | Account key (shared key auth) |
| AZURE_CLIENT_ID | No* | - | Client ID (managed identity) |
| AZURE_CLIENT_SECRET | No* | - | Client secret (service principal) |
| AZURE_TENANT_ID | No* | - | Tenant ID (service principal) |
| AZURE_CONTAINER_NAME | No | uploads | Container name |
*Either AZURE_STORAGE_ACCOUNT_KEY or Azure identity credentials are required
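The precedence between these credential options can be made explicit in application code. A small illustrative helper (not part of the package, and assuming shared key wins when several credential sets are present) that decides which Azure auth method a given environment selects:

```typescript
// Illustrative only: decide which Azure auth method an environment implies.
// Assumes shared key takes precedence when multiple credential sets are set.
type AzureAuthMethod = 'shared-key' | 'service-principal' | 'managed-identity';

function detectAzureAuth(env: Record<string, string | undefined>): AzureAuthMethod {
  if (env.AZURE_STORAGE_ACCOUNT_KEY) return 'shared-key';
  if (env.AZURE_CLIENT_ID && env.AZURE_CLIENT_SECRET && env.AZURE_TENANT_ID) {
    return 'service-principal';
  }
  // No explicit credentials: rely on managed identity when running on Azure.
  return 'managed-identity';
}

console.log(detectAzureAuth({ AZURE_STORAGE_ACCOUNT_KEY: 'key' })); // shared-key
```

Logging the detected method at startup makes misconfigured environments easier to diagnose than an opaque authentication failure later.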
Google Cloud Storage
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| GOOGLE_APPLICATION_CREDENTIALS | No* | - | Path to service account JSON |
| GOOGLE_CLOUD_SERVICE_ACCOUNT_JSON | No* | - | Service account JSON content |
| GOOGLE_CLOUD_PROJECT_ID | No | - | GCP project ID |
| GOOGLE_CLOUD_STORAGE_BUCKET | No | uploads | GCS bucket name |
*Either credentials path or JSON content is required
Local File Storage
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| LOCAL_STORAGE_PATH | No | ./uploads | Base storage directory |
| LOCAL_STORAGE_URL_BASE | No | - | Base URL for public access |
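For local development these variables are often collected in a `.env` file. A sketch for the local backend, using the defaults from the table above (the URL is a placeholder for your dev server):

```shell
# .env (development) - local file storage backend
LOCAL_STORAGE_PATH=./uploads
LOCAL_STORAGE_URL_BASE=http://localhost:3000/uploads
```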
Authentication Methods
AWS S3
- IAM Roles - Use EC2/ECS IAM roles (recommended for production)
- Access Keys - Use environment variables for development
Azure Blob Storage
- Managed Identity - Use Azure managed identity (recommended for Azure resources)
- Shared Key - Use storage account key
- Service Principal - Use Azure AD service principal
Google Cloud Storage
- Workload Identity - Use GKE workload identity (recommended for GKE)
- Service Account - Use service account key file or JSON
Local Storage
- File System - Direct file system access with permissions
Error Handling
All methods throw descriptive errors:
```typescript
try {
  const backend = new S3StorageBackend();
  await backend.upload(file);
} catch (error) {
  // Narrow the unknown error before inspecting it (strict-mode friendly).
  const message = error instanceof Error ? error.message : String(error);
  if (message.includes('AWS SDK not found')) {
    // Install the AWS SDK: npm install @aws-sdk/client-s3 @aws-sdk/lib-storage
  } else if (message.includes('credentials')) {
    // Check AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
  }
}
```
Testing Connection
Test if a backend is properly configured:
```typescript
const backend = new AzureStorageBackend();
const isConnected = await backend.testConnection();

if (isConnected) {
  console.log('Azure connection successful');
} else {
  console.error('Azure connection failed - check credentials');
}

// Get backend info
const info = backend.getBackendInfo();
console.log(info.type);         // 'azure'
console.log(info.capabilities); // ['upload', 'download', 'delete', ...]
console.log(info.status);       // 'connected' or 'disconnected'
```
File Metadata
All backends store and retrieve file metadata:
```typescript
const metadata = await backend.getMetadata(fileId);

console.log(metadata.filename);    // Original filename
console.log(metadata.contentType); // MIME type
console.log(metadata.size);        // File size in bytes
console.log(metadata.uploadedAt);  // Upload timestamp
```
Performance
- S3 - Multipart uploads for large files (automatic above 5 MB)
- Azure - Streaming uploads and downloads
- GCP - Resumable uploads for large files
- Local - Direct file system operations with metadata caching
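Because every backend implements the same FileStorageBackend interface, backend-agnostic utilities need no provider-specific code. A sketch of a cross-backend copy helper, shown against a trimmed-down interface and an in-memory stand-in so it can run without cloud credentials (the `MinimalBackend`, `MemoryBackend`, and `copyFile` names are illustrative, not part of the package):

```typescript
// Trimmed-down view of the storage interface; the real one has more methods.
interface FileData {
  buffer: Buffer;
  filename: string;
  contentType: string;
  size: number;
}

interface MinimalBackend {
  upload(file: FileData): Promise<{ fileId: string }>;
  download(fileId: string): Promise<FileData>;
}

// Copy a stored file between backends using only interface methods.
async function copyFile(
  src: MinimalBackend,
  dst: MinimalBackend,
  fileId: string
): Promise<string> {
  const file = await src.download(fileId);
  const result = await dst.upload(file);
  return result.fileId;
}

// In-memory stand-in used here so the helper can be exercised locally.
class MemoryBackend implements MinimalBackend {
  private files = new Map<string, FileData>();

  async upload(file: FileData): Promise<{ fileId: string }> {
    const fileId = `${this.files.size}-${file.filename}`;
    this.files.set(fileId, file);
    return { fileId };
  }

  async download(fileId: string): Promise<FileData> {
    const file = this.files.get(fileId);
    if (!file) throw new Error(`Not found: ${fileId}`);
    return file;
  }
}

void (async () => {
  const a = new MemoryBackend();
  const b = new MemoryBackend();
  const buffer = Buffer.from('hello');
  const { fileId } = await a.upload({
    buffer, filename: 'hello.txt', contentType: 'text/plain', size: buffer.length
  });
  const copiedId = await copyFile(a, b, fileId);
  const copied = await b.download(copiedId);
  console.log(copied.buffer.toString()); // hello
})();
```

The same `copyFile` would accept any two real backends, for example migrating objects from local storage to S3.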
License
Bernier LLC - Licensed for client use within delivered projects only.
Related Packages
- @bernierllc/file-handler - Core file handling package
- @bernierllc/image-processor - Image processing utilities
- @bernierllc/file-uploader - HTTP file upload handling
