
@bernierllc/file-handler-storage-backends

v0.4.2

Published

Additional storage backends for @bernierllc/file-handler: AWS S3, Azure Blob, Google Cloud Storage, Local storage, Dropbox, and Google Drive


@bernierllc/file-handler-storage-backends

Additional storage backends for the @bernierllc/file-handler package, providing multi-cloud storage support across AWS S3, Azure Blob Storage, Google Cloud Storage, and the local file system.

Features

  • AWS S3 Backend - Multi-cloud storage with multipart uploads
  • Azure Blob Storage - Microsoft Azure storage with managed identity support
  • Google Cloud Storage - GCP storage with service account authentication
  • Local Storage - File system storage with metadata files
  • Type-safe - Full TypeScript support with strict mode
  • Optional Dependencies - Install only the cloud SDKs you need

Installation

npm install @bernierllc/file-handler-storage-backends

Cloud Provider SDKs (Optional)

Install only the SDKs for the backends you'll use:

# AWS S3
npm install @aws-sdk/client-s3 @aws-sdk/lib-storage @aws-sdk/s3-request-presigner

# Azure Blob Storage
npm install @azure/storage-blob @azure/identity

# Google Cloud Storage
npm install @google-cloud/storage

# Local File Storage
npm install fs-extra mime-types
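Because the cloud SDKs are optional, it can be useful to probe for one at runtime before constructing a backend. A minimal sketch (the helper names here are ours for illustration, not part of this package):

```typescript
// Returns true if an optional dependency can be resolved at runtime.
// Useful for choosing a storage backend based on which SDKs are installed.
async function hasModule(name: string): Promise<boolean> {
  try {
    await import(name);
    return true;
  } catch {
    return false;
  }
}

// Example: prefer S3 when its SDK is present, otherwise fall back to local storage.
async function pickBackendName(): Promise<'s3' | 'local'> {
  return (await hasModule('@aws-sdk/client-s3')) ? 's3' : 'local';
}
```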

Usage

AWS S3 Storage

import { S3StorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// AWS_ACCESS_KEY_ID=your_key
// AWS_SECRET_ACCESS_KEY=your_secret
// AWS_REGION=us-east-1
// AWS_S3_BUCKET=my-bucket

const s3Backend = new S3StorageBackend();

// Upload file
const result = await s3Backend.upload({
  buffer: fileBuffer,
  filename: 'document.pdf',
  contentType: 'application/pdf',
  size: fileBuffer.length
}, {
  path: 'documents/2024',
  public: true,
  cacheControl: '3600'
});

console.log(result.url); // Public S3 URL
console.log(result.fileId); // S3 object key

// Download file
const file = await s3Backend.download(result.fileId);

// Delete file
await s3Backend.delete(result.fileId);

// List files
const files = await s3Backend.list({ path: 'documents/', limit: 100 });

// Get signed URL
const signedUrl = await s3Backend.getUrl(result.fileId, 7200); // 2 hour expiry

Azure Blob Storage

import { AzureStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// AZURE_STORAGE_ACCOUNT_NAME=myaccount
// AZURE_STORAGE_ACCOUNT_KEY=your_key (for shared key auth)
// OR for managed identity:
// AZURE_CLIENT_ID=your_client_id
// AZURE_CLIENT_SECRET=your_secret
// AZURE_TENANT_ID=your_tenant
// AZURE_CONTAINER_NAME=uploads

const azureBackend = new AzureStorageBackend();

// Upload file
const result = await azureBackend.upload({
  buffer: imageBuffer,
  filename: 'photo.jpg',
  contentType: 'image/jpeg',
  size: imageBuffer.length
}, {
  public: true
});

// Get SAS URL
const sasUrl = await azureBackend.getUrl(result.fileId, 3600);

Google Cloud Storage

import { GCPStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
// OR:
// GOOGLE_CLOUD_SERVICE_ACCOUNT_JSON={"type":"service_account",...}
// GOOGLE_CLOUD_PROJECT_ID=my-project
// GOOGLE_CLOUD_STORAGE_BUCKET=my-bucket

const gcpBackend = new GCPStorageBackend();

// Upload file
const result = await gcpBackend.upload({
  buffer: videoBuffer,
  filename: 'video.mp4',
  contentType: 'video/mp4',
  size: videoBuffer.length
}, {
  path: 'videos',
  public: false
});

// Get signed URL
const signedUrl = await gcpBackend.getUrl(result.fileId, 86400); // 24 hour expiry

Local File Storage

import { LocalStorageBackend } from '@bernierllc/file-handler-storage-backends';

// Configure with environment variables:
// LOCAL_STORAGE_PATH=/app/uploads (default: ./uploads)
// LOCAL_STORAGE_URL_BASE=http://localhost:3000/uploads (optional)

const localBackend = new LocalStorageBackend();

// Upload file
const result = await localBackend.upload({
  buffer: dataBuffer,
  filename: 'data.csv',
  contentType: 'text/csv',
  size: dataBuffer.length
}, {
  path: 'exports/2024',
  public: true
});

// Files are stored with metadata in .meta.json files
// /app/uploads/exports/2024/1234567890-data.csv
// /app/uploads/exports/2024/1234567890-data.csv.meta.json

Backend Interface

All backends implement the FileStorageBackend interface from @bernierllc/file-handler:

interface FileStorageBackend {
  upload(file: FileData, options?: UploadOptions): Promise<UploadResult>;
  download(fileId: string, options?: DownloadOptions): Promise<FileData>;
  delete(fileId: string): Promise<DeleteResult>;
  list(options?: ListOptions): Promise<FileList>;
  getMetadata(fileId: string): Promise<FileMetadata>;
  getUrl(fileId: string, expiresIn?: number): Promise<string>;
  testConnection(): Promise<boolean>;
  getBackendInfo(): BackendInfo;
}
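Programming against this interface keeps application code portable across backends. As a sketch, here is a hypothetical in-memory backend (not part of this package) implementing a simplified subset of the interface; the real `FileStorageBackend` types come from @bernierllc/file-handler:

```typescript
// Simplified stand-ins for the types from @bernierllc/file-handler.
interface FileData {
  buffer: Buffer;
  filename: string;
  contentType: string;
  size: number;
}

interface UploadResult {
  fileId: string;
  url: string;
}

// Subset of FileStorageBackend, enough to show the pattern.
interface StorageBackendLike {
  upload(file: FileData): Promise<UploadResult>;
  download(fileId: string): Promise<FileData>;
  delete(fileId: string): Promise<void>;
}

// In-memory implementation, handy for unit tests.
class MemoryStorageBackend implements StorageBackendLike {
  private files = new Map<string, FileData>();

  async upload(file: FileData): Promise<UploadResult> {
    const fileId = `${Date.now()}-${file.filename}`;
    this.files.set(fileId, file);
    return { fileId, url: `memory://${fileId}` };
  }

  async download(fileId: string): Promise<FileData> {
    const file = this.files.get(fileId);
    if (!file) throw new Error(`File not found: ${fileId}`);
    return file;
  }

  async delete(fileId: string): Promise<void> {
    this.files.delete(fileId);
  }
}

// Application code depends only on the interface, so swapping
// MemoryStorageBackend for S3StorageBackend is a one-line change.
async function archive(backend: StorageBackendLike, text: string): Promise<string> {
  const buffer = Buffer.from(text, 'utf8');
  const result = await backend.upload({
    buffer,
    filename: 'note.txt',
    contentType: 'text/plain',
    size: buffer.length,
  });
  return result.fileId;
}
```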

Environment Variables

AWS S3

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| AWS_ACCESS_KEY_ID | Yes | - | AWS access key |
| AWS_SECRET_ACCESS_KEY | Yes | - | AWS secret key |
| AWS_REGION | No | us-east-1 | AWS region |
| AWS_S3_BUCKET | No | uploads | S3 bucket name |

Azure Blob Storage

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| AZURE_STORAGE_ACCOUNT_NAME | Yes | - | Storage account name |
| AZURE_STORAGE_ACCOUNT_KEY | No* | - | Account key (shared key auth) |
| AZURE_CLIENT_ID | No* | - | Client ID (managed identity) |
| AZURE_CLIENT_SECRET | No* | - | Client secret (service principal) |
| AZURE_TENANT_ID | No* | - | Tenant ID (service principal) |
| AZURE_CONTAINER_NAME | No | uploads | Container name |

*Either AZURE_STORAGE_ACCOUNT_KEY or Azure identity credentials are required

Google Cloud Storage

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| GOOGLE_APPLICATION_CREDENTIALS | No* | - | Path to service account JSON |
| GOOGLE_CLOUD_SERVICE_ACCOUNT_JSON | No* | - | Service account JSON content |
| GOOGLE_CLOUD_PROJECT_ID | No | - | GCP project ID |
| GOOGLE_CLOUD_STORAGE_BUCKET | No | uploads | GCS bucket name |

*Either credentials path or JSON content is required

Local File Storage

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| LOCAL_STORAGE_PATH | No | ./uploads | Base storage directory |
| LOCAL_STORAGE_URL_BASE | No | - | Base URL for public access |
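Putting a few of these together, a local-development `.env` might look like the following (values are placeholders):

```shell
# AWS S3 (required for S3StorageBackend)
AWS_ACCESS_KEY_ID=your_key
AWS_SECRET_ACCESS_KEY=your_secret
AWS_REGION=us-east-1
AWS_S3_BUCKET=uploads

# Local file storage (all optional; defaults shown)
LOCAL_STORAGE_PATH=./uploads
LOCAL_STORAGE_URL_BASE=http://localhost:3000/uploads
```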

Authentication Methods

AWS S3

  • IAM Roles - Use EC2/ECS IAM roles (recommended for production)
  • Access Keys - Use environment variables for development

Azure Blob Storage

  • Managed Identity - Use Azure managed identity (recommended for Azure resources)
  • Shared Key - Use storage account key
  • Service Principal - Use Azure AD service principal

Google Cloud Storage

  • Workload Identity - Use GKE workload identity (recommended for GKE)
  • Service Account - Use service account key file or JSON

Local Storage

  • File System - Direct file system access with permissions

Error Handling

All methods throw descriptive errors:

try {
  const backend = new S3StorageBackend();
  await backend.upload(file);
} catch (error) {
  // In strict TypeScript, a caught value is `unknown`; narrow it first.
  const message = error instanceof Error ? error.message : String(error);
  if (message.includes('AWS SDK not found')) {
    // Install AWS SDK: npm install @aws-sdk/client-s3 @aws-sdk/lib-storage
  } else if (message.includes('credentials')) {
    // Check AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
  }
}

Testing Connection

Test if a backend is properly configured:

const backend = new AzureStorageBackend();
const isConnected = await backend.testConnection();

if (isConnected) {
  console.log('Azure connection successful');
} else {
  console.error('Azure connection failed - check credentials');
}

// Get backend info
const info = backend.getBackendInfo();
console.log(info.type); // 'azure'
console.log(info.capabilities); // ['upload', 'download', 'delete', ...]
console.log(info.status); // 'connected' or 'disconnected'

File Metadata

All backends store and retrieve file metadata:

const metadata = await backend.getMetadata(fileId);
console.log(metadata.filename); // Original filename
console.log(metadata.contentType); // MIME type
console.log(metadata.size); // File size in bytes
console.log(metadata.uploadedAt); // Upload timestamp

Performance

  • S3 - Multipart uploads used automatically for files larger than 5MB
  • Azure - Streaming uploads and downloads
  • GCP - Resumable uploads for large files
  • Local - Direct file system operations with metadata caching

License

Bernier LLC - Licensed for client use within delivered projects only.

Related Packages