
multi-drive-uploader

v1.0.1

Published

Production-ready Node.js package providing a unified interface for uploading files to multiple cloud storage providers: Google Drive, OneDrive, Zoho WorkDrive, Box, and Amazon S3. Backend-only with OAuth 2.0 support.


Multi Drive Uploader


Seamlessly upload files to Google Drive, OneDrive, Zoho WorkDrive, Box, or Amazon S3 through one consistent interface.

Multi Drive Uploader eliminates the complexity of integrating with multiple cloud storage APIs. Write your upload logic once and switch between providers with a simple configuration change.

Why Multi Drive Uploader?

| Challenge | Solution |
|-----------|----------|
| Different APIs for each provider | Single uploadFile() method works everywhere |
| Complex OAuth token management | Automatic token refresh with smart caching |
| Inconsistent response formats | Normalized response structure across all providers |
| Large file handling varies | Built-in resumable/multipart upload support |

Quick Install

npm install multi-drive-uploader
yarn add multi-drive-uploader

Requires Node.js 16.0.0+
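If you want to fail fast on older runtimes, a small startup guard can enforce the requirement. This is an illustrative sketch, not something the package exports:

```javascript
// Guard against unsupported Node.js runtimes (illustrative helper,
// not part of the multi-drive-uploader API).
function meetsNodeRequirement(versionString, minMajor = 16) {
  // process.version looks like 'v18.19.0'
  const major = parseInt(versionString.replace(/^v/, '').split('.')[0], 10);
  return Number.isInteger(major) && major >= minMajor;
}

if (!meetsNodeRequirement(process.version)) {
  throw new Error(`multi-drive-uploader requires Node.js 16+, found ${process.version}`);
}
```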


Getting Started

Upload to Google Drive

const { MultiDriveUploader } = require('multi-drive-uploader');

const googleUploader = new MultiDriveUploader({
  provider: 'google-drive',
  credentials: {
    clientId: process.env.GOOGLE_CLIENT_ID,
    clientSecret: process.env.GOOGLE_CLIENT_SECRET,
    refreshToken: process.env.GOOGLE_REFRESH_TOKEN,
    folderId: process.env.GOOGLE_FOLDER_ID
  }
});

// Upload from disk
const response = await googleUploader.uploadFile('./invoices/invoice-2024.pdf');
console.log('Uploaded:', response.fileId, response.permalink);

Upload to Microsoft OneDrive

const oneDriveClient = new MultiDriveUploader({
  provider: 'onedrive',
  credentials: {
    clientId: process.env.MS_CLIENT_ID,
    clientSecret: process.env.MS_CLIENT_SECRET,
    refreshToken: process.env.MS_REFRESH_TOKEN,
    folderId: 'root'
  }
});

// Upload from memory buffer
const pdfBuffer = await generatePdfReport();
const response = await oneDriveClient.uploadFile(pdfBuffer, {
  filename: 'quarterly-report.pdf'
});

Upload to Zoho WorkDrive

const zohoClient = new MultiDriveUploader({
  provider: 'zoho',
  credentials: {
    clientId: process.env.ZOHO_CLIENT_ID,
    clientSecret: process.env.ZOHO_CLIENT_SECRET,
    refreshToken: process.env.ZOHO_REFRESH_TOKEN,
    folderId: process.env.ZOHO_FOLDER_ID,
    dataCenter: 'EU'  // US, EU, IN, AU, JP, CN
  }
});

// Upload from readable stream
const fileStream = fs.createReadStream('./exports/data.csv');
const response = await zohoClient.uploadFile(fileStream, {
  filename: 'export-data.csv',
  overrideNameExist: true
});

Upload to Box

const boxClient = new MultiDriveUploader({
  provider: 'box',
  credentials: {
    clientId: process.env.BOX_CLIENT_ID,
    clientSecret: process.env.BOX_CLIENT_SECRET,
    refreshToken: process.env.BOX_REFRESH_TOKEN,
    folderId: '0'  // Root folder
  }
});

const response = await boxClient.uploadFile('./contracts/agreement.docx');
console.log('Box file ID:', response.fileId);

Upload to Amazon S3

const s3Client = new MultiDriveUploader({
  provider: 's3',
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY,
    secretAccessKey: process.env.AWS_SECRET_KEY,
    bucket: 'my-app-uploads',
    region: 'eu-west-1'
  }
});

const response = await s3Client.uploadFile('./media/video.mp4', {
  prefix: 'user-uploads/2024',
  contentType: 'video/mp4',
  acl: 'private',
  metadata: { uploadedBy: 'system', category: 'media' }
});
console.log('S3 URL:', response.permalink);

Provider Comparison

| Feature | Google Drive | OneDrive | Zoho | Box | S3 |
|---------|:------------:|:--------:|:----:|:---:|:--:|
| Authentication | OAuth 2.0 | OAuth 2.0 | OAuth 2.0 | OAuth 2.0 | IAM Keys |
| Large File Support | Resumable | Session Upload | Standard | Chunked | Multipart |
| Regional Options | — | — | 6 Regions | — | All AWS Regions |
| Progress Tracking | — | — | — | — | ✓ |


Security Model

Server-Side Only

This package handles sensitive credentials and must only run on your backend. Never bundle it with frontend code.

┌─────────────┐     ┌─────────────┐     ┌────────────────────┐     ┌─────────────┐
│   Browser   │ ──► │  Your API   │ ──► │ MultiDriveUploader │ ──► │   Cloud     │
│  (Frontend) │     │  (Backend)  │     │                    │     │  Provider   │
└─────────────┘     └─────────────┘     └────────────────────┘     └─────────────┘

Credentials stay on your server. The frontend only communicates with your API.


Configuration Reference

Initialization

Create an uploader instance by passing a configuration object:

const uploader = new MultiDriveUploader(config);

Configuration Properties:

| Property | Type | Required | Default | Description |
|----------|------|:--------:|---------|-------------|
| provider | string | ✓ | — | One of: google-drive, onedrive, zoho, box, s3 |
| credentials | object | ✓ | — | Provider-specific authentication (see below) |
| timeout | number | | 30000 | HTTP request timeout (milliseconds) |
| maxFileSize | number | | null | Maximum upload size in bytes |
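The required/optional split above can be enforced before constructing an uploader. The following pre-flight check is an illustrative helper mirroring the table, not part of the package API:

```javascript
// Illustrative pre-flight check mirroring the configuration table above;
// validateConfig is a hypothetical helper, not exported by the package.
const SUPPORTED_PROVIDERS = ['google-drive', 'onedrive', 'zoho', 'box', 's3'];

function validateConfig(config) {
  if (!config || !SUPPORTED_PROVIDERS.includes(config.provider)) {
    throw new Error(`provider must be one of: ${SUPPORTED_PROVIDERS.join(', ')}`);
  }
  if (!config.credentials || typeof config.credentials !== 'object') {
    throw new Error('credentials object is required');
  }
  // Optional properties fall back to the documented defaults
  return {
    timeout: 30000,
    maxFileSize: null,
    ...config
  };
}
```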

Credentials by Provider

OAuth Providers (Google Drive, OneDrive, Box):

| Field | Required | Notes |
|-------|:--------:|-------|
| clientId | ✓ | OAuth 2.0 Client ID from provider console |
| clientSecret | ✓ | OAuth 2.0 Client Secret |
| refreshToken | ✓ | Long-lived refresh token (not access token) |
| folderId | | Destination folder; omit for root |

Zoho WorkDrive (additional fields):

| Field | Required | Notes |
|-------|:--------:|-------|
| folderId | ✓ | Parent folder is mandatory for Zoho |
| dataCenter | | Regional endpoint: US (default), EU, IN, AU, JP, CN |

Amazon S3 (IAM-based):

| Field | Required | Notes |
|-------|:--------:|-------|
| accessKeyId | ✓ | AWS IAM Access Key |
| secretAccessKey | ✓ | AWS IAM Secret Key |
| bucket | ✓ | Target S3 bucket name |
| region | | AWS region code (default: us-east-1) |


API Documentation

Core Method: uploadFile(input, options)

Uploads a file to your configured cloud storage.

Input Types:

| Type | Example | Notes |
|------|---------|-------|
| string | './docs/report.pdf' | Path to file on disk |
| Buffer | fs.readFileSync(path) | File contents in memory |
| Readable | fs.createReadStream(path) | Node.js readable stream |

Options Object:

| Option | Applies To | Description |
|--------|------------|-------------|
| filename | Buffer, Stream | Required when input is not a file path |
| folderId | OAuth providers | Override the default folder |
| key | S3 | Custom object key (overrides filename) |
| prefix | S3 | Key prefix, e.g., 'uploads/2024/' |
| contentType | S3 | MIME type (default: auto-detected) |
| acl | S3 | Access control list setting |
| metadata | S3 | Custom metadata as key-value pairs |
| onProgress | S3 | Callback: ({ loaded, total }) => {} |
| overrideNameExist | Zoho | Replace existing file with same name |
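The interplay of key, prefix, and filename for S3 can be sketched as a small helper. This is a hypothetical illustration of the semantics in the table, not the package's actual internals:

```javascript
// Hypothetical sketch of how the S3-specific options combine into a final
// object key, per the table above: an explicit `key` wins outright,
// otherwise `prefix` (if any) is joined with the filename.
function buildObjectKey({ key, prefix, filename }) {
  if (key) return key;                                  // `key` overrides filename entirely
  if (!prefix) return filename;
  return `${prefix.replace(/\/+$/, '')}/${filename}`;   // avoid double slashes
}
```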

Response Structure:

All providers return a normalized response:

{
  success: true,
  provider: 'onedrive',
  fileId: 'unique-file-identifier',
  filename: 'quarterly-report.pdf',
  size: 2048576,
  mimeType: 'application/pdf',
  permalink: 'https://...',      // Shareable URL
  downloadUrl: 'https://...',    // Direct download (when available)
  metadata: { /* raw provider response */ }
}
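Internally, each adapter maps the provider's raw payload onto this shape. A sketch of that normalization step for a Google-Drive-style payload follows; the field names on `raw` are assumptions about one provider's response, not the package's actual implementation:

```javascript
// Sketch of the normalization step, assuming a Google-Drive-style raw
// payload; the `raw` field names are illustrative assumptions.
function normalizeResponse(provider, raw) {
  return {
    success: true,
    provider,
    fileId: raw.id,
    filename: raw.name,
    size: Number(raw.size),
    mimeType: raw.mimeType,
    permalink: raw.webViewLink || null,
    downloadUrl: raw.webContentLink || null,
    metadata: raw  // keep the untouched provider response
  };
}
```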

Utility Methods

| Method | Returns | Description |
|--------|---------|-------------|
| getProvider() | string | Current provider key (e.g., 's3') |
| getProviderName() | string | Human-readable name (e.g., 'Amazon S3') |
| clearCache() | void | Forces token re-fetch on next request |
| MultiDriveUploader.getSupportedProviders() | string[] | List of valid provider keys |
| MultiDriveUploader.getProviderNames() | object | Map of provider keys to display names |
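The two static helpers pair naturally when building a provider-selection menu. The stub below only mimics the documented return shapes so the pattern is runnable; the display names are assumptions and may not match the package's exact strings:

```javascript
// Stub mimicking the documented return shapes of the static helpers,
// for illustrating a provider-selection menu; display names are assumptions.
const PROVIDER_NAMES = {
  'google-drive': 'Google Drive',
  onedrive: 'OneDrive',
  zoho: 'Zoho WorkDrive',
  box: 'Box',
  s3: 'Amazon S3'
};

function getSupportedProviders() {
  return Object.keys(PROVIDER_NAMES);
}

function buildProviderMenu() {
  // One { key, label } entry per supported provider
  return getSupportedProviders().map(key => ({ key, label: PROVIDER_NAMES[key] }));
}
```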


Server Integration Patterns

Express.js + Multer Example

const express = require('express');
const multer = require('multer');
const { MultiDriveUploader } = require('multi-drive-uploader');

const app = express();
const storage = multer.memoryStorage();
const fileHandler = multer({ storage });

// Singleton uploader for the application
const cloudUploader = new MultiDriveUploader({
  provider: 'google-drive',
  credentials: {
    clientId: process.env.GCP_CLIENT_ID,
    clientSecret: process.env.GCP_CLIENT_SECRET,
    refreshToken: process.env.GCP_REFRESH_TOKEN,
    folderId: process.env.GCP_FOLDER_ID
  }
});

app.post('/documents/upload', fileHandler.single('document'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No file provided' });
  }

  try {
    const uploadResult = await cloudUploader.uploadFile(req.file.buffer, {
      filename: req.file.originalname
    });

    res.status(201).json({
      message: 'File uploaded successfully',
      fileId: uploadResult.fileId,
      url: uploadResult.permalink
    });
  } catch (err) {
    console.error('Upload error:', err);
    res.status(500).json({ error: 'Upload failed' });
  }
});

app.listen(process.env.PORT || 8080);

Dynamic Provider Selection

Enable users to choose their preferred storage:

// Initialize all supported providers
const providers = {
  google: new MultiDriveUploader({ provider: 'google-drive', credentials: { /* ... */ } }),
  microsoft: new MultiDriveUploader({ provider: 'onedrive', credentials: { /* ... */ } }),
  amazon: new MultiDriveUploader({ provider: 's3', credentials: { /* ... */ } })
};

app.post('/files/:storageType', fileHandler.single('file'), async (req, res) => {
  const selectedProvider = providers[req.params.storageType];

  if (!selectedProvider) {
    return res.status(404).json({ error: `Unknown storage: ${req.params.storageType}` });
  }

  const result = await selectedProvider.uploadFile(req.file.buffer, {
    filename: req.file.originalname
  });

  res.json({ provider: req.params.storageType, ...result });
});


Error Handling

Multi Drive Uploader provides granular error types for precise error handling:

Available Error Classes

| Error Class | When Thrown |
|------------|-------------|
| `MultiUploaderError` | Base class for all package errors |
| `ConfigurationError` | Missing or invalid configuration options |
| `AuthenticationError` | OAuth token refresh failure or invalid credentials |
| `UploadError` | File upload operation failed |
| `FileInputError` | Invalid file input (missing file, unreadable, etc.) |
| `ProviderAPIError` | Cloud provider returned an error response |
| `RateLimitError` | API rate limit exceeded (includes `retryAfter` property) |
| `NetworkError` | Connection issues, DNS failures |
| `TimeoutError` | Request exceeded configured timeout |
| `UnsupportedProviderError` | Unknown provider specified |

Handling Errors Gracefully

const {
  MultiDriveUploader,
  AuthenticationError,
  RateLimitError,
  NetworkError,
  TimeoutError
} = require('multi-drive-uploader');

async function safeUpload(uploader, filePath, retriedAuth = false) {
  try {
    return await uploader.uploadFile(filePath);
  } catch (err) {
    switch (true) {
      case err instanceof AuthenticationError:
        if (retriedAuth) throw err;  // Give up after one re-auth attempt
        // Token expired - refresh credentials and retry once
        await refreshUserTokens();
        return safeUpload(uploader, filePath, true);

      case err instanceof RateLimitError:
        // Back off and retry
        console.warn(`Rate limited. Waiting ${err.retryAfter}s...`);
        await sleep(err.retryAfter * 1000);
        return safeUpload(uploader, filePath);

      case err instanceof NetworkError:
      case err instanceof TimeoutError:
        // Transient failure - retry with backoff
        await sleep(3000);
        return safeUpload(uploader, filePath);

      default:
        // Log and rethrow unexpected errors
        console.error('Unexpected upload error:', err);
        throw err;
    }
  }
}

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

Recommended Practices

Store Credentials Securely

Use environment variables or a secrets manager; never commit credentials to source control.

# .env file (add to .gitignore!)
DRIVE_CLIENT_ID=your-client-id
DRIVE_CLIENT_SECRET=your-client-secret
DRIVE_REFRESH_TOKEN=your-refresh-token

require('dotenv').config();

const uploader = new MultiDriveUploader({
  provider: 'google-drive',
  credentials: {
    clientId: process.env.DRIVE_CLIENT_ID,
    clientSecret: process.env.DRIVE_CLIENT_SECRET,
    refreshToken: process.env.DRIVE_REFRESH_TOKEN
  }
});

Validate Before Upload

Check file constraints on your server before attempting upload:

const path = require('path');

function validateFile(file) {
  const MAX_SIZE = 50 * 1024 * 1024;  // 50 MB
  const ALLOWED_EXTENSIONS = ['.pdf', '.docx', '.xlsx', '.png', '.jpg'];

  if (file.size > MAX_SIZE) {
    throw new Error(`File exceeds ${MAX_SIZE / 1024 / 1024}MB limit`);
  }

  const ext = path.extname(file.originalname).toLowerCase();
  if (!ALLOWED_EXTENSIONS.includes(ext)) {
    throw new Error(`File type ${ext} not permitted`);
  }

  return true;
}

Implement Retry with Exponential Backoff

const { RateLimitError, NetworkError, TimeoutError } = require('multi-drive-uploader');

async function uploadWithBackoff(uploader, file, opts = {}, attempt = 1) {
  const MAX_ATTEMPTS = 5;

  try {
    return await uploader.uploadFile(file, opts);
  } catch (err) {
    if (attempt >= MAX_ATTEMPTS) throw err;

    const isRetryable = err instanceof RateLimitError ||
                        err instanceof NetworkError ||
                        err instanceof TimeoutError;

    if (!isRetryable) throw err;

    const delay = Math.min(1000 * Math.pow(2, attempt), 30000);
    console.log(`Attempt ${attempt} failed. Retrying in ${delay}ms...`);
    await sleep(delay);

    return uploadWithBackoff(uploader, file, opts, attempt + 1);
  }
}

Track S3 Upload Progress

For large files uploaded to S3, provide user feedback:

const result = await s3Uploader.uploadFile(videoBuffer, {
  filename: 'presentation.mp4',
  contentType: 'video/mp4',
  onProgress: ({ loaded, total }) => {
    const pct = ((loaded / total) * 100).toFixed(1);
    process.stdout.write(`\rUploading: ${pct}%`);
  }
});
console.log('\nUpload complete!');

Common Issues & Solutions

"Token refresh failed"

| Cause | Fix |
|-------|-----|
| Expired refresh token | Re-authorize user through OAuth flow |
| Revoked app access | User must re-grant permissions |
| Wrong data center (Zoho) | Verify dataCenter matches account region |
| Invalid client credentials | Double-check Client ID/Secret |

"Folder not found"

| Cause | Fix |
|-------|-----|
| Deleted or moved folder | Confirm folder still exists in provider |
| Insufficient permissions | Ensure the OAuth token has folder access |
| Incorrect folder ID format | Some providers use paths, others use IDs |

"Access Denied" (S3)

| Cause | Fix |
|-------|-----|
| Missing IAM permissions | Add s3:PutObject to IAM policy |
| Bucket policy denies access | Review bucket policy statements |
| Wrong region configured | Match region to bucket location |
| Disabled/rotated access keys | Generate new access keys in IAM |

"Request timeout"

| Cause | Fix |
|-------|-----|
| File too large for timeout | Increase timeout in config |
| Slow network connection | Use smaller files or better connection |
| Provider outage | Check provider status page |


Official Documentation

Provider-specific API documentation:


License

Released under the MIT License.


Contributing

Found a bug or have a feature request? Open an issue or submit a pull request on GitHub.

github.com/techleadaadvik/multi-drive-uploader