
@kadi.build/cloud-file-service

v1.3.3


Cloud File Service Manager

A comprehensive Node.js library and CLI tool for managing files across multiple cloud storage providers: Dropbox, Google Drive, and Box. Provides a unified interface for cloud file operations with a three-tier credential management system.


🌟 Features

  • 🌐 Multi-Cloud Support: Unified interface for Dropbox, Google Drive, and Box
  • 📁 Complete File Management: Full CRUD operations for files and folders
  • 🔍 Advanced Search: Find files across all cloud providers with powerful search capabilities
  • 📤 Smart Uploads: Automatic chunked uploads for large files with progress tracking
  • 🔐 Three-Tier Credential Management: Native KĀDI vault → CLI vault → .env fallback
  • 🔔 Token Refresh Callback: Pluggable persistence for refreshed tokens (vault, DB, etc.)
  • 🛠️ CLI & Library: Use as command-line tool or integrate as Node.js library
  • 📝 config.yml Walk-Up: Non-secret settings loaded from nearest config.yml in directory tree
  • 🔄 Robust Error Handling: Comprehensive error handling and retry logic
  • 📊 Progress Tracking: Real-time progress for long-running operations
  • 🎯 Path Management: Automatic folder creation and path normalization
  • ⚡ Performance Optimized: Efficient memory usage and streaming for large files


🚀 Installation

As a Node.js Library (npm)

npm install @kadi.build/cloud-file-service

const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');

const config = await ConfigManager.create();
const cloudManager = new CloudStorageManager(config);

await cloudManager.uploadFile('dropbox', './myfile.zip', '/backups/myfile.zip');

As a KĀDI Ability

kadi install cloud-file-manager

As a Standalone CLI Tool

git clone <repository-url>
cd cloud-file-manager-ability
npm install

⚡ Quick Start

  1. Install dependencies

    npm install
  2. Set up credentials (choose one)

    Option A — KĀDI vault (recommended if kadi is installed):

    kadi secret set -v cloud DROPBOX_CLIENT_ID "your-client-id"
    kadi secret set -v cloud DROPBOX_CLIENT_SECRET "your-secret"
    kadi secret set -v cloud DROPBOX_REFRESH_TOKEN "your-token"
    # ... repeat for each provider

    Option B — .env file (no kadi required):

    cp .env_example .env
    # Edit .env with your cloud provider credentials
  3. Set up non-secret config (optional)

    cp config_example.yml config.yml
    # Edit config.yml for chunk sizes, timeouts, etc.
  4. Test your configuration

    npm run validate
    npm run info
  5. Upload your first file

    node index.js upload --file document.pdf --service dropbox

⚙️ Configuration

Configuration is split into two systems:

| System | What goes here | Storage |
|--------|----------------|---------|
| config.yml | Non-secret settings (chunk sizes, timeouts, default provider) | Committed to repo |
| Vault / .env | Secrets (API keys, OAuth tokens) | Never committed |

config.yml (Non-Secret Settings)

Copy the example and customise:

cp config_example.yml config.yml

The library uses walk-up discovery — it searches from the current directory upward until it finds a config.yml. Settings live under the cloud-file-manager: key:

cloud-file-manager:
  default_provider: dropbox
  default_backup_directory: /cloud-file-manager
  default_download_directory: ./downloads
  max_retry_attempts: 3
  chunk_size: 8388608          # 8 MB
  timeout_ms: 300000           # 5 minutes
  log_level: info

Environment variables override any config.yml value with the CFS_ prefix:

CFS_DEFAULT_PROVIDER=box    # overrides default_provider
CFS_LOG_LEVEL=debug          # overrides log_level
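The override rule maps CFS_SOME_KEY onto the some_key setting. A minimal sketch of that merge (applyEnvOverrides is an illustrative helper, not part of the library's API):

```javascript
// Overlay CFS_-prefixed environment variables onto settings loaded from
// config.yml. Env values win over file values.
function applyEnvOverrides(settings, env = process.env) {
  const result = { ...settings };
  for (const [name, value] of Object.entries(env)) {
    if (!name.startsWith('CFS_')) continue;
    // CFS_DEFAULT_PROVIDER -> default_provider
    const key = name.slice(4).toLowerCase();
    result[key] = value;
  }
  return result;
}
```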

Required Credentials

| Provider | Required Variables |
|----------|--------------------|
| Dropbox | DROPBOX_CLIENT_ID, DROPBOX_CLIENT_SECRET, DROPBOX_REFRESH_TOKEN (or DROPBOX_ACCESS_TOKEN for a short-lived token) |
| Google Drive | GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET, GOOGLE_REFRESH_TOKEN (or GOOGLE_SERVICE_ACCOUNT_KEY / GOOGLE_SERVICE_ACCOUNT_KEY_PATH) |
| Box | BOX_CLIENT_ID, BOX_CLIENT_SECRET, BOX_ACCESS_TOKEN, BOX_REFRESH_TOKEN |

Validation

Check your configuration:

npm run validate
npm run info  # Show configured services

🔐 Credential Management

Credentials (API keys, OAuth tokens) are loaded via a three-tier fallback. The library picks the best available method automatically:

| Tier | Method | When Used | Speed |
|------|--------|-----------|-------|
| 1 | Native secret-ability | kadiClient provided to ConfigManager | Fastest (in-process) |
| 2 | KĀDI CLI (kadi secret) | kadi is installed but no kadiClient | Fast (subprocess) |
| 3 | .env file | No KĀDI at all | Immediate (file read) |

All three tiers load from the same set of credential keys. The vault name is cloud.
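The selection order mirrors the table: native client first, CLI second, .env last. A sketch of that decision (pickCredentialTier is hypothetical — the real logic lives inside ConfigManager):

```javascript
// Choose which credential tier to use, in the order described above.
function pickCredentialTier({ kadiClient = null, kadiCliAvailable = false } = {}) {
  if (kadiClient) return 1;       // Tier 1: native secret-ability (in-process)
  if (kadiCliAvailable) return 2; // Tier 2: kadi secret get/set subprocess
  return 3;                       // Tier 3: .env file fallback
}
```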

Tier 1 — Native Secret-Ability (KĀDI Agent Context)

When running as a KĀDI ability/agent, pass a KadiClient to ConfigManager:

const { KadiClient } = require('@kadi.build/core');
const { ConfigManager, CloudStorageManager } = require('@kadi.build/cloud-file-service');

const client = new KadiClient({ name: 'my-agent', version: '1.0.0' });
const config = await ConfigManager.create({ kadiClient: client });
const cloud = new CloudStorageManager(config);

Credentials are loaded via client.loadNative('secret-ability') — no subprocesses, no file I/O.

Tier 2 — KĀDI CLI (Developer Machine)

If you have kadi installed (e.g. via npm install -g @kadi.build/kadi) but aren't running inside a KĀDI agent, the library will automatically use kadi secret get/set to read from and write to the vault:

# Store credentials once
kadi secret set -v cloud DROPBOX_CLIENT_ID "your-client-id"
kadi secret set -v cloud DROPBOX_CLIENT_SECRET "your-secret"
kadi secret set -v cloud DROPBOX_REFRESH_TOKEN "your-token"
kadi secret set -v cloud DROPBOX_ACCESS_TOKEN "your-access-token"
# ... repeat for Google Drive and Box keys

const { ConfigManager, CloudStorageManager } = require('@kadi.build/cloud-file-service');

// No kadiClient needed — CLI vault is detected automatically
const config = await ConfigManager.create();
const cloud = new CloudStorageManager(config);

Tier 3 — .env File (Standalone / npm Consumer)

If KĀDI is not installed at all, the library falls back to a standard .env file:

cp .env_example .env
# Edit .env with your credentials

# Dropbox
DROPBOX_CLIENT_ID=your-client-id
DROPBOX_CLIENT_SECRET=your-secret
DROPBOX_REFRESH_TOKEN=your-refresh-token
DROPBOX_ACCESS_TOKEN=your-access-token

# Google Drive (OAuth)
GOOGLE_CLIENT_ID=your-google-client-id
GOOGLE_CLIENT_SECRET=your-google-secret
GOOGLE_REFRESH_TOKEN=your-google-refresh-token

# Google Drive (Service Account — alternative to OAuth)
# GOOGLE_SERVICE_ACCOUNT_KEY='{"type":"service_account",...}'
# GOOGLE_SERVICE_ACCOUNT_KEY_PATH=/path/to/service-account.json

# Box
BOX_CLIENT_ID=your-box-client-id
BOX_CLIENT_SECRET=your-box-secret
BOX_ACCESS_TOKEN=your-box-access-token
BOX_REFRESH_TOKEN=your-box-refresh-token

Environment Variable Overrides

Regardless of which tier loads the credentials, individual environment variables always override:

export DROPBOX_ACCESS_TOKEN="override-token"
# This takes priority over vault or .env values

Token Refresh Persistence

When OAuth tokens are refreshed at runtime, the library persists them back using the same three-tier strategy:

  1. Native — writes to vault via secret-ability
  2. CLI — writes to vault via kadi secret set
  3. Neither — logs new tokens to console (manual save needed)
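The write-back order above can be sketched as simple fall-through logic (the writer functions here are hypothetical stand-ins for the real native and CLI persistence paths):

```javascript
// Persist refreshed tokens through the first available tier:
// native vault -> CLI vault -> console log (manual save needed).
function persistTokens(tokens, { nativeWrite, cliWrite, log = console.log } = {}) {
  if (nativeWrite) { nativeWrite(tokens); return 'native'; }
  if (cliWrite) { cliWrite(tokens); return 'cli'; }
  log('New tokens (save manually):', tokens);
  return 'console';
}
```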

You can also supply a custom callback (see Library Usage).

🔐 OAuth Setup

Automated OAuth Setup (Recommended)

Dropbox OAuth Setup

npm run setup:dropbox

This will:

  • Open your browser for authorization
  • Handle the OAuth flow automatically
  • Save refresh tokens to vault (if kadi installed) or .env
  • Test the connection

Google Drive OAuth Setup

npm run setup:googledrive

This will:

  • Guide you through Google Cloud Console setup
  • Handle the OAuth flow automatically
  • Save refresh tokens to vault (if kadi installed) or .env
  • Test the connection

Manual OAuth Setup

If the automated setup fails:

# Dropbox manual token exchange
npm run setup:dropbox-manual

# Google Drive manual token exchange
npm run setup:googledrive-manual

Cloud Provider Setup Guides

Dropbox

  1. Go to Dropbox App Console
  2. Create a new app with "Scoped access" and "Full Dropbox" access
  3. Enable these permissions in the Permissions tab:
    • files.metadata.read - View file and folder metadata
    • files.metadata.write - Edit file and folder metadata
    • files.content.read - View file contents
    • files.content.write - Edit file contents
  4. Add redirect URI: http://localhost:8080/callback
  5. Automated setup: Run npm run setup:dropbox
  6. Manual setup: Generate access token and add to .env

Google Drive

  1. Go to Google Cloud Console
  2. Create a new project or select an existing one
  3. Enable the Google Drive API
  4. Configure OAuth consent screen with external user type
  5. Create OAuth 2.0 credentials (Desktop application type)
  6. Add redirect URI: http://localhost:8080/callback
  7. Automated setup: Run npm run setup:googledrive
  8. Manual setup: Use OAuth playground to get refresh token

Box

  1. Go to Box Developer Console
  2. Create a new "Custom App" with "Standard OAuth 2.0"
  3. Configure OAuth settings and get client credentials
  4. Add redirect URI: http://localhost:8080/callback
  5. Use OAuth flow to obtain access and refresh tokens
  6. Add credentials to your .env file

💻 CLI Usage

Basic Commands

Test connections:

node index.js test                    # Test all configured services
node index.js test --service dropbox  # Test specific service

Upload files:

node index.js upload --file document.pdf --service dropbox
node index.js upload --file archive.zip --service googledrive --directory /documents

Download files:

node index.js download --service dropbox --remote /uploads/document.pdf
node index.js download --service googledrive --remote /documents/archive.zip --local ./downloads

List files:

node index.js list --service dropbox                    # List files in default directory
node index.js list --service googledrive --directory /documents

Advanced Operations

File management:

# Copy files
node index.js copy --service dropbox --remote /file.pdf --destination /backup/file.pdf

# Rename files
node index.js rename --service googledrive --remote /old-name.pdf --name new-name.pdf

# Delete files
node index.js delete --service box --remote /old-file.pdf --yes

# Get file information
node index.js info --service dropbox --remote /uploads/document.pdf

Folder operations:

# Create folders
node index.js mkdir --service dropbox --remote /new-project-folder

# List folders
node index.js ls-folders --service googledrive --directory /projects

# Delete folders
node index.js rmdir --service box --remote /old-project --force --yes

Search files:

node index.js search --service googledrive --query "quarterly report"
node index.js search --service dropbox --query "*.pdf" --limit 50

Automated Scripts

# Run comprehensive tests
npm test
npm run test:all

# Clean up test files
npm run clean

# Setup all providers
npm run setup

📚 Library Usage

Basic Integration (npm Consumer)

const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');

class MyApp {
  constructor() {
    this.cloudManager = null;
    this.config = null;
  }

  async initialize() {
    // ConfigManager auto-detects: vault (native/CLI) → .env → env vars
    this.config = await ConfigManager.create();
    this.cloudManager = new CloudStorageManager(this.config);

    // Validate configuration
    const validation = this.config.validate();
    if (!validation.isValid) {
      throw new Error(`Configuration errors: ${validation.errors.join(', ')}`);
    }
  }

  async uploadFile(provider, localPath, remotePath) {
    return await this.cloudManager.uploadFile(provider, localPath, remotePath);
  }

  async listFiles(provider, directory = '/') {
    return await this.cloudManager.listFiles(provider, directory);
  }
}

Integration with KĀDI Agent

const { KadiClient } = require('@kadi.build/core');
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');

const client = new KadiClient({ name: 'my-agent', version: '1.0.0' });

// Pass kadiClient for native vault access (fastest tier)
const config = await ConfigManager.create({ kadiClient: client });
const cloudManager = new CloudStorageManager(config);

console.log('Services:', config.getConfiguredServices());

Token Refresh Callback (Custom Persistence)

By default, when a provider refreshes its OAuth tokens the library writes the new values back through the three-tier system (native vault → CLI vault → console log). If you manage credentials externally (database, custom vault, etc.) you can supply an onTokenRefresh callback as the second argument to CloudStorageManager. When provided, the callback is called instead of the default persistence. If the callback throws, the provider falls back to vault/CLI automatically.

const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');

const config = new ConfigManager();
await config.load();

const cloudManager = new CloudStorageManager(config, {
  /**
   * Called when any provider refreshes its OAuth tokens.
   * @param {string} provider  - 'dropbox' | 'googledrive' | 'box'
   * @param {object} tokens
   * @param {string} tokens.accessToken   - New access token
   * @param {string} [tokens.refreshToken] - New refresh token (if rotated)
   * @param {number} [tokens.expiresAt]    - Expiry time (epoch ms)
   */
  onTokenRefresh: async (provider, tokens) => {
    // Example: persist to a secret vault
    await vault.set(`${provider}_access_token`, tokens.accessToken);
    if (tokens.refreshToken) {
      await vault.set(`${provider}_refresh_token`, tokens.refreshToken);
    }
    console.log(`Saved refreshed ${provider} tokens to vault`);
  },
});

The callback receives the provider name ('dropbox', 'googledrive', or 'box') and a tokens object. The same interface works identically for all three providers.

Advanced Integration

const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');

class DocumentManager {
  async initialize() {
    this.config = await ConfigManager.create();
    this.cloudManager = new CloudStorageManager(this.config);
  }

  async backupDocument(document, metadata = {}) {
    const timestamp = new Date().toISOString().split('T')[0];
    const remotePath = `/backups/${timestamp}/${document.name}`;
    
    try {
      // Upload to primary storage
      const result = await this.cloudManager.uploadFile(
        'dropbox', 
        document.path, 
        remotePath
      );
      
      // Create backup copy on secondary storage
      await this.cloudManager.uploadFile(
        'googledrive', 
        document.path, 
        remotePath
      );
      
      return {
        success: true,
        primary: { service: 'dropbox', fileId: result.id },
        backup: { service: 'googledrive', path: remotePath },
        timestamp: new Date().toISOString()
      };
    } catch (error) {
      throw new Error(`Document backup failed: ${error.message}`);
    }
  }

  async syncDocuments(sourceProvider, targetProvider, directory = '/') {
    const files = await this.cloudManager.listFiles(sourceProvider, directory);
    const results = [];

    for (const file of files) {
      try {
        // Download from source
        const localPath = `./temp/${file.name}`;
        await this.cloudManager.downloadFile(sourceProvider, file.path, localPath);
        
        // Upload to target
        await this.cloudManager.uploadFile(targetProvider, localPath, file.path);
        
        // Clean up temp file
        await fs.unlink(localPath);
        
        results.push({ file: file.name, status: 'synced' });
      } catch (error) {
        results.push({ file: file.name, status: 'failed', error: error.message });
      }
    }
    
    return results;
  }
}

Batch Operations

// Upload multiple files
const uploadResults = await cloudManager.uploadMultipleFiles(
  'dropbox',
  ['./file1.pdf', './file2.pdf', './file3.pdf'],
  '/batch-upload'
);

// Download multiple files
const downloadResults = await cloudManager.downloadMultipleFiles(
  'googledrive',
  ['/docs/file1.pdf', '/docs/file2.pdf'],
  './downloads'
);

// Sync directories
const syncResults = await cloudManager.syncDirectory(
  'dropbox',
  './local-folder',
  '/remote-folder',
  { dryRun: false, deleteRemote: false, overwrite: true }
);

🤖 KĀDI Ability Usage

This library powers the cloud-storage-ability — a KĀDI ability that exposes cloud file operations as 17 broker-callable tools. See the cloud-storage-ability for the full agent wrapper.

agent.json

The library ships with an agent.json that declares it as a KĀDI ability:

{
  "name": "cloud-file-manager",
  "version": "1.3.0",
  "kind": "ability",
  "abilities": {
    "secret-ability": "^0.9.1"
  }
}

How Credentials Flow

┌──────────────────────────────────────────────────────────┐
│  KĀDI Agent (e.g. cloud-storage-ability)                 │
│                                                          │
│  KadiClient ──► ConfigManager({ kadiClient })            │
│                      │                                   │
│                      ▼                                   │
│              ┌── Tier 1: native secret-ability ───┐      │
│              │  client.loadNative('secret-ability')│      │
│              │  secrets.invoke('get', {vault,key}) │      │
│              └────────────────────────────────────┘      │
└──────────────────────────────────────────────────────────┘

┌──────────────────────────────────────────────────────────┐
│  Developer Machine (kadi installed, no kadiClient)       │
│                                                          │
│  ConfigManager.create()                                  │
│              │                                           │
│              ▼                                           │
│       ┌── Tier 2: kadi CLI ───────────────┐              │
│       │  kadi secret get -v cloud <KEY>   │              │
│       └───────────────────────────────────┘              │
└──────────────────────────────────────────────────────────┘

┌──────────────────────────────────────────────────────────┐
│  npm Consumer (no kadi installed)                        │
│                                                          │
│  ConfigManager.create()                                  │
│              │                                           │
│              ▼                                           │
│       ┌── Tier 3: .env file ──────────────┐              │
│       │  reads .env from CWD              │              │
│       └───────────────────────────────────┘              │
└──────────────────────────────────────────────────────────┘

🧪 Testing

Automated Testing

Run comprehensive tests for all providers:

npm test                    # Interactive test selection
npm run test:all           # Test all providers sequentially
npm run test:dropbox       # Test Dropbox only
npm run test:googledrive   # Test Google Drive only
npm run test:box           # Test Box only

Test Categories

The test suite includes:

  • Connection & Authentication - Verify credentials and API access
  • Basic File Operations - Upload, download, list, info
  • Folder Operations - Create, delete, rename, list
  • File Management - Copy, rename, delete operations
  • Search & Query - File search functionality
  • Large File Handling - Chunked upload/download for 2MB+ files
  • Edge Cases & Error Handling - Invalid paths, missing files
  • Performance & Stress - Batch operations, timing tests

Test Results

Tests generate detailed reports in ./test-results/ with:

  • Pass/fail statistics
  • Performance metrics
  • Error details
  • Timing information

Cleanup

Clean up test artifacts:

npm run clean  # Remove test files and directories

📖 API Reference

CloudStorageManager

File Operations

  • uploadFile(serviceName, localPath, remotePath) - Upload a file
  • downloadFile(serviceName, remotePath, localPath) - Download a file
  • getFileInfo(serviceName, remotePath) - Get file metadata
  • listFiles(serviceName, remotePath, options) - List files in directory
  • deleteFile(serviceName, remotePath) - Delete a file
  • renameFile(serviceName, remotePath, newName) - Rename a file
  • copyFile(serviceName, sourcePath, destinationPath) - Copy a file

Folder Operations

  • createFolder(serviceName, remotePath) - Create a folder
  • listFolders(serviceName, remotePath) - List folders in directory
  • deleteFolder(serviceName, remotePath, recursive) - Delete a folder
  • renameFolder(serviceName, remotePath, newName) - Rename a folder
  • getFolderInfo(serviceName, remotePath) - Get folder metadata

Temporary Links

  • getTemporaryLink(serviceName, remotePath) - Get a temporary signed download URL

Search & Utility

  • searchFiles(serviceName, query, options) - Search for files
  • testConnection(serviceName) - Test service connection
  • getAvailableServices() - Get list of configured services

Batch Operations

  • uploadMultipleFiles(serviceName, fileList, directory) - Upload multiple files
  • downloadMultipleFiles(serviceName, fileList, directory) - Download multiple files
  • syncDirectory(serviceName, localDir, remoteDir, options) - Sync directories

ConfigManager

  • ConfigManager.create(options?) - Static factory: create + load in one call
  • load() - Load configuration (vault → .env → env vars → defaults)
  • get(key) - Get configuration value
  • set(key, value) - Set configuration value
  • validate() - Validate current configuration
  • getConfiguredServices() - Get list of configured services
  • getSummary() - Get configuration summary
  • safeUpdate(updates) - Persist credential changes (vault → .env)
  • persistProviderTokens(provider, tokens) - Persist refreshed OAuth tokens
  • loadFromConfigYml() - Reload non-secret settings from config.yml

⚡ Performance

Provider Features Matrix

| Feature | Dropbox | Google Drive | Box |
|---------|---------|--------------|-----|
| File Upload/Download | ✅ | ✅ | ✅ |
| Large File Chunking | ✅ (150MB+) | ✅ (5MB+) | ✅ (20MB+) |
| Progress Tracking | ✅ | ✅ | ✅ |
| Checksum Verification | ✅ (SHA256) | ✅ (MD5) | ✅ (SHA1) |
| Folder Operations | ✅ | ✅ | ✅ |
| File Search | ✅ | ✅ | ✅ |
| OAuth Refresh | ✅ | ✅ | ✅ |
| Batch Operations | ✅ | ✅ | ✅ |
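Using the chunking thresholds from the matrix, the decision to switch to a chunked upload can be sketched as below (illustrative only — the real thresholds and logic live inside the provider implementations):

```javascript
// Size above which each provider switches to chunked upload,
// per the feature matrix above.
const CHUNK_THRESHOLDS = {
  dropbox: 150 * 1024 * 1024,   // 150 MB
  googledrive: 5 * 1024 * 1024, // 5 MB
  box: 20 * 1024 * 1024,        // 20 MB
};

function usesChunkedUpload(provider, fileSizeBytes) {
  const threshold = CHUNK_THRESHOLDS[provider];
  if (threshold === undefined) throw new Error(`Unknown provider: ${provider}`);
  return fileSizeBytes >= threshold;
}
```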

Performance Optimizations

  • Automatic Chunking: Large files split into optimal chunks per provider
  • Memory Efficiency: Streaming uploads minimize memory usage
  • Retry Logic: Automatic retry with exponential backoff
  • Concurrent Operations: Parallel uploads/downloads where supported
  • Token Management: Automatic refresh prevents authentication failures
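The retry-with-exponential-backoff behaviour can be sketched as follows (a generic pattern, not the library's internal implementation; the base delay and cap are assumed values):

```javascript
// Delay before retry attempt n (0-based): base * 2^n, capped at maxMs.
function backoffDelay(attempt, baseMs = 500, maxMs = 30000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Run an async operation, retrying with exponential backoff on failure.
async function withRetry(fn, { attempts = 3, baseMs = 500 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, backoffDelay(i, baseMs)));
      }
    }
  }
  throw lastError;
}
```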

Best Practices

// Use batch operations for multiple files
const results = await cloudManager.uploadMultipleFiles(
  'dropbox', 
  fileList, 
  '/uploads'
);

// Handle large files efficiently
const result = await cloudManager.uploadFile(
  'googledrive',
  './large-file.zip',  // Automatically uses chunked upload
  '/backups/large-file.zip'
);

// Monitor progress for long operations
console.log('Upload completed:', result.name);

🔧 Error Handling

The service provides comprehensive error handling for:

  • Authentication Errors: Invalid or expired tokens (automatic refresh)
  • Network Issues: Connection timeouts and retries
  • File Operations: Not found, permission denied, quota exceeded
  • Rate Limiting: Automatic backoff and retry logic
  • Validation Errors: Invalid parameters and configurations

Error Types

try {
  await cloudManager.uploadFile('dropbox', './file.pdf', '/upload/file.pdf');
} catch (error) {
  if (error.message.includes('quota_exceeded')) {
    console.log('Storage quota exceeded');
  } else if (error.message.includes('unauthorized')) {
    console.log('Authentication failed - check credentials');
  } else if (error.message.includes('not_found')) {
    console.log('File or folder not found');
  }
}

🤝 Contributing

We welcome contributions! Here's how to get started:

Development Setup

git clone <repository-url>
cd cloud-file-service
npm install
npm run setup

Adding New Providers

  1. Create provider file in src/providers/
  2. Implement all methods from spec.md
  3. Add configuration support in configManager.js
  4. Update cloudStorageManager.js
  5. Add comprehensive tests
  6. Update documentation

Testing Contributions

npm run lint          # Check code style
npm run validate      # Validate configuration
npm test             # Run test suite

Guidelines

  • Follow the Provider Integration Guide in Cloud Provider Integration Spec.md
  • Maintain feature parity across providers
  • Add comprehensive error handling
  • Include performance optimizations
  • Update documentation

📄 License

MIT License - see LICENSE file for details.

🆘 Support

Common Issues

"Provider not configured" error:

  • Run npm run validate to check configuration
  • Run npm run info to see configured services
  • Check vault: kadi secret get -v cloud DROPBOX_CLIENT_ID (if kadi installed)
  • Check .env file has correct credentials (if no kadi)

OAuth setup fails:

  • Ensure redirect URI is http://localhost:8080/callback
  • Check firewall/antivirus isn't blocking port 8080
  • Try manual setup: npm run setup:dropbox-manual

Large file upload issues:

  • Check stable internet connection
  • Monitor progress output for specific errors
  • Verify sufficient storage quota in cloud provider

Cloud File Service Manager - Unified cloud storage for any application 🚀