@kadi.build/cloud-file-service
v1.3.3
Cloud File Service Manager
A comprehensive Node.js library and CLI tool for managing files across multiple cloud storage providers: Dropbox, Google Drive, and Box. Provides a unified interface for cloud file operations with a three-tier credential management system.
🌟 Features
- 🌐 Multi-Cloud Support: Unified interface for Dropbox, Google Drive, and Box
- 📁 Complete File Management: Full CRUD operations for files and folders
- 🔍 Advanced Search: Find files across all cloud providers with powerful search capabilities
- 📤 Smart Uploads: Automatic chunked uploads for large files with progress tracking
- 🔐 Three-Tier Credential Management: Native KĀDI vault → CLI vault → .env fallback
- 🔔 Token Refresh Callback: Pluggable persistence for refreshed tokens (vault, DB, etc.)
- 🛠️ CLI & Library: Use as command-line tool or integrate as Node.js library
- 📝 config.yml Walk-Up: Non-secret settings loaded from the nearest config.yml in the directory tree
- 🔄 Robust Error Handling: Comprehensive error handling and retry logic
- 📊 Progress Tracking: Real-time progress for long-running operations
- 🎯 Path Management: Automatic folder creation and path normalization
- ⚡ Performance Optimized: Efficient memory usage and streaming for large files
📋 Table of Contents
- Installation
- Quick Start
- Configuration
- Credential Management
- OAuth Setup
- CLI Usage
- Library Usage
- KĀDI Ability Usage
- Testing
- API Reference
- Performance
- Error Handling
- Contributing
🚀 Installation
As a Node.js Library (npm)
npm install @kadi.build/cloud-file-service
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');
const config = await ConfigManager.create();
const cloudManager = new CloudStorageManager(config);
await cloudManager.uploadFile('dropbox', './myfile.zip', '/backups/myfile.zip');
As a KĀDI Ability
kadi install cloud-file-manager
As a Standalone CLI Tool
git clone <repository-url>
cd cloud-file-manager-ability
npm install
⚡ Quick Start
Install dependencies
npm install
Set up credentials (choose one)
Option A — KĀDI vault (recommended if kadi is installed):
kadi secret set -v cloud DROPBOX_CLIENT_ID "your-client-id"
kadi secret set -v cloud DROPBOX_CLIENT_SECRET "your-secret"
kadi secret set -v cloud DROPBOX_REFRESH_TOKEN "your-token"
# ... repeat for each provider
Option B — .env file (no kadi required):
cp .env_example .env
# Edit .env with your cloud provider credentials
Set up non-secret config (optional)
cp config_example.yml config.yml
# Edit config.yml for chunk sizes, timeouts, etc.
Test your configuration
npm run validate
npm run info
Upload your first file
node index.js upload --file document.pdf --service dropbox
⚙️ Configuration
Configuration is split into two systems:
| System | What goes here | Storage |
|--------|---------------|---------|
| config.yml | Non-secret settings (chunk sizes, timeouts, default provider) | Committed to repo |
| Vault / .env | Secrets (API keys, OAuth tokens) | Never committed |
config.yml (Non-Secret Settings)
Copy the example and customise:
cp config_example.yml config.yml
The library uses walk-up discovery — it searches from the current directory upward until it finds a config.yml. Settings live under the cloud-file-manager: key:
cloud-file-manager:
default_provider: dropbox
default_backup_directory: /cloud-file-manager
default_download_directory: ./downloads
max_retry_attempts: 3
chunk_size: 8388608 # 8 MB
timeout_ms: 300000 # 5 minutes
  log_level: info
Environment variables override any config.yml value with the CFS_ prefix:
CFS_DEFAULT_PROVIDER=box # overrides default_provider
CFS_LOG_LEVEL=debug         # overrides log_level
Required Credentials
| Provider | Required Variables |
|----------|-------------------|
| Dropbox | DROPBOX_CLIENT_ID, DROPBOX_CLIENT_SECRET, DROPBOX_REFRESH_TOKEN (or DROPBOX_ACCESS_TOKEN for short-lived token) |
| Google Drive | GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET, GOOGLE_REFRESH_TOKEN (or GOOGLE_SERVICE_ACCOUNT_KEY / GOOGLE_SERVICE_ACCOUNT_KEY_PATH) |
| Box | BOX_CLIENT_ID, BOX_CLIENT_SECRET, BOX_ACCESS_TOKEN, BOX_REFRESH_TOKEN |
Validation
Check your configuration:
npm run validate
npm run info # Show configured services
🔐 Credential Management
Credentials (API keys, OAuth tokens) are loaded via a three-tier fallback. The library picks the best available method automatically:
| Tier | Method | When Used | Speed |
|------|--------|-----------|-------|
| 1 | Native secret-ability | kadiClient provided to ConfigManager | Fastest (in-process) |
| 2 | KĀDI CLI (kadi secret) | kadi is installed but no kadiClient | Fast (subprocess) |
| 3 | .env file | No KĀDI at all | Immediate (file read) |
All three tiers load from the same set of credential keys. The vault name is cloud.
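The tier selection above amounts to a simple priority check. A minimal, illustrative sketch (not the library's internals; detection of the kadi CLI is stubbed as a boolean so the logic is easy to test):

```javascript
// Illustrative: choose the credential source in priority order.
function selectCredentialTier({ kadiClient = null, hasKadiCli = false, hasEnvFile = false } = {}) {
  if (kadiClient) return 'native'; // Tier 1: in-process secret-ability
  if (hasKadiCli) return 'cli';    // Tier 2: `kadi secret` subprocess
  if (hasEnvFile) return 'env';    // Tier 3: .env file
  return 'none';
}

console.log(selectCredentialTier({ hasKadiCli: true, hasEnvFile: true })); // 'cli'
```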
Tier 1 — Native Secret-Ability (KĀDI Agent Context)
When running as a KĀDI ability/agent, pass a KadiClient to ConfigManager:
const { KadiClient } = require('@kadi.build/core');
const { ConfigManager, CloudStorageManager } = require('@kadi.build/cloud-file-service');
const client = new KadiClient({ name: 'my-agent', version: '1.0.0' });
const config = await ConfigManager.create({ kadiClient: client });
const cloud = new CloudStorageManager(config);
Credentials are loaded via client.loadNative('secret-ability') — no subprocesses, no file I/O.
Tier 2 — KĀDI CLI (Developer Machine)
If you have kadi installed (e.g. via npm install -g @kadi.build/kadi) but aren't running inside a KĀDI agent, the library will automatically use kadi secret get/set to read from and write to the vault:
# Store credentials once
kadi secret set -v cloud DROPBOX_CLIENT_ID "your-client-id"
kadi secret set -v cloud DROPBOX_CLIENT_SECRET "your-secret"
kadi secret set -v cloud DROPBOX_REFRESH_TOKEN "your-token"
kadi secret set -v cloud DROPBOX_ACCESS_TOKEN "your-access-token"
# ... repeat for Google Drive and Box keys
const { ConfigManager, CloudStorageManager } = require('@kadi.build/cloud-file-service');
// No kadiClient needed — CLI vault is detected automatically
const config = await ConfigManager.create();
const cloud = new CloudStorageManager(config);
Tier 3 — .env File (Standalone / npm Consumer)
If KĀDI is not installed at all, the library falls back to a standard .env file:
cp .env_example .env
# Edit .env with your credentials
# Dropbox
DROPBOX_CLIENT_ID=your-client-id
DROPBOX_CLIENT_SECRET=your-secret
DROPBOX_REFRESH_TOKEN=your-refresh-token
DROPBOX_ACCESS_TOKEN=your-access-token
# Google Drive (OAuth)
GOOGLE_CLIENT_ID=your-google-client-id
GOOGLE_CLIENT_SECRET=your-google-secret
GOOGLE_REFRESH_TOKEN=your-google-refresh-token
# Google Drive (Service Account — alternative to OAuth)
# GOOGLE_SERVICE_ACCOUNT_KEY='{"type":"service_account",...}'
# GOOGLE_SERVICE_ACCOUNT_KEY_PATH=/path/to/service-account.json
# Box
BOX_CLIENT_ID=your-box-client-id
BOX_CLIENT_SECRET=your-box-secret
BOX_ACCESS_TOKEN=your-box-access-token
BOX_REFRESH_TOKEN=your-box-refresh-token
Environment Variable Overrides
Regardless of which tier loads the credentials, individual environment variables always override:
export DROPBOX_ACCESS_TOKEN="override-token"
# This takes priority over vault or .env values
Token Refresh Persistence
When OAuth tokens are refreshed at runtime, the library persists them back using the same three-tier strategy:
- Native — writes to vault via secret-ability
- CLI — writes to vault via kadi secret set
- Neither — logs new tokens to console (manual save needed)
You can also supply a custom callback (see Library Usage).
🔐 OAuth Setup
Automated OAuth Setup (Recommended)
Dropbox OAuth Setup
npm run setup:dropbox
This will:
- Open your browser for authorization
- Handle the OAuth flow automatically
- Save refresh tokens to vault (if kadi installed) or .env
- Test the connection
Google Drive OAuth Setup
npm run setup:googledrive
This will:
- Guide you through Google Cloud Console setup
- Handle the OAuth flow automatically
- Save refresh tokens to vault (if kadi installed) or .env
- Test the connection
Manual OAuth Setup
If the automated setup fails:
# Dropbox manual token exchange
npm run setup:dropbox-manual
# Google Drive manual token exchange
npm run setup:googledrive-manual
Cloud Provider Setup Guides
Dropbox
- Go to Dropbox App Console
- Create a new app with "Scoped access" and "Full Dropbox" access
- Enable these permissions in the Permissions tab:
  - files.metadata.read - View file and folder metadata
  - files.metadata.write - Edit file and folder metadata
  - files.content.read - View file contents
  - files.content.write - Edit file contents
- Add redirect URI: http://localhost:8080/callback
- Automated setup: Run npm run setup:dropbox
- Manual setup: Generate access token and add to .env
Google Drive
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the Google Drive API
- Configure OAuth consent screen with external user type
- Create OAuth 2.0 credentials (Desktop application type)
- Add redirect URI: http://localhost:8080/callback
- Automated setup: Run npm run setup:googledrive
- Manual setup: Use OAuth playground to get refresh token
Box
- Go to Box Developer Console
- Create a new "Custom App" with "Standard OAuth 2.0"
- Configure OAuth settings and get client credentials
- Add redirect URI: http://localhost:8080/callback
- Use OAuth flow to obtain access and refresh tokens
- Add credentials to your .env file
💻 CLI Usage
Basic Commands
Test connections:
node index.js test # Test all configured services
node index.js test --service dropbox # Test specific service
Upload files:
node index.js upload --file document.pdf --service dropbox
node index.js upload --file archive.zip --service googledrive --directory /documents
Download files:
node index.js download --service dropbox --remote /uploads/document.pdf
node index.js download --service googledrive --remote /documents/archive.zip --local ./downloads
List files:
node index.js list --service dropbox # List files in default directory
node index.js list --service googledrive --directory /documents
Advanced Operations
File management:
# Copy files
node index.js copy --service dropbox --remote /file.pdf --destination /backup/file.pdf
# Rename files
node index.js rename --service googledrive --remote /old-name.pdf --name new-name.pdf
# Delete files
node index.js delete --service box --remote /old-file.pdf --yes
# Get file information
node index.js info --service dropbox --remote /uploads/document.pdf
Folder operations:
# Create folders
node index.js mkdir --service dropbox --remote /new-project-folder
# List folders
node index.js ls-folders --service googledrive --directory /projects
# Delete folders
node index.js rmdir --service box --remote /old-project --force --yes
Search files:
node index.js search --service googledrive --query "quarterly report"
node index.js search --service dropbox --query "*.pdf" --limit 50
Automated Scripts
# Run comprehensive tests
npm test
npm run test:all
# Clean up test files
npm run clean
# Setup all providers
npm run setup
📚 Library Usage
Basic Integration (npm Consumer)
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');
class MyApp {
constructor() {
this.cloudManager = null;
this.config = null;
}
async initialize() {
// ConfigManager auto-detects: vault (native/CLI) → .env → env vars
this.config = await ConfigManager.create();
this.cloudManager = new CloudStorageManager(this.config);
// Validate configuration
const validation = this.config.validate();
if (!validation.isValid) {
throw new Error(`Configuration errors: ${validation.errors.join(', ')}`);
}
}
async uploadFile(provider, localPath, remotePath) {
return await this.cloudManager.uploadFile(provider, localPath, remotePath);
}
async listFiles(provider, directory = '/') {
return await this.cloudManager.listFiles(provider, directory);
}
}
Integration with KĀDI Agent
const { KadiClient } = require('@kadi.build/core');
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');
const client = new KadiClient({ name: 'my-agent', version: '1.0.0' });
// Pass kadiClient for native vault access (fastest tier)
const config = await ConfigManager.create({ kadiClient: client });
const cloudManager = new CloudStorageManager(config);
console.log('Services:', config.getConfiguredServices());
Token Refresh Callback (Custom Persistence)
By default, when a provider refreshes its OAuth tokens the library writes the
new values back through the three-tier system (native vault → CLI vault →
console log). If you manage credentials externally (database, custom vault,
etc.) you can supply an onTokenRefresh callback as the second argument to
CloudStorageManager. When provided, the callback is called instead of
the default persistence. If the callback throws, the provider falls back to
vault/CLI automatically.
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');
const config = new ConfigManager();
await config.load();
const cloudManager = new CloudStorageManager(config, {
/**
* Called when any provider refreshes its OAuth tokens.
* @param {string} provider - 'dropbox' | 'googledrive' | 'box'
* @param {object} tokens
* @param {string} tokens.accessToken - New access token
* @param {string} [tokens.refreshToken] - New refresh token (if rotated)
* @param {number} [tokens.expiresAt] - Expiry time (epoch ms)
*/
onTokenRefresh: async (provider, tokens) => {
// Example: persist to a secret vault
await vault.set(`${provider}_access_token`, tokens.accessToken);
if (tokens.refreshToken) {
await vault.set(`${provider}_refresh_token`, tokens.refreshToken);
}
console.log(`Saved refreshed ${provider} tokens to vault`);
},
});
The callback receives the provider name ('dropbox', 'googledrive', or
'box') and a tokens object. The same interface works identically for all
three providers.
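Internally, the fallback-on-error behaviour described above might look roughly like this. This is an illustrative sketch under stated assumptions, not the package's actual code:

```javascript
// Illustrative: try the user callback first; run the built-in persistence
// chain only when no callback exists or the callback throws.
async function persistRefreshedTokens(provider, tokens, onTokenRefresh, defaultPersist) {
  if (onTokenRefresh) {
    try {
      await onTokenRefresh(provider, tokens);
      return 'callback';
    } catch (err) {
      // Custom persistence failed; fall through to the default chain.
    }
  }
  await defaultPersist(provider, tokens);
  return 'default';
}

persistRefreshedTokens(
  'dropbox',
  { accessToken: 'new-token' },
  async () => { throw new Error('custom vault is down'); },
  async () => { /* e.g. write via kadi secret set */ }
).then((used) => console.log(used)); // 'default'
```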
Advanced Integration
const { CloudStorageManager, ConfigManager } = require('@kadi.build/cloud-file-service');
const fs = require('fs').promises; // needed for fs.unlink in syncDocuments below
class DocumentManager {
async initialize() {
this.config = await ConfigManager.create();
this.cloudManager = new CloudStorageManager(this.config);
}
async backupDocument(document, metadata = {}) {
const timestamp = new Date().toISOString().split('T')[0];
const remotePath = `/backups/${timestamp}/${document.name}`;
try {
// Upload to primary storage
const result = await this.cloudManager.uploadFile(
'dropbox',
document.path,
remotePath
);
// Create backup copy on secondary storage
await this.cloudManager.uploadFile(
'googledrive',
document.path,
remotePath
);
return {
success: true,
primary: { service: 'dropbox', fileId: result.id },
backup: { service: 'googledrive', path: remotePath },
timestamp: new Date().toISOString()
};
} catch (error) {
throw new Error(`Document backup failed: ${error.message}`);
}
}
async syncDocuments(sourceProvider, targetProvider, directory = '/') {
const files = await this.cloudManager.listFiles(sourceProvider, directory);
const results = [];
for (const file of files) {
try {
// Download from source
const localPath = `./temp/${file.name}`;
await this.cloudManager.downloadFile(sourceProvider, file.path, localPath);
// Upload to target
await this.cloudManager.uploadFile(targetProvider, localPath, file.path);
// Clean up temp file
await fs.unlink(localPath);
results.push({ file: file.name, status: 'synced' });
} catch (error) {
results.push({ file: file.name, status: 'failed', error: error.message });
}
}
return results;
}
}
Batch Operations
// Upload multiple files
const uploadResults = await cloudManager.uploadMultipleFiles(
'dropbox',
['./file1.pdf', './file2.pdf', './file3.pdf'],
'/batch-upload'
);
// Download multiple files
const downloadResults = await cloudManager.downloadMultipleFiles(
'googledrive',
['/docs/file1.pdf', '/docs/file2.pdf'],
'./downloads'
);
// Sync directories
const syncResults = await cloudManager.syncDirectory(
'dropbox',
'./local-folder',
'/remote-folder',
{ dryRun: false, deleteRemote: false, overwrite: true }
);
KĀDI Ability Usage
This library powers the cloud-storage-ability — a KĀDI ability that exposes cloud file operations as 17 broker-callable tools. See the cloud-storage-ability for the full agent wrapper.
agent.json
The library ships with an agent.json that declares it as a KĀDI ability:
{
"name": "cloud-file-manager",
"version": "1.3.0",
"kind": "ability",
"abilities": {
"secret-ability": "^0.9.1"
}
}
How Credentials Flow
┌──────────────────────────────────────────────────────────┐
│ KĀDI Agent (e.g. cloud-storage-ability) │
│ │
│ KadiClient ──► ConfigManager({ kadiClient }) │
│ │ │
│ ▼ │
│ ┌── Tier 1: native secret-ability ───┐ │
│ │ client.loadNative('secret-ability')│ │
│ │ secrets.invoke('get', {vault,key}) │ │
│ └────────────────────────────────────┘ │
└──────────────────────────────────────────────────────────┘
┌──────────────────────────────────────────────────────────┐
│ Developer Machine (kadi installed, no kadiClient) │
│ │
│ ConfigManager.create() │
│ │ │
│ ▼ │
│ ┌── Tier 2: kadi CLI ───────────────┐ │
│ │ kadi secret get -v cloud <KEY> │ │
│ └───────────────────────────────────┘ │
└──────────────────────────────────────────────────────────┘
┌──────────────────────────────────────────────────────────┐
│ npm Consumer (no kadi installed) │
│ │
│ ConfigManager.create() │
│ │ │
│ ▼ │
│ ┌── Tier 3: .env file ──────────────┐ │
│ │ reads .env from CWD │ │
│ └───────────────────────────────────┘ │
└──────────────────────────────────────────────────────────┘
🧪 Testing
Automated Testing
Run comprehensive tests for all providers:
npm test # Interactive test selection
npm run test:all # Test all providers sequentially
npm run test:dropbox # Test Dropbox only
npm run test:googledrive # Test Google Drive only
npm run test:box          # Test Box only
Test Categories
The test suite includes:
- Connection & Authentication - Verify credentials and API access
- Basic File Operations - Upload, download, list, info
- Folder Operations - Create, delete, rename, list
- File Management - Copy, rename, delete operations
- Search & Query - File search functionality
- Large File Handling - Chunked upload/download for 2MB+ files
- Edge Cases & Error Handling - Invalid paths, missing files
- Performance & Stress - Batch operations, timing tests
Test Results
Tests generate detailed reports in ./test-results/ with:
- Pass/fail statistics
- Performance metrics
- Error details
- Timing information
Cleanup
Clean up test artifacts:
npm run clean # Remove test files and directories
📖 API Reference
CloudStorageManager
File Operations
- uploadFile(serviceName, localPath, remotePath) - Upload a file
- downloadFile(serviceName, remotePath, localPath) - Download a file
- getFileInfo(serviceName, remotePath) - Get file metadata
- listFiles(serviceName, remotePath, options) - List files in a directory
- deleteFile(serviceName, remotePath) - Delete a file
- renameFile(serviceName, remotePath, newName) - Rename a file
- copyFile(serviceName, sourcePath, destinationPath) - Copy a file
Folder Operations
- createFolder(serviceName, remotePath) - Create a folder
- listFolders(serviceName, remotePath) - List folders in a directory
- deleteFolder(serviceName, remotePath, recursive) - Delete a folder
- renameFolder(serviceName, remotePath, newName) - Rename a folder
- getFolderInfo(serviceName, remotePath) - Get folder metadata
Temporary Links
- getTemporaryLink(serviceName, remotePath) - Get a temporary signed download URL
Search & Utility
- searchFiles(serviceName, query, options) - Search for files
- testConnection(serviceName) - Test service connection
- getAvailableServices() - Get list of configured services
Batch Operations
- uploadMultipleFiles(serviceName, fileList, directory) - Upload multiple files
- downloadMultipleFiles(serviceName, fileList, directory) - Download multiple files
- syncDirectory(serviceName, localDir, remoteDir, options) - Sync directories
ConfigManager
- ConfigManager.create(options?) - Static factory: create + load in one call
- load() - Load configuration (vault → .env → env vars → defaults)
- get(key) - Get configuration value
- set(key, value) - Set configuration value
- validate() - Validate current configuration
- getConfiguredServices() - Get list of configured services
- getSummary() - Get configuration summary
- safeUpdate(updates) - Persist credential changes (vault → .env)
- persistProviderTokens(provider, tokens) - Persist refreshed OAuth tokens
- loadFromConfigYml() - Reload non-secret settings from config.yml
⚡ Performance
Provider Features Matrix
| Feature | Dropbox | Google Drive | Box |
|---------|---------|--------------|-----|
| File Upload/Download | ✅ | ✅ | ✅ |
| Large File Chunking | ✅ (150MB+) | ✅ (5MB+) | ✅ (20MB+) |
| Progress Tracking | ✅ | ✅ | ✅ |
| Checksum Verification | ✅ (SHA256) | ✅ (MD5) | ✅ (SHA1) |
| Folder Operations | ✅ | ✅ | ✅ |
| File Search | ✅ | ✅ | ✅ |
| OAuth Refresh | ✅ | ✅ | ✅ |
| Batch Operations | ✅ | ✅ | ✅ |
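The per-provider chunking thresholds in the matrix imply upload planning along these lines. The thresholds are copied from the table and the 8 MB chunk size mirrors the config example; the helper itself is illustrative, not the library's API:

```javascript
// Illustrative: decide whether to chunk a file and how many chunks to send.
const CHUNK_THRESHOLDS = {
  dropbox: 150 * 1024 * 1024,   // chunk files over 150 MB
  googledrive: 5 * 1024 * 1024, // chunk files over 5 MB
  box: 20 * 1024 * 1024,        // chunk files over 20 MB
};

function planUpload(provider, fileSizeBytes, chunkSize = 8 * 1024 * 1024) {
  const threshold = CHUNK_THRESHOLDS[provider];
  if (threshold === undefined) throw new Error(`Unknown provider: ${provider}`);
  if (fileSizeBytes <= threshold) return { chunked: false, chunks: 1 };
  return { chunked: true, chunks: Math.ceil(fileSizeBytes / chunkSize) };
}

console.log(planUpload('googledrive', 20 * 1024 * 1024)); // { chunked: true, chunks: 3 }
```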
Performance Optimizations
- Automatic Chunking: Large files split into optimal chunks per provider
- Memory Efficiency: Streaming uploads minimize memory usage
- Retry Logic: Automatic retry with exponential backoff
- Concurrent Operations: Parallel uploads/downloads where supported
- Token Management: Automatic refresh prevents authentication failures
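The retry-with-exponential-backoff behaviour listed above can be sketched as follows (an illustrative helper; the library's real implementation may differ):

```javascript
// Illustrative: retry an async operation with exponential backoff.
async function withRetry(operation, { maxAttempts = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await operation(attempt);
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts) break;
      const delay = baseDelayMs * 2 ** (attempt - 1); // 500, 1000, 2000, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Usage sketch: an operation that succeeds on the third attempt.
withRetry(async (attempt) => {
  if (attempt < 3) throw new Error('transient failure');
  return 'ok';
}, { baseDelayMs: 10 }).then((result) => console.log(result)); // 'ok'
```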
Best Practices
// Use batch operations for multiple files
const results = await cloudManager.uploadMultipleFiles(
'dropbox',
fileList,
'/uploads'
);
// Handle large files efficiently
const result = await cloudManager.uploadFile(
'googledrive',
'./large-file.zip', // Automatically uses chunked upload
'/backups/large-file.zip'
);
// Monitor progress for long operations
console.log('Upload completed:', result.name);
🔧 Error Handling
The service provides comprehensive error handling for:
- Authentication Errors: Invalid or expired tokens (automatic refresh)
- Network Issues: Connection timeouts and retries
- File Operations: Not found, permission denied, quota exceeded
- Rate Limiting: Automatic backoff and retry logic
- Validation Errors: Invalid parameters and configurations
Error Types
try {
await cloudManager.uploadFile('dropbox', './file.pdf', '/upload/file.pdf');
} catch (error) {
if (error.message.includes('quota_exceeded')) {
console.log('Storage quota exceeded');
} else if (error.message.includes('unauthorized')) {
console.log('Authentication failed - check credentials');
} else if (error.message.includes('not_found')) {
console.log('File or folder not found');
}
}
🤝 Contributing
We welcome contributions! Here's how to get started:
Development Setup
git clone <repository-url>
cd cloud-file-service
npm install
npm run setup
Adding New Providers
- Create provider file in src/providers/
- Implement all methods from spec.md
- Add configuration support in configManager.js
- Update cloudStorageManager.js
- Add comprehensive tests
- Update documentation
Testing Contributions
npm run lint # Check code style
npm run validate # Validate configuration
npm test # Run test suite
Guidelines
- Follow the Provider Integration Guide in Cloud Provider Integration Spec.md
- Maintain feature parity across providers
- Add comprehensive error handling
- Include performance optimizations
- Update documentation
📄 License
MIT License - see LICENSE file for details.
🆘 Support
- Issues: Create an issue
- Discussions: GitHub Discussions
- Documentation: Check this README and spec.md
Common Issues
"Provider not configured" error:
- Run npm run validate to check configuration
- Run npm run info to see configured services
- Check vault: kadi secret get -v cloud DROPBOX_CLIENT_ID (if kadi installed)
- Check the .env file has correct credentials (if no kadi)
OAuth setup fails:
- Ensure redirect URI is http://localhost:8080/callback
- Check firewall/antivirus isn't blocking port 8080
- Try manual setup: npm run setup:dropbox-manual
Large file upload issues:
- Check stable internet connection
- Monitor progress output for specific errors
- Verify sufficient storage quota in cloud provider
Cloud File Service Manager - Unified cloud storage for any application 🚀
