# rdash-sdk

v0.0.8

Official JavaScript SDK for rdash — chunked uploads, and more.
## Features
- ChunkUpload — parallel chunked file uploads with retries and progress tracking
## Install

```bash
npm install rdash-sdk
```

## Quick Start
### Browser

```js
import { ChunkedUploader } from "rdash-sdk";

const uploader = new ChunkedUploader({
  backendBaseUrl: "https://api.rdash.io/v1/core",
  authToken: "your_token",
});

const url = await uploader.upload(
  file, // File or Blob
  { path: "invoice-file" },
  (progress) => console.log(`${progress}%`)
);
```

### Node.js
```js
import { ChunkedUploader } from "rdash-sdk";
import { readFile } from "node:fs/promises";

const uploader = new ChunkedUploader({
  backendBaseUrl: "https://api.rdash.io/v1/core",
  authToken: "your_token",
});

const buffer = await readFile("./report.pdf");
const blob = new Blob([buffer], { type: "application/pdf" });
const url = await uploader.upload(blob, { path: "reports" });
```

## Subpath Import
You can also import features directly:

```js
import { ChunkedUploader } from "rdash-sdk/ChunkUpload";
```

## Config Options
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| backendBaseUrl | string | required | Base URL for the upload API |
| multipleUpload | number | 3 | Parallel chunk uploads (1-6) |
| authToken | string \| null | null | Bearer token for authentication |
| keepAlive | boolean | true | Use HTTP keep-alive |
| chunkTimeout | number | 30000 | Timeout per chunk in milliseconds |
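As a concrete example, here is a hypothetical options object that overrides every default from the table above (the specific values are illustrative, not recommendations):

```javascript
// Hypothetical configuration overriding every default from the table above.
const uploaderOptions = {
  backendBaseUrl: "https://api.rdash.io/v1/core", // required: upload API base URL
  multipleUpload: 4,       // upload up to 4 chunks in parallel (valid range 1-6)
  authToken: "your_token", // sent as a Bearer token
  keepAlive: true,         // reuse HTTP connections across chunk requests
  chunkTimeout: 60_000,    // give slow links a full minute per chunk
};
```

Pass this object to `new ChunkedUploader(uploaderOptions)` exactly as in Quick Start.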
## API

### upload(fileOrUrl, options, onProgress?)
Main method. Accepts a File, Blob, or URL string. Downloads URL inputs automatically, splits the file into chunks, uploads in parallel, and returns the final upload path.
```js
const path = await uploader.upload(file, { path: "documents" }, (pct) => {});
```

### startUpload(file, options)
Initiates a chunked upload session with the backend. Returns upload metadata including part URLs.
```js
const uploadInfo = await uploader.startUpload(file, { path: "documents" });
```

### uploadChunks(file, uploadInfo, onProgress?)
Uploads all chunks in parallel with automatic retries (up to 3 attempts per chunk with exponential backoff). Returns an array of part ETags.
```js
const etags = await uploader.uploadChunks(file, uploadInfo, (pct) => {});
```

### completeUpload(upload_id, object_key, etags)
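The retry policy described above (up to 3 attempts per chunk with exponential backoff) can be sketched as a small standalone helper. This is an illustration of the documented behavior, not the SDK's internal code:

```javascript
// Retry an async operation up to `attempts` times, doubling the delay
// after each failure (e.g. 500ms, 1000ms, 2000ms). Illustrative sketch only.
async function withRetries(fn, attempts = 3, baseDelayMs = 500) {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === attempts) throw err; // out of retries: surface the error
      const delay = baseDelayMs * 2 ** (attempt - 1); // exponential backoff
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```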
Finalizes the multipart upload after all chunks are uploaded.
```js
const result = await uploader.completeUpload(uploadId, objectKey, etags);
```

## Progress Tracking
The onProgress callback receives a number from 0-100 representing the upload percentage. Updates are throttled to ~20 reports per upload to avoid excessive callbacks.
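One simple way to get this kind of throttling (shown here as an illustrative sketch, not the SDK's actual implementation) is to fire the callback only when the integer percentage advances by a fixed step:

```javascript
// Wrap a progress callback so it fires only when the percentage advances
// by at least `step`. With step = 5 this yields at most ~20 reports.
function makeThrottledProgress(onProgress, step = 5) {
  let lastReported = 0;
  return (completedChunks, totalChunks) => {
    const pct = Math.floor((completedChunks / totalChunks) * 100);
    if (pct >= lastReported + step) {
      lastReported = pct;
      onProgress(pct);
    }
  };
}
```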
```js
await uploader.upload(file, { path: "data" }, (progress) => {
  progressBar.style.width = `${progress}%`;
  progressLabel.textContent = `${progress}%`;
});
```

## Node.js Compatibility
- Node 18+: Supported via `Blob`. Pass a `Blob` constructed from a `Buffer`.
- Node 20+: Full support, including `File` objects and URL-to-file downloads.
This library relies on the Fetch API, which is available globally in Node 18+; no polyfills are needed.
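A quick way to confirm your runtime provides these globals (assuming plain Node, no bundler):

```javascript
// Check for the globals the SDK relies on. Blob and fetch are global
// in Node 18+; File is global only in Node 20+.
const hasBlob = typeof Blob === "function";
const hasFetch = typeof fetch === "function";
const hasFile = typeof File === "function"; // false on Node 18/19

console.log({ hasBlob, hasFetch, hasFile });
```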
