
rdash-sdk

v0.0.8


Official JavaScript SDK for rdash — chunked uploads and more


Features

  • ChunkUpload — parallel chunked file uploads with retries and progress tracking

Install

npm install rdash-sdk

Quick Start

Browser

import { ChunkedUploader } from "rdash-sdk";

const uploader = new ChunkedUploader({
  backendBaseUrl: "https://api.rdash.io/v1/core",
  authToken: "your_token",
});

const url = await uploader.upload(
  file, // File or Blob
  { path: "invoice-file" },
  (progress) => console.log(`${progress}%`)
);

Node.js

import { ChunkedUploader } from "rdash-sdk";
import { readFile } from "node:fs/promises";

const uploader = new ChunkedUploader({
  backendBaseUrl: "https://api.rdash.io/v1/core",
  authToken: "your_token",
});

const buffer = await readFile("./report.pdf");
const blob = new Blob([buffer], { type: "application/pdf" });

const url = await uploader.upload(blob, { path: "reports" });

Subpath Import

You can also import features directly:

import { ChunkedUploader } from "rdash-sdk/ChunkUpload";

Config Options

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| backendBaseUrl | string | required | Base URL for the upload API |
| multipleUpload | number | 3 | Parallel chunk uploads (1-6) |
| authToken | string \| null | null | Bearer token for authentication |
| keepAlive | boolean | true | Use HTTP keep-alive |
| chunkTimeout | number | 30000 | Timeout per chunk in milliseconds |
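Putting the table together, a fully specified configuration might look like this (values other than the defaults are illustrative):

```javascript
import { ChunkedUploader } from "rdash-sdk";

const uploader = new ChunkedUploader({
  backendBaseUrl: "https://api.rdash.io/v1/core", // required
  multipleUpload: 4,        // upload up to 4 chunks in parallel (1-6)
  authToken: "your_token",  // sent as a Bearer token; null disables auth
  keepAlive: true,          // reuse HTTP connections across chunks
  chunkTimeout: 60000,      // abort a single chunk after 60 seconds
});
```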

API

upload(fileOrUrl, options, onProgress?)

Main method. Accepts a File, Blob, or URL string. Downloads URL inputs automatically, splits the file into chunks, uploads in parallel, and returns the final upload path.

const path = await uploader.upload(file, { path: "documents" }, (pct) => {});
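The URL-handling step can be pictured roughly like this (a hypothetical helper, not the SDK's actual internals): a string input is downloaded and its body used as the Blob to chunk, while File and Blob inputs pass through unchanged.

```javascript
// Rough sketch of input normalization (hypothetical helper; upload() does
// something like this internally when given a URL string).
async function toBlob(fileOrUrl) {
  if (typeof fileOrUrl === "string") {
    // URL input: download it first, then chunk the response body.
    const res = await fetch(fileOrUrl);
    if (!res.ok) throw new Error(`Download failed: ${res.status}`);
    return await res.blob();
  }
  // File extends Blob, so both pass through unchanged.
  return fileOrUrl;
}
```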

startUpload(file, options)

Initiates a chunked upload session with the backend. Returns upload metadata including part URLs.

const uploadInfo = await uploader.startUpload(file, { path: "documents" });

uploadChunks(file, uploadInfo, onProgress?)

Uploads all chunks in parallel with automatic retries (up to 3 attempts per chunk with exponential backoff). Returns an array of part ETags.

const etags = await uploader.uploadChunks(file, uploadInfo, (pct) => {});
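The retry behavior described above (up to 3 attempts per chunk with exponential backoff) can be sketched like this; the helper name and delay values are assumptions, not the SDK's internals:

```javascript
// Sketch of retry with exponential backoff, matching the documented behavior
// of up to 3 attempts per chunk. Names and delays are illustrative.
async function withRetry(fn, attempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < attempts) {
        // Wait 500ms, then 1000ms, doubling between attempts.
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}
```

Each chunk upload would be wrapped in `withRetry`, so a transient network error on one chunk does not fail the whole upload.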

completeUpload(upload_id, object_key, etags)

Finalizes the multipart upload after all chunks are uploaded.

const result = await uploader.completeUpload(uploadId, objectKey, etags);

Progress Tracking

The onProgress callback receives a number from 0-100 representing the upload percentage. Updates are throttled to ~20 reports per upload to avoid excessive callbacks.

await uploader.upload(file, { path: "data" }, (progress) => {
  progressBar.style.width = `${progress}%`;
  progressLabel.textContent = `${progress}%`;
});
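One way to picture the throttling is step-based reporting: only fire the callback when the percentage crosses a new step. This is a hypothetical sketch; the SDK documents only the ~20-report behavior, not the mechanism.

```javascript
// Sketch of step-based progress throttling (hypothetical helper): report
// only when the percentage crosses a 5% step, so a full upload produces
// roughly 20 callbacks regardless of how many chunks complete.
function makeThrottledProgress(onProgress, steps = 20) {
  let lastStep = -1;
  return (uploadedBytes, totalBytes) => {
    const pct = Math.floor((uploadedBytes / totalBytes) * 100);
    const step = Math.floor(pct / (100 / steps));
    if (step > lastStep) {
      lastStep = step;
      onProgress(pct);
    }
  };
}
```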

Node.js Compatibility

  • Node 18+: Supported via Blob. Pass a Blob constructed from a Buffer.
  • Node 20+: Full support including File objects and URL-to-file downloads.

This library relies on the fetch API, which is available globally in Node 18+, so no polyfills are needed.