cloudku-uploader v3.0.0

Blazing-fast, zero-dependency uploader for CloudKu. Supports auto-conversion, chunked uploads, and TypeScript. Easily upload images, videos, audio, and documents via Node.js.

cloudku-uploader

A powerful file-uploader client for the CloudKu image hosting service, with automatic chunked uploads, stream support, and load balancing across multiple endpoints.

✨ Features

  • 🚀 Smart Upload Strategy - Automatically switches between single and chunked upload based on file size (100MB threshold)
  • 📦 Chunked Upload Support - Handles large files by splitting into 8MB chunks with UUID-based tracking
  • 🌊 Stream Upload - Direct stream-to-upload capability for memory-efficient processing
  • 🔄 Built-in Load Balancing - Random endpoint selection across cloudkuimages.guru and cloudkuimages-guru.us.itpanel.app
  • 📤 Batch Upload - Upload multiple files concurrently with Promise.allSettled for resilient batch operations
  • 🎯 Simple & Clean API - Minimal surface area with three core methods
  • 📦 Dual Module Support - Ships with both ESM and CommonJS builds
  • 🔒 Type-safe - Full TypeScript definitions included
  • Zero Dependencies - Pure JavaScript implementation using native Web APIs

📦 Installation

npm install cloudku-uploader
yarn add cloudku-uploader
pnpm add cloudku-uploader

🚀 Usage

ESM (ES Modules)

import cloudku from 'cloudku-uploader'

const buffer = await fetch('image.jpg').then(r => r.arrayBuffer())
const result = await cloudku.uploadFile(buffer, 'image.jpg')

console.log(result.url)

CommonJS

const cloudku = require('cloudku-uploader')

const fs = require('fs').promises

async function upload() {
  const buffer = await fs.readFile('image.jpg')
  const result = await cloudku.uploadFile(buffer, 'image.jpg')
  console.log(result.url)
}

upload()

Browser - File Input

import cloudku from 'cloudku-uploader'

document.querySelector('#fileInput').addEventListener('change', async (e) => {
  const file = e.target.files[0]
  const buffer = await file.arrayBuffer()
  
  const result = await cloudku.uploadFile(buffer, file.name)
  console.log('Uploaded to:', result.url)
})

Node.js - File System

import cloudku from 'cloudku-uploader'
import { readFile } from 'fs/promises'

const buffer = await readFile('./photo.jpg')

// Slice so the ArrayBuffer covers exactly this Buffer's bytes
// (Buffer#buffer can point at Node's shared allocation pool)
const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
const result = await cloudku.uploadFile(arrayBuffer, 'photo.jpg')

console.log('File URL:', result.url)

Stream Upload (Node.js)

import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'
import { pipeline } from 'stream/promises'

const stream = createReadStream('./large-video.mp4')

// Collect the stream into one Buffer first; the whole file is held
// in memory before the upload starts, so this suits files that fit in RAM
const chunks = []
for await (const chunk of stream) {
  chunks.push(chunk)
}
const buffer = Buffer.concat(chunks)

const result = await cloudku.uploadFile(buffer.buffer, 'video.mp4')
console.log('Stream uploaded:', result.url)

Stream Upload with Progress Tracking

import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'
import { stat } from 'fs/promises'

const filePath = './movie.mp4'
const fileStats = await stat(filePath)
const totalSize = fileStats.size

const stream = createReadStream(filePath)

// Progress here tracks reading the file from disk; the upload itself
// is sent in a single call once the buffer has been assembled
let readSize = 0
const chunks = []

for await (const chunk of stream) {
  chunks.push(chunk)
  readSize += chunk.length
  const progress = (readSize / totalSize * 100).toFixed(2)
  console.log(`Progress: ${progress}%`)
}

const buffer = Buffer.concat(chunks)
const result = await cloudku.uploadFile(buffer.buffer, 'movie.mp4')

console.log('Complete:', result.url)

Large File Upload with Custom Chunk Size

import cloudku from 'cloudku-uploader'

const buffer = await fetch('4k-video.mp4').then(r => r.arrayBuffer())

const result = await cloudku.uploadLarge(
  buffer,
  '4k-video.mp4',
  16 * 1024 * 1024
)

console.log('Large file URL:', result.url)

Batch Upload with Status Tracking

import cloudku from 'cloudku-uploader'

const files = [
  { buffer: buffer1, name: 'photo1.jpg' },
  { buffer: buffer2, name: 'photo2.png' },
  { buffer: buffer3, name: 'document.pdf' }
]

const results = await cloudku.uploadBatch(files)

const successful = results.filter(r => r.status === 'fulfilled')
const failed = results.filter(r => r.status === 'rejected')

console.log(`✓ ${successful.length} uploaded successfully`)
console.log(`✗ ${failed.length} failed`)

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`[${index + 1}] ${files[index].name}: ${result.value.url}`)
  } else {
    console.error(`[${index + 1}] ${files[index].name}: ${result.reason.message}`)
  }
})

📚 API Reference

uploadFile(buffer, name?)

Main upload method with automatic strategy selection based on file size.

Parameters:

  • buffer {ArrayBuffer} - File content as ArrayBuffer (required)
  • name {string} - Filename with extension (optional, default: 'file.bin')

Returns: Promise<UploadResult>

Behavior:

  • Files ≤ 100MB: Uses single POST request
  • Files > 100MB: Automatically switches to chunked upload

Example:

const buffer = await file.arrayBuffer()
const result = await cloudku.uploadFile(buffer, 'photo.jpg')

Response Schema:

{
  status: 'success',
  url: 'https://cloudkuimages.guru/files/abc123.jpg',
  filename: 'photo.jpg',
  size: 2048576
}

uploadLarge(buffer, name?, chunkSize?)

Explicit chunked upload for large files, with control over the chunk size.

Parameters:

  • buffer {ArrayBuffer} - File content as ArrayBuffer (required)
  • name {string} - Filename with extension (optional, default: 'file.bin')
  • chunkSize {number} - Chunk size in bytes (optional, default: 8388608 = 8MB)

Returns: Promise<UploadResult>

Implementation Details:

  1. Generates UUID v4 as fileId for chunk tracking
  2. Splits buffer into chunks of specified size
  3. Uploads each chunk with metadata:
    • chunk: Current chunk index (0-based)
    • chunks: Total number of chunks
    • filename: Original filename
    • fileId: UUID for tracking
    • size: Total file size in bytes
  4. Sends finalization request with chunked=1&finalize=1 query params
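
The steps above amount to a simple loop. The following is an illustrative sketch only, assuming hypothetical postChunk and finalize helpers for the HTTP calls; the package performs all of this internally:

// Illustrative sketch of the chunking protocol described above.
// postChunk() and finalize() are hypothetical helpers, not package API.
import { randomUUID } from 'crypto'

async function chunkedUpload(buffer, filename, chunkSize = 8 * 1024 * 1024) {
  const fileId = randomUUID()                     // step 1: UUID for tracking
  const chunks = Math.ceil(buffer.byteLength / chunkSize)

  for (let i = 0; i < chunks; i++) {              // step 2: split into chunks
    const slice = buffer.slice(i * chunkSize, (i + 1) * chunkSize)
    const form = new FormData()                   // step 3: chunk + metadata
    form.append('file', new Blob([slice]))
    form.append('chunk', String(i))               // 0-based index
    form.append('chunks', String(chunks))
    form.append('filename', filename)
    form.append('fileId', fileId)
    form.append('size', String(buffer.byteLength))
    await postChunk(form)
  }

  return finalize({ fileId, filename, chunks })   // step 4: ?chunked=1&finalize=1
}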

Example:

const result = await cloudku.uploadLarge(
  buffer,
  'movie.mkv',
  10 * 1024 * 1024
)

Chunk Upload Request:

FormData {
  file: Blob(chunk),
  chunk: 0,
  chunks: 12,
  filename: 'movie.mkv',
  fileId: '550e8400-e29b-41d4-a716-446655440000',
  size: 104857600
}

Finalization Request:

POST /upload.php?chunked=1&finalize=1
Content-Type: application/json

{
  "fileId": "550e8400-e29b-41d4-a716-446655440000",
  "filename": "movie.mkv",
  "chunks": 12
}

uploadBatch(files)

Upload multiple files concurrently with individual error handling.

Parameters:

  • files {Array} - Array of file objects (required)

FileObject Schema:

{
  buffer: ArrayBuffer,
  name: string
}

Returns: Promise<Array<PromiseSettledResult<UploadResult>>>

Example:

const results = await cloudku.uploadBatch([
  { buffer: buffer1, name: 'image1.jpg' },
  { buffer: buffer2, name: 'image2.png' },
  { buffer: buffer3, name: 'video.mp4' }
])

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`Success: ${result.value.url}`)
  } else {
    console.error(`Failed: ${result.reason}`)
  }
})

🔧 Advanced Usage

Error Handling

try {
  const result = await cloudku.uploadFile(buffer, 'image.jpg')
  
  if (result.status === 'error') {
    throw new Error(result.message)
  }
  
  console.log('Uploaded:', result.url)
} catch (error) {
  console.error('Upload failed:', error.message)
}

Retry Logic

async function uploadWithRetry(buffer, name, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      const result = await cloudku.uploadFile(buffer, name)
      return result
    } catch (error) {
      if (i === maxRetries - 1) throw error
      await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)))
    }
  }
}
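
For example, retrying up to five times with a linearly growing delay between attempts:

const result = await uploadWithRetry(buffer, 'photo.jpg', 5)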

Custom Chunk Size Based on Connection

function getOptimalChunkSize(effectiveType) {
  // Valid Network Information API effectiveType values:
  // 'slow-2g' | '2g' | '3g' | '4g'
  const sizes = {
    '4g': 16 * 1024 * 1024,
    '3g': 4 * 1024 * 1024,
    '2g': 2 * 1024 * 1024,
    'slow-2g': 1 * 1024 * 1024
  }
  return sizes[effectiveType] || 8 * 1024 * 1024
}

// navigator.connection is browser-only and not universally supported;
// fall back to the 8MB default when it is unavailable
const chunkSize = getOptimalChunkSize(navigator.connection?.effectiveType)

const result = await cloudku.uploadLarge(buffer, 'file.zip', chunkSize)

Parallel Batch Upload with Concurrency Limit

async function uploadBatchWithLimit(files, limit = 3) {
  const results = []
  
  for (let i = 0; i < files.length; i += limit) {
    const batch = files.slice(i, i + limit)
    const batchResults = await cloudku.uploadBatch(batch)
    results.push(...batchResults)
  }
  
  return results
}

⚙️ Technical Details

Endpoint Selection

The uploader uses random selection between two endpoints:

  • https://cloudkuimages.guru
  • https://cloudkuimages-guru.us.itpanel.app

This provides basic load balancing and failover capability.
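
Conceptually the selection is a one-liner. The sketch below illustrates the idea (the flow diagram that follows refers to it as pickBase()); it is not the library's actual source:

// Sketch of random endpoint selection, as described above
const BASES = [
  'https://cloudkuimages.guru',
  'https://cloudkuimages-guru.us.itpanel.app'
]

function pickBase() {
  return BASES[Math.floor(Math.random() * BASES.length)]
}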

Upload Flow

Small File (≤ 100MB):

Client → pickBase() → POST /upload.php → Response

Large File (> 100MB):

Client → Generate UUID
      → Split into chunks
      → For each chunk:
          → POST /upload.php (with metadata)
      → POST /upload.php?chunked=1&finalize=1
      → Response

Headers

All requests include:

{
  'User-Agent': 'cloudku-uploader/5.0',
  'Accept': 'application/json'
}
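
As a rough sketch, a raw request to the documented endpoint would attach them like this (Node.js lets fetch override User-Agent; browsers do not). pickBase() is the helper sketched under Endpoint Selection, and formData is assumed to be a prepared FormData instance:

// Hedged sketch of a raw POST carrying the documented headers
const response = await fetch(`${pickBase()}/upload.php`, {
  method: 'POST',
  headers: {
    'User-Agent': 'cloudku-uploader/5.0',
    'Accept': 'application/json'
  },
  body: formData // assumed: assembled as in the chunk example above
})
const json = await response.json()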

File Size Limits

  • Single upload: Recommended up to 100MB
  • Chunked upload: No hard limit (tested up to 5GB)
  • Default chunk size: 8MB (8,388,608 bytes)
  • Recommended chunk range: 4MB - 16MB
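
At a given chunk size, the number of chunk requests is just the ceiling of file size over chunk size. For example:

// A 1GB file at the default 8MB chunk size produces 128 chunk requests
const fileSize = 1024 * 1024 * 1024           // 1 GB
const chunkSize = 8 * 1024 * 1024             // 8 MB (default)
console.log(Math.ceil(fileSize / chunkSize))  // 128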

🌐 Environment Support

Browser

  • ✅ Chrome 90+
  • ✅ Firefox 88+
  • ✅ Safari 14+
  • ✅ Edge 90+

Node.js

  • ✅ Node.js 14.x
  • ✅ Node.js 16.x
  • ✅ Node.js 18.x
  • ✅ Node.js 20.x

Frameworks

  • ✅ React / Next.js
  • ✅ Vue / Nuxt
  • ✅ Angular
  • ✅ Svelte / SvelteKit

Module Systems

  • ✅ ESM (ES Modules)
  • ✅ CommonJS
  • ✅ UMD (via bundlers)

🛠️ Module Formats

ESM Import

import cloudku from 'cloudku-uploader'

CommonJS Require

const cloudku = require('cloudku-uploader')

TypeScript

import cloudku from 'cloudku-uploader'
import type { UploadResult, FileObject } from 'cloudku-uploader'

const result: UploadResult = await cloudku.uploadFile(buffer, 'file.jpg')

📝 Type Definitions

interface UploadResult {
  status: 'success' | 'error'
  url?: string
  filename?: string
  size?: number
  message?: string
}

interface FileObject {
  buffer: ArrayBuffer
  name: string
}

interface CloudkuUploader {
  uploadFile(buffer: ArrayBuffer, name?: string): Promise<UploadResult>
  uploadLarge(buffer: ArrayBuffer, name?: string, chunkSize?: number): Promise<UploadResult>
  uploadBatch(files: FileObject[]): Promise<PromiseSettledResult<UploadResult>[]>
}
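
Since url, filename, size, and message are all optional on UploadResult, check the status field before reading them; a minimal sketch:

// Check the status discriminant before reading the optional fields
const result = await cloudku.uploadFile(buffer, 'file.jpg')
if (result.status === 'success') {
  console.log(result.url, result.size)
} else {
  console.error(result.message)
}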

🤝 Contributing

Contributions are welcome! Please follow these guidelines:

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Commit your changes: git commit -m 'Add amazing feature'
  4. Push to the branch: git push origin feature/amazing-feature
  5. Open a Pull Request

Please ensure:

  • Code follows existing style
  • All tests pass
  • Documentation is updated
  • Commit messages are clear

📄 License

MIT License - see LICENSE file for details

💬 Support

  • 📫 Open an issue on GitHub
  • 💡 Check existing issues for solutions
  • 📖 Read the full documentation

⭐ Acknowledgments

Special thanks to the CloudKu team for providing the hosting infrastructure.


Made with ❤️ for the JavaScript community