
@alexwhitmore/storage-helpers


Upload files from external URLs directly to Supabase Storage — no double bandwidth, no client-side buffering.

The Problem

When you need to upload a file from an external URL (like AI-generated images, Dropbox files, or webhook payloads), the typical flow looks like this:

External URL → Your App (download) → Your App (upload) → Supabase Storage
                    ↓                      ↓
              Uses bandwidth          Uses bandwidth AGAIN
              Uses memory             Uses memory AGAIN

This means:

  • 🔴 2x bandwidth usage (you download, then upload)
  • 🔴 High latency (two network hops)
  • 🔴 Memory pressure (file buffered in client memory)
  • 🔴 Size limitations (serverless limits, browser constraints)

The Solution

External URL → Edge Function → Supabase Storage
                    ↓
            Single server-side transfer

This means:

  • 1x bandwidth (server-side only)
  • Lower latency (single hop from edge)
  • No client memory (streaming on edge)
  • Larger files (up to 50MB)

Quick Start

1. Install the package

npm install @alexwhitmore/storage-helpers

2. Deploy the Edge Function

Copy the edge function to your Supabase project:

# Create the function directory
mkdir -p supabase/functions/upload-from-url
mkdir -p supabase/functions/_shared

# Copy the files (or use the CLI - see below)

Or use the CLI:

npx @alexwhitmore/storage-helpers-cli init

Then deploy:

supabase functions deploy upload-from-url

3. Use in your app

import { createClient } from '@supabase/supabase-js'
import { createStorageHelpers } from '@alexwhitmore/storage-helpers'

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!)

const helpers = createStorageHelpers(supabase)

// Upload a single file
const result = await helpers.uploadFromUrl({
  bucket: 'images',
  path: 'ai-generated/image.png',
  sourceUrl: 'https://api.openai.com/v1/images/abc123.png',
})

console.log('Uploaded to:', result.path)
// => 'ai-generated/image.png'

Batch uploads

Upload multiple files in parallel with controlled concurrency:

const results = await helpers.batchUploadFromUrl({
  bucket: 'images',
  jobs: [
    { sourceUrl: 'https://example.com/1.png', path: 'batch/1.png' },
    { sourceUrl: 'https://example.com/2.png', path: 'batch/2.png' },
    { sourceUrl: 'https://example.com/3.png', path: 'batch/3.png' },
  ],
  concurrency: 3, // Process 3 at a time
  upsert: true, // Overwrite existing files
})

const successful = results.filter((r) => r.success)
const failed = results.filter((r) => !r.success)

console.log(`Uploaded ${successful.length}/${results.length} files`)

API Reference

createStorageHelpers(supabase, config?)

Creates a new StorageHelpers instance.

| Parameter           | Type           | Required | Description                                     |
| ------------------- | -------------- | -------- | ----------------------------------------------- |
| supabase            | SupabaseClient | ✅       | Supabase client instance                        |
| config.functionName | string         | ❌       | Edge function name (default: "upload-from-url") |

helpers.uploadFromUrl(options)

Upload a file from an external URL.

| Option       | Type    | Required | Default | Description                |
| ------------ | ------- | -------- | ------- | -------------------------- |
| bucket       | string  | ✅       | -       | Storage bucket name        |
| path         | string  | ✅       | -       | Destination path in bucket |
| sourceUrl    | string  | ✅       | -       | URL to fetch file from     |
| contentType  | string  | ❌       | auto    | Override content type      |
| cacheControl | string  | ❌       | "3600"  | Cache-Control header       |
| upsert       | boolean | ❌       | false   | Overwrite existing files   |

Returns: Promise<{ path: string; id: string; fullPath: string }>
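For objects in a public bucket, the returned path maps onto Supabase's standard public-object URL layout (`/storage/v1/object/public/<bucket>/<path>`). A minimal sketch of that mapping, assuming that layout (supabase-js's `storage.from(bucket).getPublicUrl(path)` does the same and is the safer choice):

```typescript
// Sketch (not part of this package): derive a public URL for an uploaded
// object from the path returned by uploadFromUrl. Assumes Supabase's
// /storage/v1/object/public/ URL layout.
function publicObjectUrl(projectUrl: string, bucket: string, path: string): string {
  // Strip any trailing slash from the project URL before joining segments
  return `${projectUrl.replace(/\/$/, '')}/storage/v1/object/public/${bucket}/${path}`
}
```

For example, `publicObjectUrl('https://xyz.supabase.co', 'images', result.path)` yields the directly fetchable URL for the object uploaded in the Quick Start example.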

helpers.batchUploadFromUrl(options)

Upload multiple files in parallel.

| Option       | Type             | Required | Default | Description                   |
| ------------ | ---------------- | -------- | ------- | ----------------------------- |
| bucket       | string           | ✅       | -       | Storage bucket name           |
| jobs         | BatchUploadJob[] | ✅       | -       | Array of upload jobs (max 20) |
| concurrency  | number           | ❌       | 3       | Max parallel uploads          |
| cacheControl | string           | ❌       | "3600"  | Cache-Control for all files   |
| upsert       | boolean          | ❌       | false   | Overwrite existing files      |

Returns: Promise<BatchUploadResult[]>

Security

This package includes robust SSRF (Server-Side Request Forgery) protection:

| Protection                 | Description                                                 |
| -------------------------- | ----------------------------------------------------------- |
| 🛡️ Private IP blocking     | Blocks 10.x, 172.16.x, 192.168.x, 127.x, etc.               |
| 🛡️ Cloud metadata blocking | Blocks 169.254.169.254 (AWS/GCP/Azure metadata)             |
| 🛡️ DNS validation          | Resolves hostnames and checks resulting IPs                 |
| 🛡️ Redirect blocking       | Disables HTTP redirects (prevents redirect-based bypass)    |
| 🛡️ Protocol restriction    | Only HTTP/HTTPS allowed (no file://, gopher://, etc.)       |
| 🛡️ Port blocking           | Blocks common internal service ports (22, 3306, 5432, etc.) |
| 🛡️ Encoding protection     | Rejects octal IP notation and URL encoding tricks           |
| 🛡️ Authentication required | Requires valid Supabase JWT                                 |

See SECURITY.md for complete details.
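To make the private-IP blocking concrete, here is a sketch of the kind of dotted-quad check such protection performs. This is illustrative only, not the package's actual implementation; the real protection also covers DNS resolution, redirects, ports, and encoding tricks, as the table above notes.

```typescript
// Illustrative SSRF guard: returns true for IPv4 addresses in private,
// loopback, or link-local ranges (including the 169.254.169.254 cloud
// metadata endpoint), which should never be fetched on a user's behalf.
function isBlockedIpv4(ip: string): boolean {
  const parts = ip.split('.').map(Number)
  if (parts.length !== 4 || parts.some((n) => !Number.isInteger(n) || n < 0 || n > 255)) {
    return true // not a plain dotted quad: reject rather than guess
  }
  const [a, b] = parts
  return (
    a === 10 ||                          // 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    a === 127 ||                         // 127.0.0.0/8 loopback
    (a === 169 && b === 254) ||          // 169.254.0.0/16 link-local / metadata
    a === 0                              // 0.0.0.0/8
  )
}
```

A real check must run on the IPs a hostname resolves to (and again on every redirect target), not just on literal addresses, or it can be bypassed with an attacker-controlled DNS record.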

Error Handling

import { createStorageHelpers, SsrfError, UploadError, ValidationError } from '@alexwhitmore/storage-helpers'

try {
  await helpers.uploadFromUrl({
    bucket: 'images',
    path: 'test.png',
    sourceUrl: 'https://example.com/image.png',
  })
} catch (error) {
  if (error instanceof SsrfError) {
    // URL was blocked for security reasons
    console.error('Security blocked:', error.message)
  } else if (error instanceof UploadError) {
    // Upload failed
    console.error('Upload failed:', error.code, error.message)
    if (error.isRetryable()) {
      // Retry the upload
    }
  } else if (error instanceof ValidationError) {
    // Invalid input
    console.error('Invalid input:', error.field, error.message)
  }
}
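Since UploadError exposes `isRetryable()`, a generic retry wrapper pairs naturally with the catch block above. The sketch below is not part of the package; it retries any async operation with exponential backoff when a caller-supplied predicate (e.g. `(err) => err instanceof UploadError && err.isRetryable()`) says the failure is transient.

```typescript
// Generic retry helper (sketch, not part of this package): re-invokes fn
// with exponential backoff, but only while shouldRetry approves the error.
async function withRetry<T>(
  fn: () => Promise<T>,
  shouldRetry: (err: unknown) => boolean,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      // Give up on the last attempt or on non-retryable errors
      if (attempt === maxAttempts || !shouldRetry(err)) throw err
      // Backoff doubles each attempt: 500ms, 1000ms, 2000ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1)))
    }
  }
  throw lastError
}
```

Usage with the helpers above might look like `withRetry(() => helpers.uploadFromUrl(opts), (e) => e instanceof UploadError && e.isRetryable())`.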

Use Cases

  • 📸 AI Image Generation: Upload DALL-E/Midjourney/Stable Diffusion images directly
  • 📁 Cloud Storage Integration: Transfer files from Dropbox, Google Drive, etc.
  • 🔗 Webhook Processing: Store files from incoming webhooks
  • 🖼️ Image Aggregation: Collect images from multiple sources
  • 📦 Asset Migration: Bulk transfer assets from other services

Storage Bucket Setup

Your storage bucket needs RLS policies to allow uploads. Here are common configurations:

Public bucket (anyone can read)

-- Create bucket
INSERT INTO storage.buckets (id, name, public)
VALUES ('my-bucket', 'my-bucket', true);

-- Allow authenticated users to upload
CREATE POLICY "Authenticated users can upload"
ON storage.objects FOR INSERT
TO authenticated
WITH CHECK (bucket_id = 'my-bucket');

-- Allow public read access
CREATE POLICY "Public read access"
ON storage.objects FOR SELECT
TO public
USING (bucket_id = 'my-bucket');

Private bucket (user-specific folders)

-- Create private bucket
INSERT INTO storage.buckets (id, name, public)
VALUES ('user-files', 'user-files', false);

-- Users can only upload to their own folder
CREATE POLICY "Users upload to own folder"
ON storage.objects FOR INSERT
TO authenticated
WITH CHECK (
  bucket_id = 'user-files' AND
  (storage.foldername(name))[1] = auth.uid()::text
);

-- Users can only read their own files
CREATE POLICY "Users read own files"
ON storage.objects FOR SELECT
TO authenticated
USING (
  bucket_id = 'user-files' AND
  (storage.foldername(name))[1] = auth.uid()::text
);
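With the private-bucket policies above, an upload is rejected unless the first folder segment of the destination path equals the authenticated user's id. Mirroring that SQL check client-side (a sketch, not part of the package) lets you fail fast with a clear error instead of a generic storage RLS rejection:

```typescript
// Client-side mirror of the "Users upload to own folder" policy:
// (storage.foldername(name))[1] = auth.uid()::text
// The path must contain at least one folder, and that first folder
// must be the user's id.
function pathAllowedForUser(path: string, userId: string): boolean {
  const segments = path.split('/')
  return segments.length > 1 && segments[0] === userId
}
```

For example, check `pathAllowedForUser(path, user.id)` before calling `helpers.uploadFromUrl({ bucket: 'user-files', path, sourceUrl })`, where `user` comes from `supabase.auth.getUser()`.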

Requirements

  • Node.js 18+
  • Supabase project with Edge Functions enabled
  • @supabase/supabase-js v2.0+

Contributing

Contributions are welcome! Please read CONTRIBUTING.md first.

License

MIT © Alex Whitmore