@alexwhitmore/storage-helpers
v1.0.1
@alexwhitmore/storage-helpers
Upload files from external URLs directly to Supabase Storage — no double bandwidth, no client-side buffering.
The Problem
When you need to upload a file from an external URL (like AI-generated images, Dropbox files, or webhook payloads), the typical flow looks like this:
External URL → Your App (download) → Your App (upload) → Supabase Storage
                     ↓                      ↓
               Uses bandwidth         Uses bandwidth AGAIN
               Uses memory            Uses memory AGAIN

This means:
- 🔴 2x bandwidth usage (you download, then upload)
- 🔴 High latency (two network hops)
- 🔴 Memory pressure (file buffered in client memory)
- 🔴 Size limitations (serverless limits, browser constraints)
The Solution
External URL → Edge Function → Supabase Storage
                    ↓
        Single server-side transfer

- ✅ 1x bandwidth (server-side only)
- ✅ Lower latency (single hop from edge)
- ✅ No client memory (streaming on edge)
- ✅ Larger files (up to 50MB)
Quick Start
1. Install the package
npm install @alexwhitmore/storage-helpers
2. Deploy the Edge Function
Copy the edge function to your Supabase project:
# Create the function directory
mkdir -p supabase/functions/upload-from-url
mkdir -p supabase/functions/_shared
# Copy the files (or use the CLI - see below)
Or use the CLI:
npx @alexwhitmore/storage-helpers-cli init
Then deploy:
supabase functions deploy upload-from-url
3. Use in your app
import { createClient } from '@supabase/supabase-js'
import { createStorageHelpers } from '@alexwhitmore/storage-helpers'
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!)
const helpers = createStorageHelpers(supabase)
// Upload a single file
const result = await helpers.uploadFromUrl({
bucket: 'images',
path: 'ai-generated/image.png',
sourceUrl: 'https://api.openai.com/v1/images/abc123.png',
})
console.log('Uploaded to:', result.path)
// => 'ai-generated/image.png'
Batch uploads
Upload multiple files in parallel with controlled concurrency:
const results = await helpers.batchUploadFromUrl({
bucket: 'images',
jobs: [
{ sourceUrl: 'https://example.com/1.png', path: 'batch/1.png' },
{ sourceUrl: 'https://example.com/2.png', path: 'batch/2.png' },
{ sourceUrl: 'https://example.com/3.png', path: 'batch/3.png' },
],
concurrency: 3, // Process 3 at a time
upsert: true, // Overwrite existing files
})
const successful = results.filter((r) => r.success)
const failed = results.filter((r) => !r.success)
console.log(`Uploaded ${successful.length}/${results.length} files`)
API Reference
createStorageHelpers(supabase, config?)
Creates a new StorageHelpers instance.
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| supabase | SupabaseClient | ✅ | Supabase client instance |
| config.functionName | string | ❌ | Edge function name (default: "upload-from-url") |
helpers.uploadFromUrl(options)
Upload a file from an external URL.
| Option | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| bucket | string | ✅ | - | Storage bucket name |
| path | string | ✅ | - | Destination path in bucket |
| sourceUrl | string | ✅ | - | URL to fetch file from |
| contentType | string | ❌ | auto | Override content type |
| cacheControl | string | ❌ | "3600" | Cache-Control header |
| upsert | boolean | ❌ | false | Overwrite existing files |
Returns: Promise<{ path: string; id: string; fullPath: string }>
helpers.batchUploadFromUrl(options)
Upload multiple files in parallel.
| Option | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| bucket | string | ✅ | - | Storage bucket name |
| jobs | BatchUploadJob[] | ✅ | - | Array of upload jobs (max 20) |
| concurrency | number | ❌ | 3 | Max parallel uploads |
| cacheControl | string | ❌ | "3600" | Cache-Control for all files |
| upsert | boolean | ❌ | false | Overwrite existing files |
Returns: Promise<BatchUploadResult[]>
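The concurrency option caps how many uploads run at once. The package's internals aren't shown here, but the general technique behind such an option is a concurrency-limited mapper; this is a minimal sketch (pLimitMap is an illustrative name, not part of this package's API):

```typescript
// Run an async function over items with at most `limit` running concurrently.
// Results are returned in input order.
async function pLimitMap<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length)
  let next = 0
  // Start up to `limit` workers; each claims the next unprocessed index.
  const workers = Array.from({ length: Math.min(limit, items.length) }, async () => {
    while (next < items.length) {
      const i = next++ // synchronous increment, so no two workers share an index
      results[i] = await fn(items[i])
    }
  })
  await Promise.all(workers)
  return results
}
```

With concurrency: 3, the batch example above would behave like pLimitMap(jobs, 3, uploadOne): three transfers in flight, a new one starting as each finishes.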
Security
This package includes robust SSRF (Server-Side Request Forgery) protection:
| Protection | Description |
| --- | --- |
| 🛡️ Private IP blocking | Blocks 10.x, 172.16.x, 192.168.x, 127.x, etc. |
| 🛡️ Cloud metadata blocking | Blocks 169.254.169.254 (AWS/GCP/Azure metadata) |
| 🛡️ DNS validation | Resolves hostnames and checks resulting IPs |
| 🛡️ Redirect blocking | Disables HTTP redirects (prevents redirect-based bypass) |
| 🛡️ Protocol restriction | Only HTTP/HTTPS allowed (no file://, gopher://, etc.) |
| 🛡️ Port blocking | Blocks common internal service ports (22, 3306, 5432, etc.) |
| 🛡️ Encoding protection | Rejects octal IP notation and URL encoding tricks |
| 🛡️ Authentication required | Requires valid Supabase JWT |
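The actual checks live in the edge function (see SECURITY.md). As an illustration only, the heart of private-IP blocking is a range test along these lines (isPrivateIPv4 is a hypothetical helper, not this package's API):

```typescript
// Returns true for IPv4 addresses in private, loopback, or link-local ranges.
// Sketch only — a real SSRF defense must also cover IPv6, DNS rebinding,
// redirects, and alternate IP encodings, as the table above describes.
function isPrivateIPv4(ip: string): boolean {
  const parts = ip.split('.').map(Number)
  if (parts.length !== 4 || parts.some((n) => !Number.isInteger(n) || n < 0 || n > 255)) {
    return true // treat unparseable input as unsafe
  }
  const [a, b] = parts
  return (
    a === 10 ||                          // 10.0.0.0/8
    a === 127 ||                         // 127.0.0.0/8 (loopback)
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    (a === 169 && b === 254)             // 169.254.0.0/16 (link-local, incl. cloud metadata)
  )
}
```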
See SECURITY.md for complete details.
Error Handling
import { createStorageHelpers, SsrfError, UploadError, ValidationError } from '@alexwhitmore/storage-helpers'
try {
await helpers.uploadFromUrl({
bucket: 'images',
path: 'test.png',
sourceUrl: 'https://example.com/image.png',
})
} catch (error) {
if (error instanceof SsrfError) {
// URL was blocked for security reasons
console.error('Security blocked:', error.message)
} else if (error instanceof UploadError) {
// Upload failed
console.error('Upload failed:', error.code, error.message)
if (error.isRetryable()) {
// Retry the upload
}
} else if (error instanceof ValidationError) {
// Invalid input
console.error('Invalid input:', error.field, error.message)
}
}
Use Cases
- 📸 AI Image Generation: Upload DALL-E/Midjourney/Stable Diffusion images directly
- 📁 Cloud Storage Integration: Transfer files from Dropbox, Google Drive, etc.
- 🔗 Webhook Processing: Store files from incoming webhooks
- 🖼️ Image Aggregation: Collect images from multiple sources
- 📦 Asset Migration: Bulk transfer assets from other services
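Flows like webhook processing and bulk migration often hit transient failures (timeouts, rate limits), which is what the isRetryable() check in the error-handling example is for. A generic retry wrapper around such a check might look like this sketch (withRetry is an illustrative helper, not part of this package):

```typescript
// Retry an async operation up to `maxAttempts` times when `retryable(err)`
// reports a transient failure; waits baseDelayMs * 2^attempt between tries.
async function withRetry<T>(
  fn: () => Promise<T>,
  retryable: (err: unknown) => boolean,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      // Give up on the last attempt or on non-retryable errors.
      if (attempt + 1 >= maxAttempts || !retryable(err)) throw err
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt))
    }
  }
}
```

With this package you could pass a predicate like (err) => err instanceof UploadError && err.isRetryable().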
Storage Bucket Setup
Your storage bucket needs RLS policies to allow uploads. Here are common configurations:
Public bucket (anyone can read)
-- Create bucket
INSERT INTO storage.buckets (id, name, public)
VALUES ('my-bucket', 'my-bucket', true);
-- Allow authenticated users to upload
CREATE POLICY "Authenticated users can upload"
ON storage.objects FOR INSERT
TO authenticated
WITH CHECK (bucket_id = 'my-bucket');
-- Allow public read access
CREATE POLICY "Public read access"
ON storage.objects FOR SELECT
TO public
USING (bucket_id = 'my-bucket');
Private bucket (user-specific folders)
-- Create private bucket
INSERT INTO storage.buckets (id, name, public)
VALUES ('user-files', 'user-files', false);
-- Users can only upload to their own folder
CREATE POLICY "Users upload to own folder"
ON storage.objects FOR INSERT
TO authenticated
WITH CHECK (
bucket_id = 'user-files' AND
(storage.foldername(name))[1] = auth.uid()::text
);
-- Users can only read their own files
CREATE POLICY "Users read own files"
ON storage.objects FOR SELECT
TO authenticated
USING (
bucket_id = 'user-files' AND
(storage.foldername(name))[1] = auth.uid()::text
);
Requirements
- Node.js 18+
- Supabase project with Edge Functions enabled
- @supabase/supabase-js v2.0+
Contributing
Contributions are welcome! Please read CONTRIBUTING.md first.
License
MIT © Alex Whitmore
