@therivalkiller3011/react-file-upload-pro v1.0.1
# React File Upload Pro
A production-ready React file upload library with chunked uploads, pause/resume capability, and support for files of any size including large scientific formats (.h5ad, .zarr, .nii, etc.).
## ✨ Features

- 📦 **Chunked Uploads**: handle files of any size (10 GB+) with a configurable chunk size
- ⏸️ **Pause/Resume**: resume uploads even after a browser restart, via IndexedDB persistence
- 📁 **Folder Upload**: automatic browser-side compression with JSZip
- 🎯 **Drag & Drop**: drag-and-drop interface with visual feedback
- 📊 **Progress Tracking**: real-time per-file and overall progress
- 🔄 **Auto Retry**: automatic retry with exponential backoff for failed chunks
- 🚀 **Adaptive Chunking**: dynamic chunk sizing based on connection speed
- 🎨 **Beautiful UI**: built with Tailwind CSS and shadcn/ui components
- ♿ **Accessible**: WCAG-compliant with keyboard navigation
- 📱 **Responsive**: works seamlessly on desktop and mobile
- 🌙 **Dark Mode**: built-in dark mode support
- 💪 **TypeScript**: full type safety and excellent DX
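How adaptive chunking works internally is not documented here; one plausible strategy (a sketch only — the function name, bounds, and target duration below are illustrative assumptions, not the library's actual internals) is to scale the next chunk size from the measured throughput of the last one:

```typescript
const MIN_CHUNK = 1 * 1024 * 1024;   // 1 MB floor
const MAX_CHUNK = 32 * 1024 * 1024;  // 32 MB ceiling
const TARGET_SECONDS = 5;            // aim for roughly 5 s per chunk

// Pick the next chunk size from the last chunk's measured throughput:
// bytesSent / elapsedMs gives bytes per millisecond; scale that to the
// target duration, then clamp so one slow or fast sample cannot swing
// the size wildly.
function nextChunkSize(bytesSent: number, elapsedMs: number): number {
  if (elapsedMs <= 0) return MIN_CHUNK;
  const bytesPerSecond = (bytesSent / elapsedMs) * 1000;
  const ideal = bytesPerSecond * TARGET_SECONDS;
  return Math.min(MAX_CHUNK, Math.max(MIN_CHUNK, Math.round(ideal)));
}
```

For example, a 5 MB chunk that took 2 s implies ~2.5 MB/s, so the next chunk grows to about 12.5 MB; a connection stall pushes the size back toward the floor.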
## 🔬 Specialized Format Support

Perfect for scientific and large-format files:

- `.h5ad` (AnnData / single-cell data)
- `.zarr` (chunked array storage)
- `.nii`, `.nii.gz` (NIfTI neuroimaging)
- `.tiff`, `.tif` (large microscopy images)
- Video files (`.mp4`, `.mov`, `.avi`)
- And any other file format!
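If you want to accept only these formats, the `validation` block of `UploadConfig` (documented below) takes allowed types and a custom check. The helper below is hypothetical, shown only to illustrate the idea; note that compound suffixes like `.nii.gz` need a suffix check, not a "last extension" comparison:

```typescript
// Extensions we accept; compound suffixes like .nii.gz require endsWith().
const SCIENTIFIC_EXTENSIONS = ['.h5ad', '.zarr', '.nii', '.nii.gz', '.tiff', '.tif'];

// Hypothetical helper: true when the filename ends with an allowed extension.
function hasAllowedExtension(fileName: string): boolean {
  const lower = fileName.toLowerCase();
  return SCIENTIFIC_EXTENSIONS.some((ext) => lower.endsWith(ext));
}

// Wired into the config's validation block:
const validation = {
  maxSize: 20 * 1024 * 1024 * 1024, // 20 GB cap (an example value)
  customValidation: (file: { name: string }) => hasAllowedExtension(file.name),
};
```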
## 📦 Installation

```bash
npm install @therivalkiller3011/react-file-upload-pro
```

## 🚀 Quick Start
```tsx
import { FileUploader } from '@therivalkiller3011/react-file-upload-pro';
import '@therivalkiller3011/react-file-upload-pro/styles.css';

function MyApp() {
  const config = {
    uploadChunk: async (chunk, metadata) => {
      const formData = new FormData();
      formData.append('chunk', chunk);
      formData.append('chunkIndex', String(metadata.chunkIndex));
      formData.append('totalChunks', String(metadata.totalChunks));
      formData.append('fileId', metadata.fileId);
      formData.append('fileName', metadata.fileName);
      const res = await fetch('/api/upload', {
        method: 'POST',
        body: formData
      });
      // Throw on HTTP errors so the uploader can retry the chunk
      if (!res.ok) throw new Error(`Chunk ${metadata.chunkIndex} failed: ${res.status}`);
    },
    chunkSize: 5 * 1024 * 1024, // 5 MB chunks
    maxConcurrent: 3,
    onComplete: (fileId, fileName) => {
      console.log('Upload complete:', fileName);
    }
  };

  return <FileUploader config={config} />;
}
```

## 📖 API Reference
### UploadConfig

```ts
interface UploadConfig {
  // Required: function to upload each chunk; throw/reject to trigger a retry
  uploadChunk: (chunk: Blob, metadata: ChunkMetadata) => Promise<void>;

  // Optional configuration
  chunkSize?: number;               // Default: 5 MB
  maxConcurrent?: number;           // Default: 3
  enableResume?: boolean;           // Default: true
  enableHashVerification?: boolean; // Default: false
  maxRetries?: number;              // Default: 3
  adaptiveChunkSize?: boolean;      // Default: false

  // Validation
  validation?: {
    allowedTypes?: string[];        // MIME types or extensions
    maxSize?: number;               // bytes
    minSize?: number;               // bytes
    customValidation?: (file: File) => Promise<boolean> | boolean;
  };

  // Callbacks
  onProgress?: (progress: FileProgress) => void;
  onQueueProgress?: (progress: QueueProgress) => void;
  onComplete?: (fileId: string, fileName: string) => void;
  onError?: (fileId: string, error: Error) => void;
  onPause?: (fileId: string) => void;
  onResume?: (fileId: string) => void;
  onCancel?: (fileId: string) => void;
}
```

### ChunkMetadata
Data passed to your `uploadChunk` function:
```ts
interface ChunkMetadata {
  fileId: string;      // Unique file identifier
  fileName: string;    // Original filename
  fileSize: number;    // Total file size in bytes
  fileType: string;    // MIME type
  chunkIndex: number;  // Current chunk (0-indexed)
  totalChunks: number; // Total number of chunks
  chunkSize: number;   // Size of this chunk in bytes
  chunkHash?: string;  // Hash for verification (if enabled)
  isFolder?: boolean;  // True if this is a compressed folder
}
```

## 🎯 Components
### FileUploader

Main component with all features:

```tsx
<FileUploader
  config={uploadConfig}
  maxFiles={10}
  disabled={false}
  className="custom-class"
/>
```

### FolderUploader
With folder support and browser-side compression:

```tsx
<FolderUploader
  config={uploadConfig}
  maxFiles={5}
/>
```

### Using the Hook Directly
For custom UIs:

```tsx
import { useFileUpload } from '@therivalkiller3011/react-file-upload-pro';

function CustomUploader() {
  const {
    upload,
    pause,
    resume,
    cancel,
    retry,
    removeFile,
    queue,
    queueProgress,
    isUploading
  } = useFileUpload(config);

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => upload(Array.from(e.target.files || []))}
      />
      {queue.map(item => (
        <div key={item.id}>
          {item.file.name} - {item.progress.percentage.toFixed(1)}%
          <button onClick={() => pause(item.id)}>Pause</button>
          <button onClick={() => resume(item.id)}>Resume</button>
        </div>
      ))}
    </div>
  );
}
```

## 🏗️ Backend Integration
Node.js/Express Example
const express = require('express');
const multer = require('multer');
const fs = require('fs').promises;
const path = require('path');
const app = express();
const upload = multer({ dest: 'uploads/chunks/' });
// Track upload sessions
const uploadSessions = new Map();
app.post('/api/upload', upload.single('chunk'), async (req, res) => {
try {
const { fileId, fileName, chunkIndex, totalChunks } = req.body;
const chunkPath = req.file.path;
// Initialize session if needed
if (!uploadSessions.has(fileId)) {
uploadSessions.set(fileId, {
fileName,
totalChunks: parseInt(totalChunks),
receivedChunks: new Set(),
chunks: []
});
}
const session = uploadSessions.get(fileId);
session.receivedChunks.add(parseInt(chunkIndex));
session.chunks[parseInt(chunkIndex)] = chunkPath;
// Check if all chunks received
if (session.receivedChunks.size === session.totalChunks) {
// Assemble file
const finalPath = path.join('uploads', fileName);
const writeStream = fs.createWriteStream(finalPath);
for (let i = 0; i < session.totalChunks; i++) {
const chunkData = await fs.readFile(session.chunks[i]);
writeStream.write(chunkData);
await fs.unlink(session.chunks[i]); // Delete chunk
}
writeStream.end();
uploadSessions.delete(fileId);
res.json({ complete: true, path: finalPath });
} else {
res.json({ complete: false, received: session.receivedChunks.size });
}
} catch (error) {
res.status(500).json({ error: error.message });
}
});
app.listen(3000);Python/FastAPI Example
```python
from fastapi import FastAPI, UploadFile, Form
from pathlib import Path
import shutil

app = FastAPI()

# Track upload sessions in memory (use Redis or a database in production)
upload_sessions = {}

@app.post("/api/upload")
async def upload_chunk(
    chunk: UploadFile,
    fileId: str = Form(...),
    fileName: str = Form(...),
    chunkIndex: int = Form(...),
    totalChunks: int = Form(...),
):
    # Create the session on the first chunk
    if fileId not in upload_sessions:
        upload_sessions[fileId] = {
            'fileName': fileName,
            'totalChunks': totalChunks,
            'chunks': {}
        }

    # Save the chunk to disk
    chunk_dir = Path(f"uploads/chunks/{fileId}")
    chunk_dir.mkdir(parents=True, exist_ok=True)
    chunk_path = chunk_dir / f"chunk_{chunkIndex}"
    with chunk_path.open("wb") as f:
        shutil.copyfileobj(chunk.file, f)

    session = upload_sessions[fileId]
    session['chunks'][chunkIndex] = chunk_path

    # Assemble the file once every chunk has arrived
    if len(session['chunks']) == totalChunks:
        # Path(fileName).name strips directory components (path traversal)
        final_path = Path("uploads") / Path(fileName).name
        with final_path.open("wb") as outfile:
            for i in range(totalChunks):
                with session['chunks'][i].open("rb") as infile:
                    shutil.copyfileobj(infile, outfile)
                session['chunks'][i].unlink()  # Delete the chunk
        shutil.rmtree(chunk_dir, ignore_errors=True)
        del upload_sessions[fileId]
        return {"complete": True, "path": str(final_path)}

    return {"complete": False, "received": len(session['chunks'])}
```

## 🎨 Styling
The library uses Tailwind CSS. Import the styles in your app:

```ts
import '@therivalkiller3011/react-file-upload-pro/styles.css';
```

### Custom Styling
Override the default styles:

```css
/* Custom progress bar color */
.upload-progress-bar {
  --primary: 220 80% 60%;
}

/* Custom dropzone */
.file-dropzone {
  border-radius: 1rem;
  background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
}
```

## 📊 Progress Events
```ts
const config = {
  uploadChunk: uploadFn,

  // Per-file progress
  onProgress: (progress) => {
    console.log({
      fileName: progress.fileName,
      percentage: progress.percentage,
      uploadedBytes: progress.uploadedBytes,
      speed: progress.speed,
      estimatedTime: progress.estimatedTimeRemaining
    });
  },

  // Overall queue progress
  onQueueProgress: (progress) => {
    console.log({
      completedFiles: progress.completedFiles,
      totalFiles: progress.totalFiles,
      overallPercentage: progress.overallPercentage,
      averageSpeed: progress.averageSpeed
    });
  }
};
```

## 🔒 Security Considerations
- **Always validate files on the server**: don't trust client-side validation
- **Implement rate limiting**: prevent abuse of upload endpoints
- **Verify file integrity**: use hash verification for critical files
- **Sanitize filenames**: prevent path traversal attacks
- **Set size limits**: protect against disk space exhaustion
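For the filename point in particular, a minimal sanitizer (illustrative, not part of the library) strips directory components and unsafe characters before the name ever touches the filesystem:

```typescript
// Strip directory components and characters unsafe in filenames.
// "../../etc/passwd" becomes "passwd"; a name reduced to nothing
// falls back to a default so the upload still lands somewhere safe.
function sanitizeFileName(name: string): string {
  const base = name.replace(/\\/g, '/').split('/').pop() ?? '';
  const cleaned = base.replace(/[^a-zA-Z0-9._-]/g, '_').replace(/^\.+/, '');
  return cleaned.length > 0 ? cleaned : 'upload.bin';
}
```

Run this on the server side, where it matters; client-side checks can be bypassed entirely.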
```ts
// Enable hash verification
const config = {
  uploadChunk: uploadFn,
  enableHashVerification: true, // Sends a SHA-256 hash with each chunk
};
```

## 🧪 Testing
```bash
# Run the demo app
npm run dev

# Build the library
npm run build

# Test with large files:
# drop a 5 GB+ file to exercise chunking and resume
```

## 📝 License
MIT
## 🤝 Contributing

Contributions welcome! Please read our contributing guidelines.

## 📧 Support

- GitHub Issues: your-repo/issues
- Email: [email protected]
Made with ❤️ for developers who need to handle large files
