# Firestore Batch Updater
Easy batch updates for Firebase Firestore with query-based filtering and progress tracking.
English | 한국어
## Features
- Query-based updates - Filter documents with `where()` conditions
- No 500 document limit - Uses Firebase Admin SDK's BulkWriter
- Preview changes - See before/after comparison before updating
- Progress tracking - Real-time progress callbacks
- Batch create/upsert/delete - Create, upsert, or delete multiple documents at once
- Sorting and limiting - Use `orderBy()` and `limit()` for precise control
- FieldValue support - Use `increment()`, `arrayUnion()`, `delete()`, `serverTimestamp()`, etc.
- Subcollection & Collection Group - Query subcollections or all collections with the same name
- Dry run mode - Simulate operations without making changes
- Count documents - Quickly count matching documents without loading them
- Log file generation - Optional detailed operation logs for auditing
## Installation
```bash
# npm
npm install firestore-batch-updater
# yarn
yarn add firestore-batch-updater
# pnpm
pnpm add firestore-batch-updater
```

Required peer dependency:

```bash
# npm
npm install firebase-admin
# yarn
yarn add firebase-admin
# pnpm
pnpm add firebase-admin
```

## Quick Start
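The examples in this README assume the Firebase Admin SDK has already been initialized. A minimal initialization sketch using application default credentials (adapt the credential setup to your environment):

```ts
import { initializeApp, applicationDefault } from "firebase-admin/app";

// Initialize the Admin SDK once per process. This sketch assumes credentials
// come from GOOGLE_APPLICATION_CREDENTIALS or a GCP runtime environment.
initializeApp({ credential: applicationDefault() });
```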
```ts
import { BatchUpdater } from "firestore-batch-updater";
import { getFirestore } from "firebase-admin/firestore";
const firestore = getFirestore();
const updater = new BatchUpdater(firestore);
// Preview changes
const preview = await updater
.collection("users")
.where("status", "==", "inactive")
.preview({ status: "archived" });
console.log(`Will affect ${preview.affectedCount} documents`);
// Execute update
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.update({ status: "archived" });
console.log(`Updated ${result.successCount} documents`);
```

## API Reference
### Methods Overview
| Method | Description | Returns |
|--------|-------------|---------|
| collection(path) | Select collection to operate on (supports subcollection paths) | this |
| collectionGroup(id) | Query all collections with the same ID | this |
| where(field, op, value) | Add filter condition (chainable) | this |
| orderBy(field, direction?) | Add sorting (chainable) | this |
| limit(count) | Limit number of documents (chainable) | this |
| count() | Count matching documents | CountResult |
| preview(data) | Preview changes before update | PreviewResult |
| update(data, options?) | Update matching documents | UpdateResult |
| create(docs, options?) | Create new documents | CreateResult |
| upsert(data, options?) | Update or create (set with merge) | UpsertResult |
| delete(options?) | Delete matching documents | DeleteResult |
| getFields(field) | Get specific field values | FieldValueResult[] |
### Options
All write operations support an optional options parameter:
```ts
{
onProgress?: (progress: ProgressInfo) => void;
log?: LogOptions;
batchSize?: number; // For update/upsert/delete
dryRun?: boolean; // For update/upsert/delete - simulate without writing
}
// ProgressInfo
{
current: number; // Documents processed
total: number; // Total documents
percentage: number; // 0-100
}
// LogOptions
{
enabled: boolean; // Enable log file generation
path?: string; // Custom log directory (default: ./logs)
filename?: string; // Custom filename (default: auto-generated)
}
```

`batchSize` option (for large collections):
- When not set: All documents are loaded into memory at once (suitable for small collections)
- When set (e.g., `batchSize: 1000`): Documents are processed in batches using cursor pagination (suitable for large collections to prevent memory issues)
`dryRun` option:
- When `true`: Returns a `DryRunResult` with a `wouldAffect` count and `sampleIds`, without making any changes
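For illustration, several options can be combined in a single call; the query and values below are arbitrary:

```ts
const result = await updater
  .collection("users")
  .where("status", "==", "inactive")
  .update(
    { status: "archived" },
    {
      batchSize: 500, // process in pages of 500 via cursor pagination
      onProgress: (p) => console.log(`${p.current}/${p.total} (${p.percentage}%)`),
      log: { enabled: true, path: "./logs", filename: "archive-run.log" },
    }
  );
```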
### Return Types
| Type | Fields |
|------|--------|
| CountResult | count |
| DryRunResult | wouldAffect, sampleIds[], operation |
| PreviewResult | affectedCount, samples[], affectedFields[] |
| UpdateResult | successCount, failureCount, totalCount, failedDocIds?, logFilePath? |
| CreateResult | successCount, failureCount, totalCount, createdIds[], failedDocIds?, logFilePath? |
| UpsertResult | successCount, failureCount, totalCount, failedDocIds?, logFilePath? |
| DeleteResult | successCount, failureCount, totalCount, deletedIds[], failedDocIds?, logFilePath? |
| FieldValueResult | id, value |
## Usage Examples
### Update Documents
```ts
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.update({ status: "archived" });
```

### Create Documents

```ts
// Auto-generated IDs
const result = await updater.collection("users").create([
{ data: { name: "Alice", age: 30 } },
{ data: { name: "Bob", age: 25 } },
]);
console.log("Created IDs:", result.createdIds);
// With specific IDs
const result2 = await updater.collection("users").create([
{ id: "user-001", data: { name: "Charlie" } },
{ id: "user-002", data: { name: "Diana" } },
]);
```

### Upsert Documents

```ts
const result = await updater
.collection("users")
.where("status", "==", "active")
.upsert({ tier: "premium", updatedAt: new Date() });
```

### Delete Documents

```ts
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.delete();
console.log(`Deleted ${result.successCount} documents`);
console.log("Deleted IDs:", result.deletedIds);Preview Before Update
const preview = await updater
.collection("orders")
.where("status", "==", "pending")
.preview({ status: "cancelled" });
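// The preview also reports which fields would change and a few sample
// documents (see PreviewResult in the Return Types table above).
console.log("Fields to change:", preview.affectedFields);
console.log("Sample documents:", preview.samples);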
if (preview.affectedCount > 1000) {
console.log("Too many documents. Aborting.");
} else {
await updater
.collection("orders")
.where("status", "==", "pending")
.update({ status: "cancelled" });
}
```

### Progress Tracking

```ts
const result = await updater
.collection("products")
.where("inStock", "==", false)
.update(
{ status: "discontinued" },
{
onProgress: (progress) => {
console.log(`${progress.percentage}% complete`);
},
}
);
```

### Get Field Values

```ts
const emails = await updater
.collection("users")
.where("status", "==", "active")
.getFields("email");
// [{ id: 'user1', value: 'user1@example.com' }, ...]
```

### Multiple Conditions

```ts
const ninetyDaysAgo = new Date();
ninetyDaysAgo.setDate(ninetyDaysAgo.getDate() - 90);
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.where("lastLoginAt", "<", ninetyDaysAgo)
.where("accountType", "==", "free")
.update({ status: "archived" });
```

Note: When using multiple `where()` conditions on different fields, or combining `where()` with `orderBy()` on different fields, Firestore may require a composite index. If you see a `FAILED_PRECONDITION` error, follow the link in the error message to create the required index.
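As an illustrative sketch (the query below is made up), you can catch the error and surface its message, which contains the index-creation link:

```ts
try {
  await updater
    .collection("users")
    .where("status", "==", "inactive")
    .where("accountType", "==", "free")
    .orderBy("lastLoginAt", "asc")
    .update({ status: "archived" });
} catch (err) {
  // A missing composite index surfaces as FAILED_PRECONDITION; the error
  // message includes a Firebase console link for creating the index.
  console.error("Update failed (possibly a missing composite index):", err);
}
```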
### Sorting and Limiting

```ts
// Get top 10 users by score
const result = await updater
.collection("users")
.orderBy("score", "desc")
.limit(10)
.update({ featured: true });
// Delete oldest 100 inactive users
const deleted = await updater
.collection("users")
.where("status", "==", "inactive")
.orderBy("createdAt", "asc")
.limit(100)
.delete();
```

### Using FieldValue

```ts
import { BatchUpdater, FieldValue } from "firestore-batch-updater";
// Increment a counter
await updater
.collection("users")
.where("status", "==", "active")
.update({ loginCount: FieldValue.increment(1) });
// Add to array
await updater
.collection("users")
.where("tier", "==", "premium")
.update({ tags: FieldValue.arrayUnion("vip", "priority") });
// Remove from array
await updater
.collection("users")
.where("status", "==", "inactive")
.update({ tags: FieldValue.arrayRemove("active") });
// Server timestamp
await updater
.collection("users")
.where("status", "==", "active")
.update({ lastSeen: FieldValue.serverTimestamp() });
// Delete a field
await updater
.collection("users")
.where("status", "==", "inactive")
.update({ temporaryData: FieldValue.delete() });
```

### Count Documents

```ts
// Quickly count matching documents without loading them
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.count();
console.log(`Found ${result.count} inactive users`);
```

### Dry Run Mode

```ts
// Simulate an operation without making any changes
const simulation = await updater
.collection("users")
.where("status", "==", "inactive")
.update(
{ status: "archived" },
{ dryRun: true }
);
console.log(`Would affect ${simulation.wouldAffect} documents`);
console.log("Sample IDs:", simulation.sampleIds);
// Also works with delete
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
const deleteSimulation = await updater
.collection("logs")
.where("createdAt", "<", thirtyDaysAgo)
.delete({ dryRun: true });
console.log(`Would delete ${deleteSimulation.wouldAffect} documents`);
```

### Subcollections

```ts
// Query a specific subcollection path
const result = await updater
.collection("users/user-123/orders")
.where("status", "==", "pending")
.update({ status: "cancelled" });
// Or use dynamic paths
const userId = "user-123";
await updater
.collection(`users/${userId}/notifications`)
.where("read", "==", false)
.delete();
```

### Collection Group Queries

```ts
// Query ALL "orders" subcollections across all users
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
const result = await updater
.collectionGroup("orders")
.where("status", "==", "pending")
.where("createdAt", "<", thirtyDaysAgo)
.update({ status: "expired" });
console.log(`Updated ${result.successCount} orders across all users`);
// Note: collectionGroup requires a Firestore index on the queried fields
```

### Error Handling

```ts
const result = await updater
.collection("users")
.where("status", "==", "test")
.update({ status: "verified" });
if (result.failureCount > 0) {
console.log(`${result.failureCount} documents failed`);
console.log("Failed IDs:", result.failedDocIds);
}
```
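If you need to retry failures individually, the IDs in `failedDocIds` can be passed back to the Admin SDK directly. A minimal sketch, assuming the `firestore` instance from the Quick Start and the same update payload:

```ts
for (const id of result.failedDocIds ?? []) {
  try {
    // Per-document retry via the Admin SDK; adapt the payload to your schema.
    await firestore.collection("users").doc(id).update({ status: "verified" });
  } catch (err) {
    console.error(`Retry failed for ${id}:`, err);
  }
}
```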
### Pagination for Large Collections

```ts
// Process documents in batches of 1000 to prevent memory issues
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.update(
{ status: "archived" },
{
batchSize: 1000,
onProgress: (progress) => {
console.log(`${progress.percentage}% complete`);
},
}
);
```

### Log File Generation

```ts
const result = await updater
.collection("users")
.where("status", "==", "inactive")
.update(
{ status: "archived" },
{
log: {
enabled: true,
path: "./logs", // optional
},
}
);
if (result.logFilePath) {
console.log(`Log saved to: ${result.logFilePath}`);
}
```

Log file example:

```text
============================================================
FIRESTORE BATCH OPERATION LOG
============================================================
Operation: UPDATE
Collection: users
Started: 2024-01-15T10:30:00.000Z
Completed: 2024-01-15T10:30:05.000Z
Conditions:
- status == "inactive"
============================================================
SUMMARY
============================================================
Total: 150
Success: 148
Failure: 2
============================================================
DETAILS
============================================================
2024-01-15T10:30:01.000Z [SUCCESS] user-001
2024-01-15T10:30:01.100Z [SUCCESS] user-002
2024-01-15T10:30:01.200Z [FAILURE] user-003
Error: Document not found
...
```

## Requirements
- Node.js 18+
- Firebase Admin SDK 13.x
- Server-side environment only (Admin SDK required)
## Why BulkWriter?
This library uses Firebase's BulkWriter, which provides the following advantages (a rough hand-rolled equivalent is sketched after this list for comparison):
- No 500 document limit (unlike batch writes)
- Automatic rate limiting
- Built-in retry logic
- Better performance for large operations
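For comparison, a hand-rolled query-based update built on BulkWriter alone might look like the sketch below (no preview, progress, pagination, or logging); it assumes the `firestore` instance from the Quick Start:

```ts
// Query the matching documents, then stream the writes through BulkWriter.
const snapshot = await firestore
  .collection("users")
  .where("status", "==", "inactive")
  .get();

const writer = firestore.bulkWriter();
for (const doc of snapshot.docs) {
  writer.update(doc.ref, { status: "archived" });
}
await writer.close(); // flushes pending writes and waits for completion
```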
## Examples
Check out the examples folder:
- basic.ts - Basic usage workflow
- api-route.ts - Using in API endpoints
- advanced.ts - Advanced features and patterns
## Disclaimer
This package is provided "as is" without warranty of any kind. The author is not responsible for any data loss, corruption, or other issues that may arise from using this package. Always test thoroughly in a development environment before using in production, and ensure you have proper backups of your data.
## License
MIT
