Bulk Operations
Perform bulk updates, deletes, and multi-document operations efficiently on large datasets
Bulk operations enable you to update, delete, or modify thousands of documents in a single operation, with built-in safety features and progress tracking.
Overview
Use bulk operations to:
- Update many documents - Modify fields across multiple documents
- Delete in bulk - Remove documents matching criteria
- Insert multiple documents - Create many documents at once
- Transform data - Apply calculations or transformations at scale
- Migrate schemas - Update document structures across collections
Bulk operations are available on all plans, with per-operation document limits based on plan tier.
Document Limits
| Plan | Max Documents per Operation |
|---|---|
| Free | 100 |
| Team | 1,000 |
| Business | 10,000 |
| Enterprise | Unlimited |
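As an illustration, a client could validate an operation's size against these limits before submitting it. The sketch below is hypothetical (the plan names and limits come from the table above; the helper itself is not a product API):

```javascript
// Per-plan document limits from the table above.
// checkBulkLimit is an illustrative helper, not part of the product.
const PLAN_LIMITS = {
  free: 100,
  team: 1000,
  business: 10000,
  enterprise: Infinity,
};

function checkBulkLimit(plan, documentCount) {
  const limit = PLAN_LIMITS[plan];
  if (limit === undefined) {
    throw new Error(`Unknown plan: ${plan}`);
  }
  // true when the operation fits within the plan's limit
  return documentCount <= limit;
}

console.log(checkBulkLimit("business", 5000)); // fits the 10,000 limit
console.log(checkBulkLimit("team", 5000));     // exceeds the 1,000 limit
```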
Bulk Update
Simple Field Updates
Update specific fields across multiple documents:
Navigate to the collection and apply filters to select documents.
Click Bulk Actions > Update Documents.
Define the update operation:
{
"$set": {
"status": "archived",
"archivedAt": new Date()
}
}
Preview affected documents (shows first 10 matches).
Click Execute Bulk Update.
Monitor progress in real-time.

Update Operators
Use MongoDB update operators:
// Set field values
{
"$set": {
"status": "active",
"updatedAt": new Date()
}
}
// Increment numeric fields
{
"$inc": {
"viewCount": 1,
"score": 10
}
}
// Multiply values
{
"$mul": {
"price": 1.1 // 10% price increase
}
}
// Rename fields
{
"$rename": {
"oldFieldName": "newFieldName"
}
}
// Remove fields
{
"$unset": {
"deprecatedField": "",
"temporaryData": ""
}
}
// Add to array
{
"$push": {
"tags": "featured"
}
}
// Remove from array
{
"$pull": {
"tags": "deprecated"
}
}
// Add to set (no duplicates)
{
"$addToSet": {
"categories": "electronics"
}
}
Conditional Updates
Update based on current values:
// Update only if field doesn't exist
{
"$setOnInsert": {
"createdAt": new Date()
}
}
// Set minimum value
{
"$min": {
"lowestPrice": 99.99
}
}
// Set maximum value
{
"$max": {
"highestBid": 150.00
}
}
// Current date
{
"$currentDate": {
"lastModified": true,
"timestamp": { "$type": "date" }
}
}
Complex Transformations
Use aggregation pipeline updates:
// Calculate new field from existing fields
[
{
"$set": {
"fullName": {
"$concat": ["$firstName", " ", "$lastName"]
},
"discountedPrice": {
"$multiply": [
"$price",
{ "$subtract": [1, { "$divide": ["$discountPercent", 100] }] }
]
}
}
}
]
// Conditional field updates
[
{
"$set": {
"tier": {
"$switch": {
"branches": [
{ "case": { "$gte": ["$points", 1000] }, "then": "gold" },
{ "case": { "$gte": ["$points", 500] }, "then": "silver" }
],
"default": "bronze"
}
}
}
}
]
Bulk updates are irreversible. Always test with a small subset first and ensure you have backups.
Bulk Delete
Deleting Documents
Remove multiple documents matching criteria:
Apply filters to select documents for deletion.
Click Bulk Actions > Delete Documents.
Review the deletion summary:
- Number of documents to be deleted
- Sample of documents (first 10)
- Estimated time
Type DELETE to confirm (prevents accidental deletion).
Click Execute Bulk Delete.
Monitor deletion progress.

Safe Deletion Practices
Protect against accidental data loss:
- Filter carefully - Double-check your filter criteria
- Preview first - Review sample documents before deletion
- Export backup - Export data before bulk delete
- Use dry run - Test with the dryRun: true option
- Soft delete - Mark as deleted instead of removing
// Soft delete example
{
"$set": {
"isDeleted": true,
"deletedAt": new Date(),
"deletedBy": "{{current_user}}"
}
}
Bulk Insert
Inserting Multiple Documents
Create many documents at once:
Click Bulk Actions > Insert Documents.
Provide documents in JSON array format:
[
{
"name": "Product 1",
"price": 99.99,
"category": "electronics"
},
{
"name": "Product 2",
"price": 149.99,
"category": "electronics"
}
]
Or import from file (CSV, JSON, or Excel).
Configure insert options:
- Skip validation (faster but risky)
- Ordered insert (stop on first error)
- Unordered insert (continue despite errors)
Click Insert Documents.
Import from File
Bulk insert from CSV or JSON:
Click Import > From File.
Upload your file (CSV, JSON, or XLSX).
Map columns to fields:
- Auto-detect field types
- Set field transformations
- Handle missing values
Preview import (first 100 rows).
Click Start Import.
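The type auto-detection step can be pictured as a simple per-value heuristic: numbers, booleans, and dates are inferred from the raw strings, and everything else stays a string. This is only a sketch of the idea, not the importer's actual logic:

```javascript
// Illustrative field-type detection for CSV import (not the product's
// real implementation).
function detectValue(raw) {
  const s = raw.trim();
  if (s === "") return null; // treat empty cells as missing
  if (s === "true" || s === "false") return s === "true";
  if (!isNaN(Number(s))) return Number(s);
  const ts = Date.parse(s);
  // Only treat clearly date-like strings (YYYY-MM-DD) as dates.
  if (!isNaN(ts) && /\d{4}-\d{2}-\d{2}/.test(s)) return new Date(ts);
  return s;
}

function mapRow(headers, row) {
  const doc = {};
  headers.forEach((h, i) => { doc[h] = detectValue(row[i] ?? ""); });
  return doc;
}

const doc = mapRow(
  ["name", "price", "inStock", "addedAt"],
  ["Widget", "19.99", "true", "2024-02-24"]
);
// doc.price is a number, doc.inStock a boolean, doc.addedAt a Date
```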

Bulk Replace
Replacing Documents
Replace entire documents:
// Find documents
{
"filter": {
"version": 1
},
// Replace with new structure
"replacement": {
"version": 2,
"data": {
// New document structure
}
}
}
Bulk replace removes all existing fields except _id. Use bulk update with $set if you want to preserve other fields.
Progress Tracking
Real-Time Progress
Monitor bulk operation execution:

Displayed metrics:
- Documents processed / total
- Success rate
- Error count
- Elapsed time
- Estimated time remaining
- Operations per second
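The estimated time remaining follows directly from throughput so far. A minimal sketch of that arithmetic (the field names are assumed, not the product's internals):

```javascript
// Derive ops/sec and remaining time from progress counters.
function progressMetrics(processed, total, elapsedSeconds) {
  const opsPerSecond = processed / elapsedSeconds;
  const remaining = total - processed;
  const etaSeconds = remaining / opsPerSecond;
  return { opsPerSecond, etaSeconds };
}

// e.g. 2,500 of 10,000 documents done after 50 seconds:
const m = progressMetrics(2500, 10000, 50);
// 50 ops/sec, so the remaining 7,500 documents take about 150 seconds
```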
Operation History
View past bulk operations:
{
"operationId": "bulk_op_abc123",
"type": "update",
"collection": "users",
"startTime": "2024-02-24T10:00:00Z",
"endTime": "2024-02-24T10:05:23Z",
"status": "completed",
"statistics": {
"matched": 10000,
"modified": 9987,
"failed": 13,
"duration": "5m 23s"
},
"filter": {
"status": "pending"
},
"update": {
"$set": { "status": "processed" }
}
}
Error Handling
Handling Failures
Configure how to handle errors:
Ordered Operations:
- Stop on first error
- Guarantees order of operations
- Slower but safer
Unordered Operations:
- Continue despite errors
- Faster execution
- Collects all errors
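The difference between the two modes can be sketched as follows; applyOp stands in for any per-document operation and is purely illustrative:

```javascript
// Ordered: stop at the first failure. Unordered: attempt every
// operation and collect all errors. applyOp is a stand-in callback.
function runBulk(ops, applyOp, { ordered }) {
  const errors = [];
  let successCount = 0;
  for (const op of ops) {
    try {
      applyOp(op);
      successCount++;
    } catch (err) {
      errors.push({ op, error: err.message });
      if (ordered) break; // ordered mode stops on first error
    }
  }
  return { successCount, errorCount: errors.length, errors };
}

const ops = [1, 2, 3, 4];
const failOnTwo = (n) => { if (n === 2) throw new Error("boom"); };
const orderedResult = runBulk(ops, failOnTwo, { ordered: true });    // stops early
const unorderedResult = runBulk(ops, failOnTwo, { ordered: false }); // runs all
```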
Error Reporting
View detailed error information:
{
"errors": [
{
"documentId": "507f1f77bcf86cd799439011",
"error": "Validation failed: email is required",
"operation": {
"$set": { "status": "active" }
}
},
{
"documentId": "507f1f77bcf86cd799439012",
"error": "Duplicate key error",
"operation": {
"$set": { "username": "existinguser" }
}
}
],
"successCount": 9987,
"errorCount": 13
}
Retry Failed Operations
Retry only failed documents:
Open the bulk operation from history.
Click View Errors to see failed documents.
Export error list for investigation.
Fix issues and click Retry Failed Documents.
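Conceptually, retrying only the failed documents amounts to re-running the operation with a filter restricted to the failed IDs. A hypothetical sketch of building that filter from the error report format shown earlier:

```javascript
// Build a retry filter targeting only documents that failed.
// Illustrative only; uses the error report shape shown above.
function buildRetryFilter(errorReport) {
  const failedIds = errorReport.errors.map((e) => e.documentId);
  return { _id: { $in: failedIds } };
}

const report = {
  errors: [
    { documentId: "507f1f77bcf86cd799439011", error: "Validation failed" },
    { documentId: "507f1f77bcf86cd799439012", error: "Duplicate key error" },
  ],
};
const filter = buildRetryFilter(report);
// { _id: { $in: ["507f1f77bcf86cd799439011", "507f1f77bcf86cd799439012"] } }
```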
Performance Optimization
Batch Size Configuration
Optimize throughput:
{
"batchSize": 1000, // Documents per batch
"maxConcurrency": 4 // Parallel batches
}
Guidelines:
- Small documents: 500-1000 per batch
- Large documents: 100-500 per batch
- Complex updates: 50-200 per batch
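The batching itself is straightforward: split the document list into fixed-size chunks, the last of which may be smaller. A sketch:

```javascript
// Split a list of documents into batches of at most batchSize.
function makeBatches(docs, batchSize) {
  const batches = [];
  for (let i = 0; i < docs.length; i += batchSize) {
    batches.push(docs.slice(i, i + batchSize));
  }
  return batches;
}

// 2,500 documents at batchSize 1000 -> batches of 1000, 1000, 500
const batches = makeBatches(Array.from({ length: 2500 }, (_, i) => i), 1000);
```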
Indexing Considerations
Improve bulk operation speed:
- Disable indexes temporarily - For massive inserts
- Use covered queries - Ensure filters use indexes
- Avoid large arrays - Array operations are slower
- Use projection - Only fetch needed fields
Parallel Processing
Execute operations in parallel:
{
"parallel": true,
"maxWorkers": 8,
"batchSize": 500
}
For operations on very large collections (millions of documents), consider using MongoDB's native bulk write operations via the API for better performance.
Safety Features
Dry Run Mode
Test operations without making changes:
Configure your bulk operation.
Enable Dry Run mode.
Click Execute.
Review what would have been changed (no actual modifications).
Disable dry run and execute for real.
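Conceptually, a dry run evaluates the filter and reports what would change without persisting anything. A toy in-memory illustration (not the product's implementation; the filter matching here is deliberately minimal):

```javascript
// Count which in-memory documents a filter would touch, without
// modifying them. matches() handles only simple equality filters.
function dryRunUpdate(docs, filter) {
  const matches = (doc) =>
    Object.entries(filter).every(([k, v]) => doc[k] === v);
  const affected = docs.filter(matches);
  return { wouldModify: affected.length, sample: affected.slice(0, 10) };
}

const docs = [
  { _id: 1, status: "pending" },
  { _id: 2, status: "active" },
  { _id: 3, status: "pending" },
];
const result = dryRunUpdate(docs, { status: "pending" });
// result.wouldModify === 2; the documents themselves are untouched
```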
Backup Before Operation
Automatic backup option:
{
"operation": "update",
"backup": {
"enabled": true,
"location": "s3://backups/bulk-ops/2024-02-24/",
"format": "json"
}
}
Operation Audit Trail
All bulk operations are logged:
{
"timestamp": "2024-02-24T10:00:00Z",
"user": "admin@company.com",
"operation": "bulk_update",
"collection": "users",
"filter": { "status": "pending" },
"update": { "$set": { "status": "processed" } },
"documentsAffected": 9987,
"duration": "5m 23s"
}
Common Bulk Operations
Data Migration
Migrate to new schema:
// Add new field based on existing data
[
{
"$set": {
"fullAddress": {
"$concat": [
"$address.street", ", ",
"$address.city", ", ",
"$address.state", " ",
"$address.zip"
]
}
}
},
{
"$unset": "address"
}
]
Price Updates
Bulk price changes:
// 10% price increase on specific category
{
"filter": {
"category": "electronics",
"price": { "$exists": true }
},
"update": {
"$mul": { "price": 1.10 },
"$set": { "lastPriceUpdate": new Date() }
}
}
Status Updates
Batch status changes:
// Archive old records
{
"filter": {
"status": "pending",
"createdAt": { "$lt": new Date("2023-01-01") }
},
"update": {
"$set": {
"status": "archived",
"archivedAt": new Date(),
"archivedReason": "auto-archive-old-records"
}
}
}
Cleanup Operations
Remove obsolete data:
// Remove temporary fields
{
"filter": {},
"update": {
"$unset": {
"tempCache": "",
"processingData": "",
"__v": ""
}
}
}
API Access
Perform bulk operations via API:
# Bulk update
curl -X POST https://api.mongodash.com/v1/collections/users/bulk-update \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"filter": {"status": "pending"},
"update": {"$set": {"status": "processed"}},
"options": {
"ordered": false,
"batchSize": 1000
}
}'
# Bulk delete
curl -X POST https://api.mongodash.com/v1/collections/users/bulk-delete \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"filter": {"isDeleted": true, "deletedAt": {"$lt": "2023-01-01"}},
"options": {
"dryRun": false
}
}'
Best Practices
Planning
- Test on subset - Validate with small sample first
- Schedule off-peak - Run during low-traffic periods
- Export backup - Always backup data first
- Document changes - Record what was changed and why
- Monitor resources - Watch database CPU and memory
Execution
- Use dry run - Test before executing
- Start small - Begin with small batches
- Monitor progress - Watch for errors and slowdowns
- Have rollback plan - Know how to undo changes
- Communicate - Inform team of bulk operations
After Execution
- Verify results - Sample check updated documents
- Check metrics - Ensure expected number of changes
- Review errors - Investigate and fix failures
- Update documentation - Record schema changes
- Clean up - Remove temporary fields or backups
What's Next?
- Custom Views - Save bulk operation filters as views
- Data Sync - Automate data synchronization
- Import/Export - Learn more about data import/export