Backup and Restore
Learn how to back up and restore MongoDB databases using mongodump, mongorestore, mongoexport, and mongoimport. Covers full backups, selective backups, scheduled cron jobs, and backup verification strategies for production use.
Data loss can destroy a business. Regular backups are non-negotiable for any production MongoDB deployment. This episode covers the core tool-based backup and restore workflows, from simple dump-and-restore to automated scheduled backups.
Backup Tools Overview
- mongodump — Creates BSON backup files (fast, compact, preserves types)
- mongorestore — Restores from mongodump output
- mongoexport — Exports to JSON or CSV (human-readable, for single collections)
- mongoimport — Imports from JSON or CSV files
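Since MongoDB 4.4 these four utilities ship separately from the server as the MongoDB Database Tools package, so it is worth confirming they are on the PATH before scripting anything. A minimal check (check_tool is a hypothetical helper, not part of any MongoDB package):

```shell
# Report whether each backup tool is installed.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING - install the mongodb-database-tools package"
  fi
}

for tool in mongodump mongorestore mongoexport mongoimport; do
  check_tool "$tool"
done
```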
mongodump — Full Database Backup
# Backup ALL databases
mongodump --out /backup/full-backup-$(date +%Y%m%d)
# Backup a specific database
mongodump --db blogApp --out /backup/blogApp-$(date +%Y%m%d)
# Backup a specific collection
mongodump --db blogApp --collection posts --out /backup/posts-$(date +%Y%m%d)
# Backup with authentication
mongodump --host localhost --port 27017 \
--username admin --password secret \
--authenticationDatabase admin \
--db blogApp --out /backup/
# Backup a remote database
mongodump --uri "mongodb+srv://user:pass@cluster.mongodb.net/mydb" \
--out /backup/remote-backup
# Compressed backup (much smaller file size)
mongodump --db blogApp --gzip --out /backup/compressed
# Backup with a query filter (only published posts)
mongodump --db blogApp --collection posts \
--query '{"published": true}' \
--out /backup/published-posts
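One caveat with --query: recent Database Tools releases (the 100.x series) require the filter to be valid Extended JSON, which makes operators like $date awkward to quote inline. mongodump's --queryFile option reads the filter from a file instead; the date cutoff and file path below are illustrative:

```shell
# Extended JSON filter: posts created on or after 2024-01-01 (illustrative cutoff).
cat > /tmp/posts-filter.json <<'EOF'
{"createdAt": {"$gte": {"$date": "2024-01-01T00:00:00Z"}}}
EOF

# Then point mongodump at the file (run against your own deployment):
# mongodump --db blogApp --collection posts \
#   --queryFile /tmp/posts-filter.json \
#   --out /backup/recent-posts
```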
mongorestore — Restore from Backup
# Restore all databases from a backup directory
mongorestore /backup/full-backup-20240115
# Restore a specific database
mongorestore --db blogApp /backup/blogApp-20240115/blogApp
# Restore a specific collection
mongorestore --db blogApp --collection posts \
/backup/blogApp-20240115/blogApp/posts.bson
# Restore with --drop (drops existing data before restoring)
mongorestore --drop /backup/full-backup-20240115
# Restore compressed backup
mongorestore --gzip /backup/compressed
# Restore to a different database name
mongorestore --db blogApp_staging \
/backup/blogApp-20240115/blogApp
# Restore with authentication
mongorestore --host localhost --port 27017 \
--username admin --password secret \
--authenticationDatabase admin \
/backup/full-backup-20240115
mongoexport — Export to JSON/CSV
# Export a collection to JSON
mongoexport --db blogApp --collection users \
--out /exports/users.json
# Export to CSV
mongoexport --db blogApp --collection users \
--type=csv --fields=name,email,age \
--out /exports/users.csv
# Export with a query
mongoexport --db blogApp --collection posts \
--query '{"published": true}' \
--out /exports/published-posts.json
# Export as a JSON array (instead of one document per line)
mongoexport --db blogApp --collection users \
--jsonArray --out /exports/users-array.json
# Pretty-print JSON output
mongoexport --db blogApp --collection users \
--pretty --out /exports/users-pretty.json
mongoimport — Import from JSON/CSV
# Import from JSON
mongoimport --db blogApp --collection users \
--file /data/users.json
# Import from CSV (with header row)
mongoimport --db blogApp --collection users \
--type=csv --headerline \
--file /data/users.csv
# Import from CSV (specify fields manually)
mongoimport --db blogApp --collection users \
--type=csv --fields=name,email,age \
--file /data/users-no-header.csv
# Import JSON array
mongoimport --db blogApp --collection users \
--jsonArray --file /data/users-array.json
# Upsert mode — update existing, insert new
mongoimport --db blogApp --collection users \
--mode=upsert --upsertFields=email \
--file /data/updated-users.json
# Drop existing collection before import
mongoimport --db blogApp --collection users \
--drop --file /data/fresh-users.json
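The difference between the default format and --jsonArray trips people up: mongoimport expects newline-delimited JSON (one complete document per line) unless --jsonArray is given, in which case the whole file must be a single JSON array. The sample users below are made up for illustration:

```shell
# Newline-delimited JSON: one complete document per line (mongoimport's default).
cat > /tmp/users.json <<'EOF'
{"name": "Ada", "email": "ada@example.com"}
{"name": "Alan", "email": "alan@example.com"}
EOF

# JSON array: the whole file is one array; requires --jsonArray on import.
cat > /tmp/users-array.json <<'EOF'
[{"name": "Ada", "email": "ada@example.com"},
 {"name": "Alan", "email": "alan@example.com"}]
EOF

# mongoimport --db blogApp --collection users --file /tmp/users.json
# mongoimport --db blogApp --collection users --jsonArray --file /tmp/users-array.json
```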
Automated Backup with Cron
Create a backup script and schedule it with cron:
#!/bin/bash
# /scripts/mongodb-backup.sh
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backup/mongodb/$TIMESTAMP"
RETENTION_DAYS=7
# Create backup
mongodump --gzip --out "$BACKUP_DIR"
# Check if backup was successful
if [ $? -eq 0 ]; then
echo "Backup successful: $BACKUP_DIR"
else
echo "Backup FAILED!" | mail -s "MongoDB Backup Failed" admin@company.com
exit 1
fi
# Delete backups older than retention period
find /backup/mongodb -mindepth 1 -maxdepth 1 -type d -mtime +"$RETENTION_DAYS" -exec rm -rf {} +
echo "Cleanup complete. Removed backups older than $RETENTION_DAYS days."
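The find-based cleanup is worth sanity-checking locally before pointing it at real backups. A throwaway test using back-dated directory mtimes (GNU touch -d; directory names invented):

```shell
# Simulate the retention sweep in a temp directory.
BASE=$(mktemp -d)
mkdir "$BASE/old-backup" "$BASE/fresh-backup"
touch -d "10 days ago" "$BASE/old-backup"   # back-date one directory (GNU touch)

# -mindepth 1 keeps the parent directory itself out of the match;
# -mtime +7 matches entries last modified more than 7 days ago.
find "$BASE" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +

ls "$BASE"   # only fresh-backup should remain
```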
# Schedule with cron — backup daily at 2:00 AM
crontab -e
# Add this line:
0 2 * * * /scripts/mongodb-backup.sh >> /var/log/mongodb-backup.log 2>&1
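If a dump ever runs longer than a day, two cron invocations could overlap and compete for disk and I/O. Wrapping the job in flock (from util-linux) makes an overlapping run skip instead of stack; the lock-file path is arbitrary:

```shell
# Use this crontab entry instead of the plain one above; -n makes a
# second invocation exit immediately if the lock is already held.
0 2 * * * flock -n /var/lock/mongodb-backup.lock /scripts/mongodb-backup.sh >> /var/log/mongodb-backup.log 2>&1
```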
Backup Strategies
Development
- Manual backups before major changes
- mongodump to a local directory
Production
- Daily automated backups with cron
- Compressed backups to save disk space
- Keep 7-30 days of retention
- Store backups off-site (S3, Google Cloud Storage)
- Test restores regularly
Upload to S3 After Backup
# Add to your backup script:
aws s3 sync "$BACKUP_DIR" "s3://my-backups/mongodb/$TIMESTAMP" --storage-class STANDARD_IA
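Network uploads fail more often than local dumps, so the sync step deserves a retry. A small wrapper sketch (upload_with_retry is a hypothetical helper, not an aws CLI feature):

```shell
# Retry a command up to N times with a short pause between attempts.
upload_with_retry() {
  local max=$1; shift
  local attempt
  for ((attempt = 1; attempt <= max; attempt++)); do
    if "$@"; then
      return 0
    fi
    echo "Attempt $attempt of $max failed: $*" >&2
    sleep 1
  done
  return 1
}

# In the backup script (bucket name is illustrative):
# upload_with_retry 3 aws s3 sync "$BACKUP_DIR" "s3://my-backups/mongodb/$TIMESTAMP"
```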
Verifying Backups
# Always test your backups by restoring to a test environment
mongorestore --db blogApp_test --drop /backup/latest/blogApp
# Then verify the data
mongosh blogApp_test --eval "db.posts.countDocuments()"
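The count check can be scripted so a mismatch fails loudly. A minimal sketch assuming a local mongosh and comparing one collection between the source and restored databases (count_docs and verify_restore are made-up helper names):

```shell
# Count documents in the posts collection of a given database.
count_docs() {
  mongosh --quiet "$1" --eval "db.posts.countDocuments()"
}

# Compare source vs. restored counts; non-zero exit status on mismatch.
verify_restore() {
  local src dst
  src=$(count_docs "$1") || return 1
  dst=$(count_docs "$2") || return 1
  if [ "$src" -eq "$dst" ]; then
    echo "OK: $src documents in both"
  else
    echo "MISMATCH: source=$src restored=$dst" >&2
    return 1
  fi
}

# verify_restore blogApp blogApp_test
```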
What's Next
Your data is now protected with proper backup strategies. In the next episode, we'll integrate MongoDB with Node.js — learning how to connect, perform CRUD operations, and build a complete API using the MongoDB Node.js driver and Mongoose ODM.