
Bash by Example: Daily Backup with Timestamps

Bash 5.0+

Automating daily archive creation with precise timestamps for version control and backup management: scheduling runs through cron, generating unique timestamp-based filenames with the date command, enforcing an archive retention policy, and producing consistent backup snapshots for disaster recovery.

Code

#!/bin/bash

# Configuration
SOURCE="/var/www/html"
DEST="/backups/daily"
TIMESTAMP=$(date +%Y-%m-%d_%H%M%S)
ARCHIVE_NAME="backup_$TIMESTAMP.tar.gz"
LOG_FILE="/var/log/daily_backup.log"

# Ensure destination exists
mkdir -p "$DEST"

log() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

log "Starting backup for $SOURCE..."

# Check if source exists
if [ ! -d "$SOURCE" ]; then
    log "CRITICAL: Source directory $SOURCE does not exist!"
    exit 1
fi

# Create the archive
# -c: create, -z: gzip, -p: permissions, -f: file; -C strips the leading path
if tar -czpf "$DEST/$ARCHIVE_NAME" -C "$(dirname "$SOURCE")" "$(basename "$SOURCE")" 2>> "$LOG_FILE"; then
    log "Backup successful: $DEST/$ARCHIVE_NAME"
    
    # Calculate size
    SIZE=$(du -h "$DEST/$ARCHIVE_NAME" | cut -f1)
    log "Archive size: $SIZE"
else
    log "ERROR: Tar command failed."
    exit 1
fi

# Optional: Delete backups older than 7 days
find "$DEST" -name "backup_*.tar.gz" -mtime +7 -delete
log "Cleanup complete. Old backups removed."

Explanation

Data loss is an inevitability in computing, whether due to hardware failure, human error, or malicious attacks. A robust backup strategy is the first line of defense. This script demonstrates a standard "snapshot" approach where the entire state of a directory is captured into a single compressed archive file. By embedding the timestamp directly into the filename (e.g., backup_2023-10-27_140000.tar.gz), we create a natural, chronological history of the data that is easy to sort and manage.
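Because every field in the %Y-%m-%d_%H%M%S format is zero-padded, lexicographic order matches chronological order, so plain sort is enough to order the archives. A quick sketch (the filenames below are made-up examples):

```shell
# Zero-padded timestamps sort as strings in the same order as in time,
# so `sort -r` lists archives newest-first with no date parsing needed.
newest=$(printf '%s\n' \
    backup_2023-10-25_030000.tar.gz \
    backup_2023-10-27_030000.tar.gz \
    backup_2023-10-26_030000.tar.gz | sort -r | head -n 1)
echo "$newest"    # backup_2023-10-27_030000.tar.gz
```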

The script passes the -p ("preserve permissions") flag to tar. File modes are always recorded in the archive at creation time; where -p really matters is on extraction (tar -xpf), where it tells tar to apply the recorded modes exactly instead of filtering them through your umask. Run the restore as root and tar will also restore the original ownership. Without this, restoring a web server or a database directory could result in a broken system where the service cannot access its own files.
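The restore side can be sketched as a round trip in a scratch directory (all paths below are temporary examples, not the script's real configuration): archive a file with a restrictive mode, extract with -p, and check that the mode survives.

```shell
# Round-trip demo: create a 600-mode file, archive it, extract with -p,
# and verify the recorded mode is applied on the way back out.
set -e
work=$(mktemp -d)
mkdir -p "$work/src" "$work/out"
echo "secret" > "$work/src/config"
chmod 600 "$work/src/config"
tar -czpf "$work/a.tar.gz" -C "$work" src
tar -xpzf "$work/a.tar.gz" -C "$work/out"
mode=$(stat -c %a "$work/out/src/config")   # GNU stat; prints the octal mode
echo "$mode"    # 600
rm -rf "$work"
```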

Logging is another critical component of automation. Since this script is likely to run as a cron job (a scheduled background task), you won't see its output on a screen. By defining a custom log function that writes to both the console (via tee) and a persistent log file, you create an audit trail that helps diagnose failures days or weeks after they happen.
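A hypothetical crontab entry for this script might look like the following (the installed path /usr/local/bin/daily_backup.sh is an assumption; add it with crontab -e):

```shell
# Run the backup every day at 02:30. Cron provides no terminal,
# so the script's log file is the only record of what happened.
30 2 * * * /usr/local/bin/daily_backup.sh
```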

Code Breakdown

13
The log function uses tee -a (append). This splits the output stream, sending it to the file and standard output simultaneously, which is useful for debugging while running manually.
27
tar -czpf. The z flag tells tar to filter the archive through gzip. This trades some CPU time for significantly reduced disk usage, which is usually a good trade-off for backups.
39
Automated cleanup prevents the disk from filling up. find ... -mtime +7 locates files modified more than 7 days ago, ensuring we keep a rolling window of backups.
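The retention rule can be exercised safely in a scratch directory before trusting it with real backups (example paths only; GNU touch -d is assumed for back-dating):

```shell
# Create one fresh and one 10-day-old dummy archive, prune anything
# older than 7 days, and confirm only the fresh archive remains.
work=$(mktemp -d)
touch "$work/backup_new.tar.gz"
touch -d "10 days ago" "$work/backup_old.tar.gz"   # GNU touch: back-date mtime
find "$work" -name "backup_*.tar.gz" -mtime +7 -delete
remaining=$(ls "$work")
echo "$remaining"    # backup_new.tar.gz
rm -rf "$work"
```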