How to have a rock-solid Linux backup without a pro budget

UPDATE: Since posting this article I have developed a new version of this script for my backup needs. It uses incremental backups and improves on CPU load, bandwidth usage and backup time.
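For a flavour of what "incremental" buys you, here is a minimal sketch of the general idea (not the new script itself; the host and paths are the same placeholders used below). rsync's --link-dest option hard-links files that have not changed since the previous snapshot, so each run transfers and stores only the differences:

#!/usr/bin/env bash
# Sketch of an incremental snapshot using rsync --link-dest.
# Unchanged files are hard-linked to the previous snapshot on the
# receiver, so only the differences travel over the wire.
# Databases would still be dumped separately, as in the script below.

BACKUP_HOST=hostname.dyndns.org                # placeholder, as below
DEST=/home/vlatko/abraham_backup/snapshots     # hypothetical layout
TODAY=$(date "+%F")

rsync -az --delete \
      --link-dest="$DEST/latest" \
      /root /etc /usr/virtualweb \
      "vlatko@$BACKUP_HOST:$DEST/$TODAY/"

# Repoint "latest" at the fresh snapshot for the next run
ssh "vlatko@$BACKUP_HOST" "ln -sfn $DEST/$TODAY $DEST/latest"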

Starting a web development company without any budget is a slippery slope. Losing all your data can cost you months of hard work and development, and the cash flow they have generated. Needless to say, with no outside help at all, I could not allow that to happen. So, being in my shoes, I had to come up with a backup solution that works seamlessly with the resources I had: one dedicated Debian box in Frankfurt, my home desktop with abundant disk space, and my flat-rate ADSL line.


It turns out the solution is not hard at all and can be achieved using SSH authentication keys and some simple pipe trickery. All I have to do is leave my home desktop running overnight, and in the morning there is a fresh backup on my disk. Voila!
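Setting up the key authentication is a one-time job. Assuming the standard OpenSSH tools, it boils down to something like this (run as root on the server, since cron will run the script as root; the host name is the same placeholder used in the script):

# Generate a key pair without a passphrase so cron can log in
# non-interactively:
ssh-keygen -t rsa -N "" -f /root/.ssh/id_rsa

# Install the public key on the home desktop:
ssh-copy-id vlatko@hostname.dyndns.org

# Verify that login now works without a password prompt:
ssh vlatko@hostname.dyndns.org true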

The backup is triggered by the following crontab entry (note the user field: this format belongs in /etc/crontab or a file under /etc/cron.d, not a per-user crontab):

# Backup
27 4    * * *   root    /root/bin/backup.sh > /dev/null

The backup.sh script archives the bare necessities needed to restore the system to full operation:

#!/usr/bin/env bash


LOG_FILE="/var/log/evorion-backup.log"
BACKUP_HOST=hostname.dyndns.org

echo "" >> $LOG_FILE
echo "Evorion backup utility started at: `date "+%F %T"`" >> $LOG_FILE

# Must be run as root user

if [ "$UID" -ne "0" ]
then
        echo "[`date "+%F %T"`] Error: You must run this script as root!" >> $LOG_FILE
        exit 67
fi
echo "[`date "+%F %T"`] User id check succesful" >> $LOG_FILE

# Compress directly into ssh connection
echo "[`date "+%F %T"`] Dumping and archiving started" >> $LOG_FILE
# Record the installed package list; /root is included in the archive
# below, so the list travels with it
nice -n 19 dpkg -l > /root/installed_packages.txt
# Dump all databases, compress, and stream straight to the desktop over SSH
nice -n 19 mysqldump -u root -pYOURMYSQLPASS --lock-all-tables --all-databases | gzip | ssh -q vlatko@$BACKUP_HOST 'cat > /home/vlatko/abraham_backup/databases_`date "+%F_%T"`.gz'
# Archive the valuable directories and stream them the same way
nice -n 19 tar cz -C / root home/vlatko etc usr/virtualweb | ssh -q vlatko@$BACKUP_HOST 'cat > /home/vlatko/abraham_backup/archive_`date "+%F_%T"`.tgz'
echo "[`date "+%F %T"`] Dumping and archiving completed" >> $LOG_FILE

# Cleanup
rm /root/installed_packages.txt
echo "[`date "+%F %T"`] Finished" >> $LOG_FILE

After verifying that it is running as root, the script stores a list of installed packages; since my system is a clean apt install, that is enough to rebuild it. Then mysqldump is started and piped directly into gzip, which is piped directly into an SSH connection to my home desktop. Since my SSH connections are authenticated with keys, no interactive login is needed and this line works like a charm. I use the same trick to tarball the folders containing the valuable data. Finally, the cleanup removes the temporary file and we are done.
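For completeness, restoring is the same trick in reverse. A sketch, assuming the desktop is reachable; the timestamps in the file names below are made up, use whatever your run produced:

# Pull the archive back from the desktop and unpack it at /
ssh vlatko@hostname.dyndns.org \
    'cat /home/vlatko/abraham_backup/archive_2010-11-01_04:27:31.tgz' \
    | tar xz -C /

# Feed the matching database dump back into MySQL
ssh vlatko@hostname.dyndns.org \
    'cat /home/vlatko/abraham_backup/databases_2010-11-01_04:27:05.gz' \
    | gunzip | mysql -u root -pYOURMYSQLPASS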

Please leave your comments and suggestions on the script so that it can be improved.

Your thoughts

Yvan, 31-10-10 09:01
You should use the unison tool. It works over SSH, and you don't have to copy your data every time, just the differences (it's like rsync, but more powerful).

Of course you still need to dump your MySQL database and your dpkg list, but that's all. You don't have to send the database dump yourself; unison will do it (as you sync the root directory).

Give it a try, it's just the perfect tool for this.
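For anyone who wants to try Yvan's suggestion, a minimal non-interactive invocation could look something like this (paths reused from the script above; unison must be installed on both machines, ideally in matching versions):

# Sync the web data to the desktop without prompting, cron-friendly
unison -batch /usr/virtualweb \
    ssh://vlatko@hostname.dyndns.org//home/vlatko/abraham_backup/virtualweb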
