Bit of a backup theme for this week, but remember in my last post I did say I was obsessed with backups! The last post addressed our server-to-server backups, how they work and a nice example of how they can save our collective asses in the event of a 100% data loss meltdown.
But what about your own personal backups? We can of course use our system to restore individual files and databases for you – no problem there – but everyone should also have personal backups. Files are easy. If you’re like me you have your sites in folders on your PC/laptop, you edit files there and upload via FTP, so by default you have a copy of your files. But what about databases? A database lives on the server, so special considerations have to be made. You can do backups in phpMyAdmin and download them. But who wants to do that weekly, let alone daily? And then download the backups to your computer? I’ve done it all manually and it just sucks. You will definitely forget, and that daily backup becomes weekly, then monthly, then quarterly….
Wouldn’t it be awesome if all your databases were backed up daily and downloaded to your computer daily without ever having to lift a finger? :D
So let me share with you what I do. Small disclaimer though: this only works on Linux hosting, for obvious reasons. I’ll talk with our Windows admins to see if there’s a comparable solution for Windows clients.
Disclaimer #2: please don’t try this on a shared server with a large database! If your SQL file is up to a couple of hundred megs you should be okay. Over 300MB, and with 100 people doing the same thing, the server might not be so happy. :)
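If you’re not sure how big your databases are, you can ask MySQL itself. This is a standard query against information_schema (available in MySQL 5.0 and later), run here through the command-line client with placeholder credentials – substitute your own:

/usr/bin/mysql --user=[USER] --password=[PASS] -e "SELECT table_schema, ROUND(SUM(data_length + index_length)/1024/1024, 1) AS size_mb FROM information_schema.tables GROUP BY table_schema;"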
So the first thing to do is set up a task in your domain’s Crontab. Choose a time in Minute and Hour; Day, Month and Day Of The Week will be *, meaning every day. The command to back up a database via cron is:
/usr/bin/mysqldump --opt --host=localhost --user=[USER] --password=[PASS] [DB NAME] > /home/httpd/vhosts/domain.com/httpdocs/backup/whatever.sql
Just substitute your database name, user and pass, and the path where you want the SQL file to be written. If it’s a backup folder, obviously make sure you create that folder first. For instance, a schedule of 30 3 * * * means the job runs at 3.30am every day.
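Put together (the credentials, database name and path below are just placeholders – use your own), the crontab entry would look something like this:

30 3 * * * /usr/bin/mysqldump --opt --host=localhost --user=[USER] --password=[PASS] [DB NAME] > /home/httpd/vhosts/domain.com/httpdocs/backup/whatever.sql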
If you want to back up more than one database, stagger them a few minutes apart – say, one running at 3.30am and another at 3.40am. You’ll see exactly this in the full crontab example further down.
You can back up databases from any domain to any domain, so you could back up databases from 5 domains to one backup folder on a domain of your choice.
REMEMBER!! You are backing up your SQL files to a location that can be read over the web. I advise you choose a random name for your backup folder and random names for your SQL files.
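For example, rather than calling the folder backup, create something only you would guess (bk_x7f3q9 here is just a made-up name – invent your own):

mkdir /home/httpd/vhosts/domain.com/httpdocs/bk_x7f3q9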
After you’ve added all the database cronjobs you can then add a cron that tars (or zips) the whole backup directory. The following command makes a .tgz file (a gzip-compressed tar file) in httpdocs and is tarring up a directory called “backup” – but do use a more random name as mentioned above. Run this after all databases have been backed up:
/bin/nice -n 19 /bin/tar -czvf /home/httpd/vhosts/domain.com/httpdocs/your_backup.tgz /home/httpd/vhosts/domain.com/httpdocs/backup
You can use zip (note that zip lives in /usr/bin/zip):
/bin/nice -n 19 /usr/bin/zip -r /home/httpd/vhosts/domain.com/httpdocs/your_backup.zip /home/httpd/vhosts/domain.com/httpdocs/backup
And thus your full crontab might end up looking something like this.
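Here’s a sketch of the complete set: two database dumps at 3.30am and 3.40am, then the tarball at 4am. The database names, file names and the 4am time are just examples, so adjust to taste:

30 3 * * * /usr/bin/mysqldump --opt --host=localhost --user=[USER] --password=[PASS] [DB ONE] > /home/httpd/vhosts/domain.com/httpdocs/backup/db_one.sql
40 3 * * * /usr/bin/mysqldump --opt --host=localhost --user=[USER] --password=[PASS] [DB TWO] > /home/httpd/vhosts/domain.com/httpdocs/backup/db_two.sql
0 4 * * * /bin/nice -n 19 /bin/tar -czvf /home/httpd/vhosts/domain.com/httpdocs/your_backup.tgz /home/httpd/vhosts/domain.com/httpdocs/backup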
So now you have all your MySQL databases being backed up every day and also tarred/zipped to a location of your choosing. The last thing to do is have that tarball/zipfile downloaded to your computer automatically every day.
There are various programs available but I use one called GetRight. It has a 90 day trial and costs $19.95, which is pretty cheap considering the value. It’s a full download manager with a huge array of features, but I just use it for its ability to download a URL at a certain time every day. It can also overwrite the old backup so you don’t accumulate a huge number of backup files. If you decide to try this program you’ll need to go to Tools > Daily Downloads and configure from there. I also timed mine with the 24h timer option as I had issues with the am/pm setting.
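If you’d rather skip a paid program, a scheduled command-line fetch can do the same basic job. This isn’t what I use, just a sketch of an alternative – the URL is a placeholder, and curl’s -o flag writes to the same local file on each run so old backups don’t pile up:

/usr/bin/curl -o your_backup.tgz http://domain.com/your_backup.tgz

You could run that from cron on Linux/Mac, or from Task Scheduler on Windows.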
I’ve been very happy with this program. It downloads a tarball containing our forum, ticket system and blog databases at 5am every morning. Our ticket system SQL file is over 2GB, and backing that up and tarring it does cause some load, and that’s on a nice Core 2 Quad machine with fast 10k drives all to ourselves – so definitely don’t try the above if your databases are huge!
I’m interested to hear if anyone does things differently so please do post a comment if you have other ideas. :)