This waterproof data backup strategy again involves the Cobian-Tunnelier love-in for Windows users, else the alternatives scp or rsync for Linux and Mac people.
Here's the working procedure:
- A script dumps the data into a file using mysqldump
- The server's task scheduler, cron, automates that script
- Cobian plus Tunnelier (local Windows), else scp or rsync (local Linux/Mac), securely retrieves a copy of the data file
- cron deletes the remaining remote data file for added security
OK, I'll do my best to explain what the deuce all that means now! Go grab a coffee. (Whiskey after ;))
The above procedure may look complicated but really it's no big deal. We swap some strings in a text file – such as your database credentials – save the file, and the automatic scheduler called cron kick-starts the script. We transfer the file to the backup destination with whichever tool suits: wpCop uses Cobian-Tunnelier for Windows and either scp or rsync for Linux and Macs. Then cron performs the last remote job to keep our data safe.
You can use your hosting control panel, most likely cPanel, to set up the remote tasks, else just use a terminal.
Dumping the data from a database
We extract the data into a file using mysqldump.
Using the #comments as a guide, edit this script in a text editor. The top line – #!/bin/sh – is not a comment and must remain:
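Something along these lines will do the job – a minimal sketch in which the credentials and filenames are placeholders to swap for your own, with gzip standing in for zip here simply because it's preinstalled almost everywhere:

```shell
#!/bin/sh
# db_backup.sh - dump the database, then compress the dump.
# Everything in CAPS is a placeholder: swap in your own values.

DB_USER="YOUR_DB_USER"          # your MySQL username
DB_PASS="YOUR_DB_PASSWORD"      # your MySQL password
DB_NAME="YOUR_DB_NAME"          # the database to back up
DUMP_FILE="$HOME/db_backup.sql" # where the raw dump lands

# dump every table in the database to a single .sql file
mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$DUMP_FILE"

# compress the dump; -f overwrites last night's archive
gzip -f "$DUMP_FILE"            # leaves $HOME/db_backup.sql.gz
```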
This method is basic. There's a load of other stuff you can do with compression methods – whether using zip, tar or whatever – such as encrypting the content and password-protecting the archive. Swapping the compression tool for your own (I like zip for this job though), you can find out all the options at the command line by typing something like:
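For example, for gzip (substitute zip --help or man zip if you're sticking with zip):

```shell
gzip --help   # lists every option, including the -1 (fast) to -9 (small) compression levels
```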
The script needs a name and has to live somewhere. Using your file manager, therefore, create a directory called, say, myCronScripts in your remote home directory, /home/USERNAME, and open a new file within called, say, db_backup.sh. Alternatively, at the console, be lazy like me and just paste this:
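For example, this creates both in one go (open the file in your editor afterwards and paste the script in):

```shell
mkdir -p ~/myCronScripts            # -p: no complaint if it already exists
touch ~/myCronScripts/db_backup.sh  # create the empty script file
```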
Add your amended script to the open file, save it, close it.
Now for the file permissions. Manually, right-click on the file and choose Change Permissions, selecting 500. Or at the terminal, here you go:
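The first line below only makes sure the file exists so the example stands alone; the chmod is the bit that matters:

```shell
mkdir -p ~/myCronScripts && touch ~/myCronScripts/db_backup.sh  # ensure the file exists
chmod 500 ~/myCronScripts/db_backup.sh  # 500 = read + execute for you, nothing for anyone else
```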
Those using a terminal can test the script with this:
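Like so – the stand-in lines only conjure a trivial script if yours isn't in place yet, so the example runs as-is; any errors print straight to the terminal:

```shell
# stand-in: create a trivial script if the real one isn't there yet
mkdir -p ~/myCronScripts
[ -s ~/myCronScripts/db_backup.sh ] || echo 'echo "db_backup.sh ran"' > ~/myCronScripts/db_backup.sh

# the actual test run
sh ~/myCronScripts/db_backup.sh
```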
Cron the script
This is where cron comes in, working like an alarm clock to trigger our backup script. You can go manual if you prefer, adding the command ~/myCronScripts/db_backup.sh to a new cPanel Cron Job entry. Else, at the terminal, edit a crontab the old-school way with crontab -e:
And within, paste something like this:
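A sketch of such an entry – adjust the timing and the script path to your own setup:

```
# min hour day month weekday  command
0 0 * * * ~/myCronScripts/db_backup.sh > /dev/null 2>&1
```

That runs the script at midnight every night; > /dev/null 2>&1 is the portable sh spelling of &> /dev/null.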
In a nutshell, that statement says when to run what command and is appended with > /dev/null 2>&1 – the portable sh equivalent of &> /dev/null – which basically means kindly don't log my database password, Cron-chap, thanks awfully, have a kiss. More or less.
Change the timing and /filepath/and/name.sh to suit you. Save and close the file. While the rest of us may just assume the thing will work, shared-hosting people tend to receive a confirmation.
Of course, if you set up the cronjob to run a few minutes from now then you'll have a definite confirmation of whether or not your newly zipped file has arrived. So there's a hint.
Grabbing the data dump: Windows locally
Set up a New Task in a utility such as Cobian, scheduling to retrieve the dump file a short time after it's been created and transfer it locally, else to some cloud or another server. Timing is key: the above dump was created at midnight so pull it at, say, 00:10 hours.
Flushing the dump
Remember: if you don't flush the dump, you could end up in the shit. (Sorry.)
Innuendo aside, to keep the minimum of sensitive data on your server, set a second cronjob to delete the now-redundant data file, triggering that soon after your backup task has completed. Your command, correlating to the filename in the mysqldump script, will look uncannily like this:
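Assuming the dump script compressed its output to ~/db_backup.sql.gz (a placeholder name; substitute whatever your script writes):

```shell
# delete the remote dump; -f keeps rm quiet if the file is already gone
rm -f ~/db_backup.sql.gz
```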
Adding that to the cronjob's timing, you should end up with something like this:
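Assuming the dump lands at ~/db_backup.sql.gz (again, match the name to your own script) and your local backup tool pulls it at 00:10, a deletion entry for 00:20 looks like this:

```
# 00:20 nightly: flush the now-redundant dump from the server
20 0 * * * rm -f ~/db_backup.sql.gz > /dev/null 2>&1
```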
That will kick in 10 minutes after the backup ran, which gives a wide margin for the average database.
Alternatively (and even better), use an option in your backup program that deletes the source files once they've been delivered to their backup destination, negating any need for a secondary server-side cronjob. I'm not sure, off-hand, what that option is called in Cobian, so if some old dear could let me know, I'll include it here, and thank you in advance.
That's it for the database backup for local Windows users but …
Grabbing the data dump: Linux/Mac users
… For those running a Mac or Linux locally …
You now need to follow WordPress Backup: Files & DB for Mac/Linux PC for the spiel on using scp and rsync. Either of those tools (I prefer rsync, you'll see why) will bring the mysqldump'ed db_backup.sql file safely local. 'Safely'? Yes, ie, not sent in plaintext, as is the case with all those half-baked backup plugins, but encrypted, as with the Windows Cobian-Tunnelier method. Sorry, I'm likely repeating myself but it's important that you grasp this.
As with the Windoze technique, once your database has had a chance to travel remote-to-local using scp or rsync, make sure you delete the now-redundant remote backup which otherwise sits on your server, again in plaintext and including your passwords and website users' data. As we covered in the above section, "Flushing the dump", we could do this with a secondary cronjob but, better still, have the backup program do it instead. Using rsync, for example, we simply add the --remove-source-files switch to delete the source files after they're safely backed up on the destination machine.
… Actually, we'll take a closer look at that right now, in WordPress Backup: Files & DB for Mac/Linux PC.