Apart from the common ground covered in WordPress Backup: Database to a PC or Mac, Mac and Tux types may have been feeling a little left out, what with all the attention given to backing up files on Windows.
Let's put that right using the OpenSSH toolkit – which we covered in Lock Down WP Connections – for secure, encrypted, tunnelled transfer. The copy itself is handled by your choice of either the copy utility scp or the file synchronizer rsync.
Full server backup to local Mac/Linux
The simplest way to back up web files is with scp, or Secure Copy, which encrypts the data flow using SSH. From a terminal on your local machine, run a command like this:
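Reconstructing from the breakdown below (the username, address, port and paths being this article's placeholders – swap in your own):

```shell
# pull the remote WordPress folder down to ~/backup on this machine
scp -rpP 54321 USER@220.127.116.110:/path/to/WordPress ~/backup
```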
- scp – the program to use, Secure Copy
- -rpP 54321 – there are three optional directives here:
r – recursive copies not only the specified folder, but all sub-folders
p – preserve metadata such as permissions
P – specifies your server's SSH port; only needed if you don't log in on the default 22. In this case, the port is 54321
- USER@220.127.116.110 – your remote username at the IP address (or hostname)
- :/path/to/WordPress – after the colon, this is the source folder to copy
- ~/backup – is your local destination for the copy, in your home directory (~)
Full WordPress backup remote to remote
Similar syntax. After the options we specify where from, then where to:
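Something like this, with both hosts shown as placeholders:

```shell
# copy the WordPress folder straight from one server to another
scp -rpP 54321 USER@SOURCE_IP:/path/to/WordPress USER@DEST_IP:/path/to/backup
```

One thing worth knowing: by default, OpenSSH's scp has the source host connect straight to the destination; pass the -3 flag if you'd rather route the transfer through your local machine.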
Incremental backups to local Linux or Mac
The occasional full backup is all well and good, but what most of us want is to complement that with an ongoing incremental backup. For that we use rsync which, in any case, gives us a full backup the first time it is run. The principle is the same as with scp but the syntax is a tad more complex:
To simplify, we can break that down into three parts. Firstly, there are some useful rsync options (aka command “switches”). Secondly, because we want to connect with SSH encryption (and, unlike scp, rsync is not an intrinsic OpenSSH component) we have to call SSH and point to our local, private authentication key as well as specifying any non-default port. Thirdly, we say where the wanted files live and to where locally they need to go. Here's the first lot:
- rsync – the program we want to use
- -a – archive says to copy the entire folder tree recursively while preserving metadata such as permissions
- -u – update skips matching files while updating the rest
- -q – quiet mode suppresses non-error messages (errors are still reported)
Now to connect SSH:
- -e – specifies that we want to use a remote shell, in this case SSH
- “ssh -i /home/USER/.ssh/id_rsa -p 54321” – says to use SSH, links the private key and specifies the remote port (if we're not using the default 22)
Finally, this is just like the scp syntax:
- USER@18.104.22.1680 – the server username and IP
- :/path/to/WordPress/ – the source directory
- /home/USER/backup – the destination
Phew! Bit of a head-wrap but, put into motion, thrilling. Full steam ahead.
Incremental backup remote-to-remote
Unlike scp, rsync cannot copy between two remote hosts from a third machine; it has to run on one end of the transfer. Therefore, if you want to set up a remote-to-remote backup, either the destination pulls the copy or the source pushes it. For the latter, use the same syntax but swap the source and destination around, like so. (If that doesn't quite make sense, just try it!):
Automating backup with cron
So what we have above, essentially, is a choice of commands to be run from a terminal, copying whichever files we specify.
That's all well and good but what we need is automation, a dependable set-it-forget-it solution. For that we need cron, the task scheduler we've used already to back up our database, remotely, in WordPress Backup: Database to a PC or Mac.
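As a sketch, a crontab entry that runs the rsync pull from earlier every night at, say, 3.30am would look something like this (the schedule, key, host and paths are all illustrative):

```shell
# m h dom mon dow  command
30 3 * * * rsync -auq -e "ssh -i /home/USER/.ssh/id_rsa -p 54321" USER@SERVER_IP:/path/to/WordPress/ /home/USER/backup
```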
… There is a snag here. (One that we'll overcome, all the same.)
The thing is, cron doesn't really like OpenSSH. Cron runs unattended, so nobody is there to type in a key's passphrase; the only way to get these programs to work together is by using passphrase-less authentication keys and, from a security perspective, that is really crap.
Bear with me, I'll write up another tutorial on precisely how to get around this but, basically, here's what you need:
- an unprivileged user account purely for backup, both on source & destination machines
- passphrase-less_auth_key (the ‘private' key) on the destination box
- passphrase-less_auth_key.pub (the ‘public' key) on the source box
- a cronjob locally, calling rsync which in turn calls your private SSH key (as we've seen above)
- a cute piece of code prepended to the public key in the authorized_keys file, which looks like this:
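I can't give you your exact line here – the command= string has to match what rsync actually invokes on the far side, which varies with your options and rsync version – but the general shape of the entry, tacked onto the front of the public key in authorized_keys, is something like this (the IP, path and key are illustrative):

```shell
from="203.0.113.10",command="rsync --server --sender . /path/to/WordPress/",no-pty,no-agent-forwarding,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA...YOUR_PUBLIC_KEY... backup@destination
```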
What that does is lock down login to the specific backup user, to that single rsync command and to a single IP. The IP directive may be omitted by those with a dynamic (ever-changing) address – these days a free dynamic DNS service can give you a constant hostname instead – but bear in mind that IP addresses can also be spoofed!
… Then again, the bottom line with this solution is that a hacker would need your private key and, even assuming your private key was somehow misappropriated, the attacker could only ever run that single command. The backup user, locally and remotely, has limited rights, and any other attempt to connect to your server using this passphrase-less key would be met with a failure message. Even if the attacker did use the correct command during the few minutes per day that the backup awaits our download, provided our overall solution is tight, he or she could download nothing better than an encrypted, passworded, compressed file.
So while having any password-less key to a production server is, in principle, a bad idea, this workaround is strong, nonetheless.
This is a fiddly setup though. Like I say, I'll set out the process, command by command, in a separate tutorial so feel free to nag me about that.