The following are the scripts I use to back up various servers that support rsync/ssh to a central server, automatically, keeping a couple of weeks of “change log”, with minimal fuss. There are many other backup programs out there that are much more sophisticated, but they’re also more work, and I’m not interested in archiving my backups for years and years. I just don’t have that type of data, nor a tape drive :)
The first thing to do is to make sure you can ssh from your backup host to the server you want to backup without being prompted for a password. Google for how to do this securely.
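The usual shape of that setup is a dedicated key plus ssh-copy-id; the key path and host names below are just placeholders, and your security requirements may call for something stricter:

```shell
# Create a dedicated, passphrase-less key for the backup job (the path and
# comment are examples -- adjust to taste).
mkdir -p ~/.ssh && chmod 700 ~/.ssh
ssh-keygen -t ed25519 -N "" -f ~/.ssh/backup_ed25519 -C "backup job"

# Install the public key on each server you want to back up, then confirm
# you can log in without a password prompt (run by hand, once per host):
#   ssh-copy-id -i ~/.ssh/backup_ed25519.pub user@server.example.com
#   ssh -i ~/.ssh/backup_ed25519 user@server.example.com true
```

Since the private key has no passphrase (so cron can use it), it’s worth restricting what it can do on the server side, e.g. with a `command=` entry in authorized_keys.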
Then download this tgz file and open it up, look around, and change all the various pathnames to whatever works for you.
There are a couple of bits, such as “dvdDir” and “dbDir” in shared.inc, that aren’t relevant to this backup setup, but I’m too lazy to remove them just for this post.
In normal usage you’d run the script by typing “sh servers.sh”. The script will then look at each directory in the ‘servers’ directory and rsync the content specified in servers/hostname/files.include, while excluding the content specified in servers/hostname/files.exclude. This is useful to, say, back up ‘/usr/local/www’ while skipping ‘/usr/local/www/massive-log-file-directory’.
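I won’t swear these are the exact flags my script passes, but the general shape of that rsync call is below, demonstrated against a throwaway local tree so you can run it anywhere; against a real server the source would be user@hostname:/ instead:

```shell
tmp=$(mktemp -d)

# Fake "server" filesystem: a directory we want plus a subdirectory we don't.
mkdir -p "$tmp/root/usr/local/www/massive-log-file-directory"
echo "index" > "$tmp/root/usr/local/www/index.html"
echo "noise" > "$tmp/root/usr/local/www/massive-log-file-directory/big.log"

# files.include: paths to copy, relative to the source root.
# files.exclude: paths to skip inside them.
printf 'usr/local/www\n' > "$tmp/files.include"
printf 'usr/local/www/massive-log-file-directory\n' > "$tmp/files.exclude"

# Note -r must be given explicitly: -a does not imply recursion when
# --files-from is in use.
rsync -ar \
      --files-from="$tmp/files.include" \
      --exclude-from="$tmp/files.exclude" \
      "$tmp/root/" "$tmp/snapshot/"
```

Afterwards $tmp/snapshot holds usr/local/www/index.html but not the log directory.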
Any files that have changed or been deleted will get copied into ‘dailysDir’ in the appropriate location. This is your “archive”. ‘backupDir’ is your snapshot.
You can also specify a list of servers (i.e. any valid directory in the ‘servers’ directory) on the command line, and it will process only those entries. Note that in this mode it skips the ‘dailysDir’ expiration and archiving, since it’s meant mainly for testing, or for when you’ve just added a directory to be backed up and want to grab it right away.
The one other trick: if you need an ssh tunnel to get from your backup server to the backup client, you can specify that in “servers/hostname/tunnel.through”, and the script will first create an ssh tunnel on port 22222 and work through that.
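The host names and flags below are illustrative assumptions rather than what the script literally runs, but the mechanics of a port-22222 tunnel look like this (the commands are only echoed here, since they need real hosts to connect to):

```shell
gateway="user@gateway.example.com"   # what you'd put in servers/hostname/tunnel.through
target="hostname.example.com"        # the backup client behind the gateway

# Open a background tunnel: local port 22222 forwards to the target's sshd.
tunnel_cmd="ssh -f -N -L 22222:${target}:22 ${gateway}"
echo "$tunnel_cmd"

# rsync then talks to localhost:22222 instead of connecting to the target directly:
echo "rsync -a -e 'ssh -p 22222' user@localhost:/usr/local/www/ /backups/hostname/"
```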
I think that’s about it.