I have become a personal disaster recovery zealot. I’m like the former smoker railing against smokers – so I’ll try not to preach too much.
When I used Windows on all of my machines, I did the occasional random image backup. To be honest, these occurred about once every 3 to 6 months. Not good. When I switched to [Ubuntu] Linux, I was curious about best practices and stumbled across an article on simple Linux backup methods, which in turn brought me to the Simple Linux Backup software. It is and does what it says.
The configuration of SLB (my shorthand for it, not theirs) consists of saying what to back up – directories and files – and what to exclude – files from the backup list that really don’t need to be there. For example, I back up everything in my home directory *except* my “downloads” directory, since I could easily get those files from the Internet again.
In practice, it is best to choose too much to back up initially and then, over the first few days, look at what was actually backed up and decide whether anything is not needed.
This means you choose “things to back up” and, over time, add to the “things to exclude”. I back up all of “/etc” and “/home/<user_name>”. My exclude list has grown over time and is now quite extensive.
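The include/exclude idea can be sketched with plain `tar` (SLB has its own config file format, and the paths and names below are illustrative, not taken from my actual setup):

```shell
#!/bin/sh
# Minimal sketch of "back up everything except the exclude list" using tar.
SRC=$(mktemp -d)                        # stand-in for /home/<user_name>
OUT=$(mktemp -u).tar.gz                 # where the backup archive goes
mkdir -p "$SRC/docs" "$SRC/downloads"
echo "keep me" > "$SRC/docs/notes.txt"
echo "skip me" > "$SRC/downloads/big.iso"

# Back up everything under $SRC *except* the downloads directory.
tar -czf "$OUT" -C "$SRC" --exclude='./downloads' .

# Listing the archive shows docs/notes.txt made it in; downloads did not.
tar -tzf "$OUT"
```

Each new exclusion is just another `--exclude` pattern, which is why growing the list over time is so painless.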
I exclude things like the .Trash, .cache, .mozilla, and .thumbnails folders (these all contain files I don’t need around anyway). I also exclude PDF, downloads, sandbox, and torrents folders, since these contain files I can easily get from the Internet if I need them again. The most specific exclusions took a bit longer to determine. I use Lotus Notes for work, and it keeps a lot of big files that are either local replicas of data on a server somewhere or temporary and/or log files. It was these files that I discovered in the first few days of using SimpleBackup. All I had to do was open my compressed backup file and first sort by size. I noted any large files that really were not needed and added them to the exclusion list. Next, I looked for “lots of files from a single folder” and decided if I needed any of those. If they were all unimportant, I added the folder to the exclusion list. Within a few days, I had a stable and efficient backup.
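That pruning pass is easy to script. The sketch below builds a small sample archive purely for illustration, then runs the two checks I describe: largest files first, then folders contributing the most files:

```shell
#!/bin/sh
# Sketch of the pruning pass: inspect a backup archive for exclusion
# candidates. The sample archive here is created just for illustration.
cd "$(mktemp -d)"
mkdir -p notes downloads
dd if=/dev/zero of=downloads/big.iso bs=1024 count=64 2>/dev/null
echo "small" > notes/todo.txt
tar -czf backup.tar.gz notes downloads

# Largest files first -- obvious candidates for the exclusion list.
tar -tvzf backup.tar.gz | sort -k3,3nr | head -10

# Folders with the most files -- whole-folder exclusion candidates.
tar -tzf backup.tar.gz | sed 's|/[^/]*$||' | sort | uniq -c | sort -rn | head
```

A couple of runs like this after the first few backups is usually all it takes to stabilize the exclusion list.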
SimpleBackup is easy to schedule. It runs once each day: once each week it makes a full backup of what you have identified, and the rest of the time it makes an incremental backup. The backups are named “Backup.Mon*”, “Backup.Tues*”, “Backup.Wed*”, etc.
I did make one change to SimpleBackup (and have sent it to the author for consideration). Each week, the full backup on Monday gets named “Backup.Mon.<YYYYMMDD>*” – this way, I have daily incremental backups starting on Tuesdays and I have weekly full backups going back several Mondays.
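My reading of that naming scheme can be sketched like this (this is not SLB’s actual code, and the exact day abbreviations it uses may differ from `date`’s output):

```shell
#!/bin/sh
# Sketch of the tweaked naming scheme: the Monday full backup carries a
# date stamp so old fulls accumulate, while each weekday incremental
# reuses the same name and is overwritten the following week.
DAY=$(LC_ALL=C date +%a)                  # "Mon", "Tue", "Wed", ...
if [ "$DAY" = "Mon" ]; then
    NAME="Backup.Mon.$(date +%Y%m%d)"     # weekly full, dated
else
    NAME="Backup.$DAY"                    # daily incremental
fi
echo "$NAME"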
One reason SimpleBackup works so well is that Linux does a better job of separating user data and application configuration from the actual software programs. Otherwise, this same solution would work wonders for Windows PCs.
Footnote 1: I was also running TimeVault – a Linux solution very similar to the Mac OS X “Time Machine“. I found that I never needed used snapshots at such a rapid rate – perhaps it would make sense in some cases but not on most personal computers. There has been some discussion of extended SimpleBackup to run more often than just once each day. I might take advantage of such an enhancement but anything more often then every few hours is overkill for my data.
Footnote 2: With the growing free and fee based data centers from Amazon.com, Google, Microsoft, ISPs, and others, it’s only a matter of time before our backup solutions will go out onto the Internet. It’s a bit scary to think *all of your data* may be on one of those monstrous data servers owned by a “hopefully benevolent” corporation, but it is destined to happen – it’s just business.