A good backup strategy is very important. Most people only own a single hard drive at any given time, and they will never think about backups unless they are unlucky enough to have that drive die on them. I probably have over a dozen hard drives here at home. With this many drives, it is inevitable that one or more will fail unexpectedly.
Automated btrfs Snapshots
I’ve been running btrfs for quite a while now. This has been both a blessing and a curse.
btrfs gives me the ability to instantly create snapshots of my file systems. This is my first line of defense against stupid mistakes. At any given time, my snapshots give me access to most of the files that existed on my laptop over the past four weeks. This is the only backup I am likely to need to rely on for anything short of a hard drive failure.
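The snapshot job itself is tiny. Here is a sketch of the kind of cron script that can maintain a rolling four weeks of read-only snapshots — the subvolume paths and retention window are illustrative, not my exact setup:

```shell
#!/bin/bash
# Take a dated, read-only snapshot of /home and prune anything
# older than four weeks. Paths and retention are illustrative.
SNAP_DIR=/home/.snapshots
KEEP_DAYS=28

snapshot_name() {
    date +%Y-%m-%d
}

take_snapshot() {
    btrfs subvolume snapshot -r /home "$SNAP_DIR/$(snapshot_name)"
}

prune_snapshots() {
    # Dated names sort lexically, so a string compare is a date compare.
    local cutoff
    cutoff=$(date -d "$KEEP_DAYS days ago" +%Y-%m-%d)
    for snap in "$SNAP_DIR"/????-??-??; do
        [ -e "$snap" ] || continue
        if [[ "$(basename "$snap")" < "$cutoff" ]]; then
            btrfs subvolume delete "$snap"
        fi
    done
}
```

Run something like this from root's crontab once a day, and recovering a file from a stupid mistake is just a cp out of the relevant snapshot directory.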
I’m scared of btrfs. I haven’t lost any data to it so far, but I am still quite paranoid.
Automated rdiff-backup Backups to My Laptop’s Second Hard Drive
The boot drive in my laptop is an 80GB SSD, and the second drive is a 320GB spinning drive. I run rdiff-backup daily, using cron, to back up my home directory and the majority of /etc to the second drive.
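The cron side of this is just a couple of lines. Something like the following shows the shape of it — the paths and times of day here are made up, not my actual crontab:

```
# m  h   dom mon dow   command
15   13  *   *   *     rdiff-backup /home/user /mnt/data/backup/home
45   13  *   *   *     rdiff-backup /etc /mnt/data/backup/etc
```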
rdiff-backup is one of my favorite pieces of software. I currently have 176 daily backups of my home directory stored on my laptop’s second hard drive. In my case, the total space required for that is about twice the size of my home directory.
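rdiff-backup stores the most recent backup as a plain mirror plus reverse increments, so pulling a file out as it existed some number of days ago is a single command. A couple of illustrative invocations, wrapped as functions (the repository path is an assumption):

```shell
#!/bin/bash
# Query and restore from an rdiff-backup repository.
# The repository path here is illustrative.
REPO=/mnt/data/backup/home

list_increments() {
    rdiff-backup --list-increments "$REPO"
}

# Restore a file as it existed some time ago,
# e.g.: restore_as_of 10D Documents/notes.txt /tmp/notes.txt
restore_as_of() {
    local age=$1 path=$2 dest=$3
    rdiff-backup -r "$age" "$REPO/$path" "$dest"
}
```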
Automated Off Site Backups of Important Data
I would like to be able to automatically back up all of my data off site every day, but that just isn't feasible. My home directory itself is quite small, only around 10 to 15GB, but I often have large files that are only present for a few days at a time. Much of my home directory is also fairly unorganized, except for the data I really care about.
If it is worth my time to create a file, then I figure it is also worth my time to put it in revision control. Everything worthwhile that I've created in the past 10 years is sitting in a repository in my ~/Projects directory.
I run a daily backup of my ~/Projects directory (and a few other similar directories) to a staging area on my spinning drive using Duplicity. I stage the backup locally because it would be difficult to install Duplicity on my Android phone.
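Duplicity writes GPG-encrypted incremental archives, which is what makes it safe to push the result to less-trusted storage later. A sketch of the daily staging job — the staging path and the 30-day full-backup cycle are assumptions, not my exact settings:

```shell
#!/bin/bash
# Encrypted incremental backup of ~/Projects into a local
# staging directory. Paths and options are illustrative.
STAGING=/mnt/data/duplicity-staging

backup_projects() {
    duplicity --full-if-older-than 30D \
        "$HOME/Projects" "file://$STAGING/Projects"
}
```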
It is very easy to run rsync on Android, though. I have another cron job that runs every few hours. It uses rsync to synchronize my local Duplicity backup with both my web server and with the micro-SD card on my Android phone. I have to run this script every few hours just in case my phone isn’t on the network when the backup job runs.
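That sync script can be as simple as a couple of rsync calls with a quick reachability check for the phone. The hostnames and paths below are made up:

```shell
#!/bin/bash
# Push the local Duplicity staging area to the web server, and to
# the phone's micro-SD card when the phone is on the network.
# Hostnames and paths are illustrative.
STAGING=/mnt/data/duplicity-staging

sync_all() {
    # The web server should always be reachable.
    rsync -az --delete "$STAGING/" "user@example.com:backups/"

    # The phone may or may not be home; skip it quietly if not.
    if ping -c 1 -W 2 phone.lan >/dev/null 2>&1; then
        rsync -az --delete "$STAGING/" "phone.lan:/sdcard/backups/"
    fi
}
```

Because the Duplicity archives are already encrypted, rsync can push them anywhere without any extra protection on the wire or at rest.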
My Manual Full Backups
I also run an rdiff-backup of my home directory to an encrypted external USB drive. This backup is entirely manual. I try to run it as often as I can, but I don't think of it that often; I do always run a backup to this drive before I travel. The drive only holds 43 backups, but they cover almost twice the timespan of the 176 backups on the laptop.
Even so, this is one of my favorite backups. Even if this backup of my home directory were two months old, it would still be good enough to get me up and running and happy, as long as I can restore my smaller off-site backup over the top of it. I am happy knowing that this backup will be waiting for me in the unlikely event that someone drives a car over my laptop.
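The recovery path in that scenario is two steps: restore the (possibly stale) full rdiff-backup from the USB drive, then lay the fresher off-site Duplicity data over the top of it. Sketched with illustrative paths and URLs:

```shell
#!/bin/bash
# Two-stage recovery onto a fresh machine. All paths are illustrative.
USB_REPO=/mnt/usb/backup/home
OFFSITE=rsync://example.com/backups/Projects

recover_home() {
    # 1. Restore the newest rdiff-backup state of the home directory.
    rdiff-backup -r now "$USB_REPO" "$HOME"

    # 2. Overwrite ~/Projects with the much fresher off-site backup.
    duplicity restore --force "$OFFSITE" "$HOME/Projects"
}
```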
Files That Rarely Change
Some files just don’t change much. I have over 10 years of digital photos that never change, and there’s no need to back those up every single day. I keep a copy of all my old photos on my laptop, sorted by backup status: new photos go in one directory, and photos I know are already backed up in multiple places go in another. As new photos become old photos, they end up backed up in multiple locations on different kinds of media.