Important Update! As of August 22, 2017, CrashPlan for Home has been discontinued and the company will be focusing solely on their small business products. I am disappointed and annoyed at their decision to abandon the best cloud backup solution for home users. They recommend either switching to Carbonite or “upgrading” to CrashPlan for Business. In my first look at these alternatives, neither provides as much functionality as CrashPlan for Home. Over the coming weeks I will once again be investigating backup solutions according to the criteria I spell out below.
PC Magazine has a nice comparison of online backup solutions. They just updated the article to remove CrashPlan from the list. I’ll be studying this comparison in more detail as a starting point, but at first glance none of them match CrashPlan. Whatever I use to replace CrashPlan will likely require some compromises. I will let you know what I decide to do!
One of my goals for 2014 was to put a reliable backup system in place. Long ago I occasionally copied important files to optical discs (CDs and DVDs). By occasionally I mean once a year or so. The process was entirely manual. As I started getting heavily into digital photography, my important files grew from megabytes to hundreds of gigabytes, so I tried an external hard drive and a NAS. But I still made the copies manually and occasionally, which means I never really had a reliable backup system. I just had snapshots that were usually a year out of date, because the backups took so long to create that I rarely bothered to make them.
A backup system that does not provide a current backup is not a system at all.
Wake Up Calls
My not-really-a-system approach to backups came to a head when a wildfire threatened my home. I was forced to evacuate. When deciding what to take with me, I looked at my external hard drive (a year out of date), my NAS (perhaps 6 months out of date) and my tower computer (current working files). With a heavy sigh I carried the bulky computer down two flights of stairs to the car and left the backups behind to burn. Fortunately the wildfire did not reach my house and nothing actually burned, but I pondered what would happen in the next fire, and I spent a day or two manually updating my NAS backup.
Two months later another wildfire threatened to destroy my home. This time I was away when the evacuation was ordered, so I wasn’t able to return home to save any possessions. My computer and its still out-of-date backups were sitting ducks. This fire was much closer than the previous one, but I was lucky again. Nothing was lost.
Two wildfires in two months? I’d better get my shit together! I needed a backup system that would actually work for me. So I put together some requirements.
Making Backups Easy
I had to figure out how to make backups happen automatically because I’m basically lazy. That’s requirement number one.
Number two: it must handle multiple terabytes of data. Although I currently have about 700GB of data, it’s growing rapidly. I can envision the day when I’ve doubled or tripled that amount.
Third, it must be easy to store up-to-date backups at an offsite location, away from wildfires and floods. Although I could occasionally drop off an external drive at my safe deposit box or a friend’s house, that’s just not going to work for me. So backing up over the Internet is a must, as well as backing up locally so I can have quicker backups and restores.
In addition to those three must-have requirements, I thought about some features that would be nice to have.
I should have one backup system that works for multiple computers on my home network, including Windows, Linux and Mac computers. My main computer is Windows. I keep other Windows and Linux machines around, so it would be nice to easily back up their data. It’s not inconceivable that I might have a Mac someday.
I’d like to be able to restore an earlier version of a file, which means some way of keeping multiple backups or multiple versions of each file. I’ve rarely lost a file due to a hardware failure, but I’ve certainly lost a file because I clobbered it myself and wished I could retrieve yesterday’s version, or last week’s version, or last month’s version.
So in the middle of 2014 I finally set up a backup system that met all my requirements. What did I end up with?
After looking at several alternatives, including writing my own backup scripts with rsync (I’m a software geek, remember?), I am now using CrashPlan.
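For the curious, the roll-your-own approach I considered would have looked something like this: a small rsync script run nightly from cron, mirroring my data to the server while keeping replaced and deleted files in dated folders so older versions survive. This is just a sketch of the idea; the paths and hostname are hypothetical, not my actual setup.

```shell
#!/bin/sh
# Sketch of a DIY versioned backup with rsync (hypothetical paths/host).
# -a        preserves permissions, timestamps, and symlinks
# --delete  mirrors deletions so the destination matches the source
# --backup / --backup-dir  move overwritten or deleted files into a
#           dated folder instead of discarding them (crude versioning)
DATE=$(date +%Y-%m-%d)
rsync -a --delete \
      --backup --backup-dir="/backups/versions/$DATE" \
      /home/me/photos/ backupserver:/backups/current/
```

It works, but you end up maintaining the scheduling, versioning, pruning, and monitoring yourself, which is exactly the kind of manual attention my old system failed for lack of.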
CrashPlan is well-known as a cloud backup service, but it also does local backups to folders, external drives, and computers on your home network or at a friend’s house. If you don’t use the cloud backup service, CrashPlan is free for non-commercial use.
Everything happens automatically. After telling CrashPlan what files to back up and where, it just happens. I’m currently backing up to a local Linux server and to CrashPlan’s cloud (I’m paying about $60/year for the cloud service for unlimited backup storage). Whenever I make a change to a file, within 15 minutes or so it’s backed up to my Linux server, an older computer running Ubuntu 14.04 with a pair of 3TB Western Digital Red drives with software RAID 1. RAID 1 essentially keeps a copy of each file on each drive, so that if one drive fails no data is lost. After the file is backed up to the Linux server, CrashPlan sends a copy to the cloud, which takes somewhat longer due to a slower network connection. So without even thinking, I get two copies of my files, one of them offsite just in case.
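If you want to build a mirror like this yourself, Linux software RAID 1 is a few commands with mdadm. This is a hedged sketch, not a transcript of my setup; the device names are examples, and you should double-check which disks you’re handing to mdadm before running anything like it.

```shell
# Create a 2-drive software RAID 1 mirror with mdadm
# (example device names; creating the array erases both drives).
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
mkfs.ext4 /dev/md0        # put a filesystem on the mirrored array
mount /dev/md0 /backups   # every file written here lands on both drives
cat /proc/mdstat          # verify both members are active: [UU]
```

The payoff is exactly what the paragraph above describes: a single drive failure costs you nothing but the time to swap in a replacement and let the mirror rebuild.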
I also occasionally still connect my 1TB external hard drive for another local backup. It gives me yet another local copy of my data. The Linux server is working well enough that I may stop using the external hard drive, especially since it’s almost full.
I tested some restores to be sure they worked. Don’t ever rely on a backup system without verifying you can restore! I also had to use it for real one morning when I stupidly overwrote a Photoshop master file with a filtered file. Lost all my layers. I could have spent a couple of hours recreating my work, but CrashPlan had several versions backed up. I went to the previous night’s version before the clobbering event and got back the file I needed. Yay!
The first backup of my 700GB of data took about a day to the Linux server, about a day to the external hard drive, and 4 months to the cloud over my relatively slow DSL connection. It was a joyous day when the cloud backup completed, and I slept well that night knowing all my data was finally safe from disaster.
Now that my first backup is complete, here’s how it all works. Suppose I return from a photo shoot with 10GB of new photos. Shortly after I’ve imported the photos into Lightroom, CrashPlan backs them up to my Linux server, which might take 15 minutes or so. Then it backs them up to the cloud, which takes about a day over slow DSL. As I make edits to the photos, the changes get backed up to the Linux server within about 15 minutes, and get put at the front of the queue for backing up to the cloud. All without me doing a thing.
I’m using Linux on an older computer for my local backup server, but the same setup would work with Windows or Mac. I use Linux just because I want to. Computer geek and all that, you know. A NAS or an external hard drive would also work. CrashPlan supports any or all of the above.
Are You Backing Up?
There are certainly other backup products like CrashPlan, some perhaps even better in certain ways. I settled on CrashPlan because I tried it during their 30 day free trial and it did everything I needed, and more, at a price I could live with. CrashPlan may or may not work for you, but if you don’t have a reliable backup system in place, one day you will most certainly lose some data.
What About the Various Cloud Storage Services?
There are many cloud storage service providers that let you save files to the cloud and perhaps even share them with other people. Dropbox. Google Drive. Microsoft OneDrive. And there are photography-specific services like Google Photos and Amazon Cloud that give you “free” photo storage. And there are photography portfolio sites like SmugMug and Zenfolio that let you store photos for presentation and sales to clients.
Be careful using those services for backups.
The photo-specific services might not be backing up your original image files. They may be downsizing them, recompressing JPGs, storing only JPGs but not raw files, etc. If you try one of those services, read the terms and conditions to be sure the original files are being saved, and check the settings to make sure you haven’t set an option that will shrink image files.
Photo-specific services tend to back up only photos, and probably videos. If you were planning on backing up your spreadsheets and documents and source code and whatever other non-photo data you store on your computer, you’ll have to back those up somewhere else.
I’ve heard of people planning to use Dropbox and Amazon and Google to back up their photos. Consider whether it’s worth the hassle of juggling multiple backup services, and having to make sure they all do what you expect them to do.
For any cloud storage service you intend to use for backups, check that you can get access to older versions of a file (it may be called “history” or something similar) and that you can restore a copy of a deleted file.
Are files backed up automatically or do you have to manually copy them to the cloud storage service? If you’re doing it manually, I predict the one time you need to use it will be the time you forgot to make a copy.
May 2016 Update
I’ve been using CrashPlan for almost two years now. I thought I’d provide an update on how things have gone during the past 22 months.
My data has grown from 700 GB to 1.6 TB. The 3 TB local storage on my desktop computer and on the backup server is more than half full. In another 12 to 18 months I’ll probably need to add more disks to the backup server. Fortunately the CrashPlan cloud storage is “unlimited”.
I abandoned my 1 TB external backup drive long ago. I can live with just one local backup on the Linux server.
I set up my girlfriend’s computer to back up over the internet to my Linux server. It was simple to do and doesn’t cost her anything since she’s not backing up to the CrashPlan cloud. She has a bunch of photos but not nearly as many as I do.
My internet service is fairly slow DSL since I live out in the country where cable doesn’t exist. At full capacity I get 2.3 Mbps down and 800 Kbps up. When CrashPlan is backing up it uses the full 800 Kbps, which slows down all my other internet activity. Sometimes when I need the bandwidth for something else, like streaming video, I pause the cloud backup. CrashPlan allows you to set up bandwidth limits by time of day, which I experimented with for a few weeks. Ultimately I was happier to just let it use the full bandwidth all the time, and I would manually pause it whenever it got in the way.
One issue I’ve encountered is that if I’m using Photoshop to edit a file, Photoshop can’t save the file while CrashPlan is backing up the previous save. I often work with multi-gigabyte files in Photoshop, which can take an hour to back up to the cloud. To save the file I have to pause the backup. So when I’m working on an extended editing session I sometimes pause CrashPlan cloud backup until I’m done editing. I still let it back up to the local server, which is much faster, so Photoshop is less likely to conflict. This may be a problem only on Windows, given the way CrashPlan uses Windows Shadow Copy for backups, and I’ve only noticed it when using Photoshop, perhaps because that’s the only application where I work with ginormous files.
There is a bug with the CrashPlan service on Linux where it sometimes loses network connectivity. I see this as a “Waiting for connection” message in the CrashPlan application on Windows. To recover the connection I ssh into the Linux server and restart the CrashPlan service. This bug seems to have existed for years and CrashPlan support thinks this is an acceptable workaround. Fortunately it doesn’t happen often, maybe once a month. I’ve considered scheduling a daily restart in cron but it hasn’t occurred frequently enough to bother. But it does get in the way of allowing my Linux server to run unattended 24×7.
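If the hang ever became frequent enough to justify automating, the daily restart could be a one-line cron entry like the following. This is a hypothetical sketch: the service path and name vary by CrashPlan version and install method, so check what your installer actually created (on older Linux installs it was an init script at /etc/init.d/crashplan).

```shell
# Example crontab entry: restart the CrashPlan service at 4:00 AM daily
# (service path is an assumption; verify it on your own install).
# min hour dom mon dow  command
0 4 * * * /etc/init.d/crashplan restart >/dev/null 2>&1
```

A restart at 4 AM would cost one brief interruption per day in exchange for never finding the server stuck at “Waiting for connection” after a week away.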
Overall I’m still very happy with CrashPlan. It continues to satisfy all my backup requirements with minimal fuss and relatively low cost. I’ve learned to live with a few minor issues, but from what I’ve read about competing services there would probably be similar issues. It’s the nature of cloud backup over a slow internet connection!
April 2017 Update
Last month there was another wildfire nearby, close enough for mandatory evacuations. The Reverse-911 call came at 2:30am. It had been a while since the previous evacuation, so it took some time to decide what to take with me, especially with lack of sleep. Fortunately, my CrashPlan cloud backup was up-to-date, so I didn’t have to spend any time or mental energy worrying about computer data. Thankfully the firefighters brought this fire under control before it spread to the neighborhood.
I’m not shooting nearly as many photos as I used to, which means my backup archive has not grown as rapidly as in previous years. Backup storage is currently at 1.8 TB. However I decided it was time to increase my local backup capacity anyway. I added two more 3 TB hard drives to the Linux backup server. I converted the storage array from RAID 1 to RAID 10, so the total storage available with 4 drives is now 6 TB (5.5 TB formatted). I originally planned to convert it to RAID 5 to gain that capacity with just one additional drive, but after more thought and research I felt uncomfortable with the double-failure scenarios that can destroy an entire RAID 5 array, such as a second drive failing (or an unrecoverable read error) while rebuilding after the first failure. I may be worried about small probabilities, but I sleep better at night knowing my local backup data is safer with RAID 10.
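For reference, the resulting 4-drive array looks like this with mdadm. This is a sketch of the end state with example device names, not the in-place conversion procedure itself; reshaping a live RAID 1 array into RAID 10 involves more steps, and having a verified backup first (which, conveniently, is the whole point of this server) is the safe way to do it.

```shell
# Sketch of a 4-drive RAID 10 array with mdadm (example device names;
# creating the array from scratch erases all four drives).
mdadm --create /dev/md0 --level=10 --raid-devices=4 \
      /dev/sdb /dev/sdc /dev/sdd /dev/sde
# 4 x 3 TB in RAID 10 = 6 TB usable: data is striped across two
# mirrored pairs, so any single drive can fail safely, and even two
# drives can fail if they belong to different mirror pairs.
mdadm --detail /dev/md0   # confirm the level, size, and member state
```

That two-drives-from-different-pairs survival case is exactly the margin RAID 5 gives up, which is why I was willing to spend an extra drive on RAID 10.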
And, last week I once again had to restore a file I managed to clobber. I restored it to a version a few hours earlier from the local CrashPlan backup. No issues whatsoever.