As photographs accumulate in our collections, and as new cameras grow in resolution and file size, so grows the pain of managing them for us anxious photographers. What are you using to protect your emotional/financial assets? This is a real question, as I haven’t been able to come up with a definitive answer by myself.
Sure, the Cloud is beautiful, but the use case of most online storage solutions is not one most of us togs will find appealing.
So here’s a description of my situation and need. I’d love to hear about yours and the solution you have implemented.
Over the years, my collection has grown to over 40,000 photographs. Far more than that, in fact, but a good half of it has been lost on CDs. Nothing that matters, just all the photographs of my kids growing up and our first travels as a family. Nothing that matters. Hence my eagerness to protect what is left.
I use Lightroom to process my photographs (and am currently testing Capture One as an alternative). My RAW files are therefore visible in LR, organized into yearly folders (and daily subfolders).
My computer’s internal hard drives are no longer large enough to hold all this, so I have resorted to external hard drives for the older photographs, to supplement the internal drive. This works, but external drives can break / be lost (don’t ask) / be stolen (welcome to Marseilles) / … So both the internal and external drives have to be backed up on … more external drives.
Alternate solutions
Many photographers, including our own Paul P and many pros, use Drobo RAID systems for great security. A perfectly good solution, except for three details:
Cloud-based solutions.
Imagine a cloud-based external hard drive that shows up on your file system like Google Drive, Microsoft OneDrive and Amazon Cloud Drive. Your 2015 photographs are on your internal SSD (as well as on the cloud drive, for backup); all others (which you don’t edit as frequently) live only on the cloud drive. You just delete years from your internal drive as you go and redirect your editing software to the online folder. Wonderfully simple, right?
The slower access speed isn’t much of a problem since you’re not using the older RAW photographs anywhere near as much and the 1-10% of print-ready / sales-ready TIFFs are stored locally (and backed up). Perfect! Deal!
Not gonna happen, though. Cloud-based storage seems to come in one of two flavours:
So far, I haven’t been able to implement my dream backup solution and WD Passport drives are still accumulating, gathering dust and attracting entropy like mature cheese does flies.
I’m currently investigating Amazon S3. This is the actual storage system used by the big boys (Netflix and others) as well as by my company’s SaaS software and many of the cheap first-category cloud-based backup apps mentioned previously. It comes as a set of web services, and pricing is driven by storage and transfer volume (roughly $30 per TB per month). It seems powerful, secure and affordable. But can it be accessed by a file management system? I’ve asked, am awaiting the answer and am keeping my hopes really low.
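For what it’s worth, if S3 does work out, pushing a year’s folder up there can be a one-liner with Amazon’s own AWS CLI. A minimal sketch, assuming the CLI is installed and configured with your credentials; the bucket name and local path below are placeholders, not a real setup:

```bash
#!/bin/bash
# Minimal sketch: mirror one year of RAW files to an S3 bucket.
# Assumes the AWS CLI is installed and `aws configure` has been run.
# Bucket name and local path are placeholders.

LOCAL_DIR="$HOME/Pictures/RAW/2015"
BUCKET="s3://my-photo-archive/RAW/2015"

# Dry run first: shows what would be uploaded without transferring anything.
aws s3 sync "$LOCAL_DIR" "$BUCKET" --dryrun

# The real thing: only new or changed files are uploaded.
aws s3 sync "$LOCAL_DIR" "$BUCKET"
```

Because sync only uploads what has changed, re-running it behaves like a crude incremental backup.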
The winding path leads nowhere but gobbles up another 40 MB. Sony A7r and a forgotten lens (Zeiss Distagon 1.4/35 ZM ?)
So there you have it.
On one side, physical storage with all the right features but pesky management. On the other, dozens of online services that all force an unwanted feature set down your throat but don’t let you use your files the way a more basic drive would.
But maybe I haven’t looked hard enough. Maybe some of you have come up with an elegant solution that can change the life of others in a very positive way. If so, please let us know 😉 Please, pretty please!
You can use ExpanDrive to access your Amazon Drive and PathFinder to manage files / folders with an option to verify copies.
Hmm, S3 as a network drive! Interesting. Thanks Claus. Will try that and report.
I use Netgear RAID boxes that have a built-in app called ‘Replicate’ – it will copy a NAS drive entirely over to a second (similar sized or larger) box. It’s done locally initially, but once the first run has completed, move the second NAS to your Granny’s house and plug it into the router; it then carries on syncing. I’m a photographer too and have 3 such 24TB (ReadyNAS 516) NAS boxes all synced up; once each run is complete I get an email to confirm. Maybe your needs do not require 24TB, but it’s the app to look out for in the smaller versions. I find online backup too slow, and judging by the vulnerabilities in Adobe, Amazon etc. I’m not sure I trust it either, whereas Granny can always be relied on.
Douglas, this looks like a more affordable Drobo. You’re right, 8TB would suit me fine for now. I’ll definitely look that up.
Who would have thought Granny would be faster and more reliable than Amazon? But I just tested a cheap online backup system that took 7 minutes to upload an 80 MB TIFF. Ahem.
Thanks.
I do something very similar to Douglas. I have a QNAP 4-bay NAS with 8 TB usable on RAID 10 (four 4 TB drives in two mirrored pairs) that cost me about $1200. You could do RAID 5 and get 12 TB, but I like RAID 10 because it gives me hard-drive redundancy and quick access speed on my network. The placement I learned from Gary Fong after he had a fire. His drive was hidden inside a cinder block and survived. So I put the RAID in my basement against the cement foundation, on a cinder block, and surround it with 2 layers of cinder blocks. Haven’t had any problems with air circulation.
I also have a 4 TB drive at my parents’ connected to their computer. It runs Crashplan (free) and backs up specified folders in the background. It’s compressed, so I can get quite a bit on it. I don’t send video files to the second backup, just pics.
For some of my fav pics I also put them up on the cloud. I can access the NAS and crashplan remotely but cloud access is what I usually go to in a pinch. It integrates well with my phone.
I’m not sure if my solution is old hat or simple minded, but here ‘tis:
When I return from a shoot, regardless of length, where RAW files have been captured and stored on cards, the RAW files are downloaded into a folder on my computer that describes the event (e.g. Cityscapes < Madrid < date). For the date folder I use a six-digit scheme such as 080615, which would be today’s date on this side of the pond. First thing is to rename the files (e.g. Madrid_092212). As soon as possible, I do a quick culling of the RAW files and delete every file I am confident I will not revisit. The remainder are copied (not backed up) onto a hard drive named for one of my categories (cityscapes, weddings, special events, portraits, etc.) with the same folder tree as on my main computer. From here on, this hard drive will contain nothing but the RAW files. (Over the years, I have been able to condense several backups of RAW files into one or two. Saves space and, I assume, keeps the drives healthy by using them.)
I then copy this folder tree onto a second external drive. I now have all my original files in three places, but the goal is to get everything off my computer ASAP. To this end, I process my RAW files into a folder titled “work.” When I’m done working on it for the immediate future, I copy that folder to external drive #2. I can then delete both the RAW and the work files from my computer.
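A workflow like this also lends itself to a small copy script instead of manual drags. A minimal sketch, assuming macOS-style volume paths; the drive and folder names are placeholders, not Leonard’s actual setup:

```bash
#!/bin/bash
# Minimal sketch: copy the RAW folder tree to two external drives,
# preserving the same Category/Location/Date structure.
# Paths are placeholders; adjust to your own drive and folder names.

SOURCE="$HOME/Photos/RAW"            # e.g. .../Cityscapes/Madrid/080615
DRIVE1="/Volumes/RAW_Archive_1"
DRIVE2="/Volumes/RAW_Archive_2"

for DEST in "$DRIVE1" "$DRIVE2"; do
    # -a preserves structure and timestamps; no --delete, so nothing
    # already archived on the drive is ever removed.
    rsync -av "$SOURCE/" "$DEST/RAW/"
done
```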
You’ll notice that I do not use LR or Bridge as my Go To storage, but there’s no reason for this except that I got into this habit before LR came into existence, and I haven’t changed simply out of inertia. There’s nothing automatic here either, no applications that take care of moving things around. But the workflow does have the benefit of keeping my attention focused for a week or two. And anytime I want to return to my work, I know where it is, which isn’t on my computer.
One last thing: when I finalize a project for printing, I save that file into a separate folder on external drive #2 named “PRINTS,” with file names that identify the print size as well.
Looks complicated in print, but in practice, I find things move along easily.
Leonard, that’s a highly logical system. I should have added “for Dummies” or “for Lazies” to the title. What I’m rapidly learning is that there is no cheap, lazy and efficient way of doing this. Crazy world.
More seriously, since I came to Lightroom with no prior process, I bought a video tutorial from LuLa, had a good laugh and committed to using the excellent keyword system. As bad as folder management is in LR, keywords are a life saver. So I just import wherever the crazy software decides and rely on keywords to find my photographs. It’s a very elegant system that lets you tag a photo from multiple points of view (e.g. B&W, flower, Arizona, fisheye, rain). Yes, it does rain in Arizona 😉 Added to the automatic date and other metadata, it makes finding pictures in a catalog a breeze. So I simply back up chronologically. But even that is a tall order for my lazy and impatient half.
But lazy is good. Lazy types get hired by great corporations because they refuse to work hard following the normal processes and discover easier, leaner ways. I’m determined to find a simple storage/backup solution that meets my standards of drained finances and effortlessness 🙂
This is like asking “which camera is better”; you’ll probably get thousands of different answers 🙂
Anyway this is my solution (yes, I’m this paranoid):
1) First of all, I export and re-import as a 100%-quality JPEG, in Lightroom, every file I like that I’ve worked on; if someday version compatibility should break (planned obsolescence, bugs, etc.) I will at least still have my final version of the file.
2) The first backup is basically in real time with CrashPlan. I have the unlimited plan: for a few euros per month I can back up whatever I want to their cloud, and given that you can actually specify your own encryption key, I use this service to back up basically my entire disk, not just the pictures. The first backup (around 500 GB at the time) took almost 2 months (I still had slow ADSL and not fibre back then), but it went absolutely smoothly.
3) The second backup runs every 6 hours (Time Machine, with a custom script so it doesn’t run every hour).
4) I then have Carbon Copy Cloner making an automatic backup every single morning on a separate disk (actually a RAID enclosure that cost me around €100, attached to my router with a Gigabit Ethernet cable, but I use it in JBOD mode, so my iMac sees only one giant disk); CCC checks this same backup every two months for corruption (it takes ages, but it keeps corrupted files from spreading to the other backup media).
5) I rename the exported and re-imported final JPEG versions of my files (see point 1) to FILENAME_export. This way it is really easy to copy them with a simple rsync script (sketched after this list), every month or so, onto a small SSD that I then hide in my home.
6) Lastly, I copy the same files mentioned in point 5 to an optical disc, which I then stash in my car (I know, not the best possible place, but I do not own a second home).
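Step 5 really can be a tiny script. A minimal sketch, assuming the finals end in _export.jpg and the SSD mounts under /Volumes; all names below are placeholders:

```bash
#!/bin/bash
# Minimal sketch: copy only the final "_export" JPEGs to a small SSD.
# Folder names and the volume name are placeholders.

SOURCE="$HOME/Pictures/Lightroom"
DEST="/Volumes/PhotoSSD/exports"

# -a keeps timestamps and structure; the include/exclude pattern limits
# the copy to *_export.jpg files while still descending into folders,
# and --prune-empty-dirs skips folders with nothing to copy.
rsync -av --prune-empty-dirs \
      --include='*/' \
      --include='*_export.jpg' \
      --exclude='*' \
      "$SOURCE/" "$DEST/"
```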
There you have it. It is not completely fail-proof, but it has worked for the last few years…
“CCC checks this same backup every two months for corruption (it takes ages, but it keeps corrupted files from spreading to the other backup media)”
This is an interesting aspect, Luca. So far I’m not checking my files for corruption, at least not 100% and not systematically. What are the best tools (besides CCC) for this on a Mac?
Boris
I’m quite positive you should be able to do something like this using rsync (from the Terminal) with the --checksum option, but I’ve not tried this myself, mostly because CCC is quite handy (at a cost…) and I’ve used it for years, so my backup strategy already included it.
Should I start today from scratch, I would probably set up something with a few rsync scripts executed automatically by a LaunchAgent (basically what I did with Time Machine) and just forget about it.
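Something along these lines, for example. This is only a sketch with placeholder paths, not a tested setup; a LaunchAgent (or a cron job) would then run it on a schedule:

```bash
#!/bin/bash
# Minimal sketch: verify a backup against the originals by checksum.
# --checksum forces rsync to compare file contents, not just size/date,
# and --dry-run + --itemize-changes lists mismatches without copying.
# Paths are placeholders.

SOURCE="$HOME/Pictures"
BACKUP="/Volumes/Backup/Pictures"

rsync --archive --checksum --dry-run --itemize-changes \
      "$SOURCE/" "$BACKUP/" > "$HOME/backup-verify-$(date +%Y%m%d).log"
```

Any file listed in the log differs in content between source and backup, which is exactly the kind of silent corruption you want to catch before it spreads.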
Check out Beyond Compare: http://www.scootersoftware.com
It has a superb user interface, is multi-platform (Windows, Mac and Linux) and can be scripted for automation.
I use the Folder Sync feature to back up my production data to my backup drives (Synology NAS and USB 3 external drive).
It has extremely powerful compare features. I synchronise new or changed files from my production drives to my backup drives.
WOW! Luca, that is scary!! I don’t even understand some of the words you use (not kidding). All this is making me realize how utterly inadequate my current process (if you can call it that) is.
I really don’t want to go overboard and quite like the RAID + hidden backup solution. That seems simple and affordable enough, and any event that would defeat it would probably cost me far more than my photographs …
Thanks for this eye opening account.
In these matters being a nerd helps a lot 🙂 In my defense I was forced to become one: I had to crack it open and re-install everything on my very first computer the day after I bought it (like 25 or so years ago). As they say: learn to swim or drown.
Anyway the motivation behind all this is quite simple, even if how to actually do it can vary a lot and can become complicated: you need at least a backup + a backup of your backup (but backing up from the original files, not from the 1st backup, so if something goes wrong with one of the backups you still have an un-corrupted one) + another backup on some other kind of media stashed, preferably, in another place altogether. To all this you can add an online option, given that prices are going down.
The whole shebang should look something like this:
1) 1st backup: (preferably fast, like FireWire 800 or USB 3) hard disk or RAID enclosure or NAS
This backup lives attached to your computer
2) 2nd backup: hard disk or RAID enclosure
Same as 1, but you keep this disk unplugged from the computer and especially from the power outlet when not in use
3) DVD or Blu-ray optical discs
Keep these in another location (they are basically your last resort) and trash them and burn a new batch every couple of years at most (they’re cheap, €1 or €2 each)
4) optional but strongly suggested: something like CrashPlan, i.e. an online option
To do 1) and 2) you can use any program you want, even Time Machine if you’re on a Mac. But, as Boris and I were discussing, any file can become corrupted (i.e. you will not be able to read it anymore). If this happens and your backup software copies the corrupted file into the backup, then not only will you have overwritten the only functioning copy of the file you had (the one in the backup), you will merrily continue to back up this useless file until you happen to notice the corruption.
To avoid this, programs like Carbon Copy Cloner can be set to do a “deeper” check of the files once in a while, to be sure they’re actually backing up perfectly working files and not garbage.
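If you’re not on CCC, a rough do-it-yourself equivalent is a checksum manifest. A minimal sketch using the shasum tool that ships with macOS; the archive path and manifest name are placeholders:

```bash
#!/bin/bash
# Minimal sketch: build and verify a checksum manifest so silent
# corruption is caught before it spreads into the backups.
# The archive path and manifest name are placeholders.

ARCHIVE="$HOME/Pictures/RAW"
MANIFEST="$HOME/raw-checksums.sha256"

# Build the manifest (re-run whenever new files are added).
find "$ARCHIVE" -type f -print0 | xargs -0 shasum -a 256 > "$MANIFEST"

# Later, verify: any file whose content has changed is reported,
# the grep hides the lines that check out OK.
shasum -a 256 -c "$MANIFEST" | grep -v ': OK$'
```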
Anyway, don’t sweat it too much; once you have at least three copies of your files, with one stashed somewhere else, I find it quite unlikely that you’ll lose all three at once! (fingers crossed…)
This is what I do: All my images are stored on an external raid 1 (mirror) drive (OWC Mercury Elite Pro Dual). When I process new images after a trip, I will very often leave the original images on the memory cards until I’m completely finished with the processing.
Every 1-3 months I copy all my images and the LR library onto two cheap external drives (for example WD Passport) and store these drives in separate rooms of my house (as a weak protection from theft). Every 6-12 months I store a drive with all my images at a different place to be protected from fire or theft.
By the way I have 25000 images in my library. So far 1 TB is enough for the backup of all my images.
Boris
Hi Boris,
this is more or less how I handle backup as well. All my photographs from the US are still on their memory cards as well as on my computer and a WD Passport. It doesn’t go much further than that, except that I copy the files onto a second Passport when the cards get erased. Douglas, above, mentioned a RAID + router system which seems interesting and could make this both a little simpler and more secure. I’ll take a look at the Netgear solutions.
Your figures more or less match mine. 45k pics is about 2TB. Times two for redundancy. A 2x4TB system should last me a long time.
Pascal
What terrifies me about “clouds” is the lack of control. They belong to someone else, and you’re allowed to use them. Then one day something happens, and you’re not. I can’t recall the circumstances, but a couple of years back there was an article on the net about one of these “clouds” being closed and costing the subscribers all their data. I’ve no idea whether it was fact or fiction, but I am scared stiff by that thought.
True. Some T&C on online media sites (and probably online storage as well) is outrageous.
Here’s what I do:
1. Frequently back up files I’ve imported to the computer to an external hard drive and a Synology NAS simultaneously.
2. Offsite backup of NAS to bank safe deposit box (external hard drive) about twice a year.
3. Incremental offsite backup to Amazon S3/AWS automatically daily via the NAS until the next offsite physical backup.
In more detail:
1. Import photos to Lightroom on my MacBook Pro a couple of times a week. Quickly review and delete what I can, export my favorites (after no more than 30s of post-processing) for bulk upload of JPEGs to Flickr (first round of backup).
2. Every couple of imports or whenever my hard drive is getting full, copy files to a fast external hard drive (currently 4TB, by the time I need more, they’re bigger and cheaper). If I ever spend time post processing, I work off these files.
3. Whenever I copy to the external hard drive, I also copy to a Synology NAS box in my basement (currently 4x4TB drives in a Synology Hybrid RAID array – gives redundancy of one drive). I like the Synology NAS software: frequently and automatically updated with security fixes and new functionality, lots of features, and very easy to administer – much better than my old Windows Server (yuck).
4. My 2015 pictures folder is backed up every night to the cheapest level of S3 (Synology software has incremental backup to AWS built in – just check a directory to backup and enter your AWS credentials). Cost maybe $20/month for a TB of data up there
5. Once or twice a year I back up to an external hard drive and put it in my bank’s safe deposit box ($150/yr for a large 10″x15″x24″ box). I then adjust my Amazon S3 backup to just back up incrementally after the physical offsite backup.
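If your NAS or computer has the AWS CLI, that “incremental since the last offsite run” step can also be approximated from the command line. A minimal sketch with a placeholder bucket and paths; the marker file is a hypothetical convention, touched right after each trip to the bank:

```bash
#!/bin/bash
# Minimal sketch: upload only files changed since the last physical
# offsite backup, using a marker file as the reference point.
# Bucket name and paths are placeholders; assumes the AWS CLI is configured.

PHOTOS="$HOME/Pictures/2015"
BUCKET="s3://my-photo-archive/2015"
MARKER="$HOME/.last-offsite-backup"   # `touch "$MARKER"` right after the bank run

find "$PHOTOS" -type f -newer "$MARKER" -print0 |
while IFS= read -r -d '' file; do
    # Recreate the same relative path inside the bucket.
    aws s3 cp "$file" "$BUCKET/${file#$PHOTOS/}"
done
```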
At the moment our stuff still fits on 2TB drives – and is rsynced to a Synology Raid-1 (2x2TB) NAS. You can expand these as you grow, but there’s also the possibility to build your own, like:
– a HP Micro Server (look it up, it’s cheap and cool)
– two or more 4TB drives; set them up as RAID 1, or as RAID 5 if you have three or more
– an operating system of your choice (I’d take Debian)
And yes, online storage is cheap. S3 seems like the best choice; you only pay for what you actually store and transfer, with nothing up front.
Wonderful pictures btw, congrats!
Thanks Wolfgang. Much appreciated 🙂
It looks like I’ll take that S3 archive + RAID work directory approach.
Short answer:
I highly recommend two local backups and Dropbox.
(Very) long answer:
I have more than 16,000 photos totalling ~250 GB, stored in sub-folders by date and activity and linked to my Lightroom catalogue.
A good backup strategy protects against a) hardware failures (drive loss), b) user/software errors (overwriting, deleting, file corruption) and c) environmental incidents (burglary, water and fire damage). You need to have at least three copies of your files, at different locations, with different snapshots. To revert something like an accidental overwrite, you need an instant local backup (versioning). To survive disk failures, you need at least two different hard drives (redundancy). And if you don’t want to come home and find your work destroyed by a water breach from your renter’s broken washer, you had better have a separate drive at a geographically different location (off-site backup). Versioning, redundancy and off-site are key to a safe backup strategy. If you miss one of these, you don’t have a backup strategy at all.
Versioning can be addressed with embedded tools like Time Machine for Mac or Windows’ own backup. For hardware redundancy I recommend using at least two different drives – and no, a RAID 1 does not count as two drives! I use an internal HDD as my primary storage, an external HDD which gets synced at least once a week and an external RAID 1 which gets synced at least once a month. The sync frequency depends on how often you create/update data. Use a sync script/app to keep the content of all drives the same. What about off-site? The easiest way is to give a drive to someone you trust (family, safe deposit box) reasonably far away (not your neighbour, in case of fire). But how do you update the files? You fetch the drive, update the files and return it? Murphy’s law says “Whatever can go wrong will go wrong”. Chances are that while your drive is at your home for an update, your flat will flood, your house will burn, your computer will corrupt all files or everything will be stolen. You could use two Connected Data Transporters, one placed at your home and one somewhere remote, and all files will be synced via the Internet. It’s expensive (two Transporters at €160 each + 2 drives + cases + the other drives you need at home anyway), both parties need very fast internet upload (not just you) and it’s not super-reliable.
Enter cloud storage. Cloud storage solves several issues with your backup strategy. 1) Hardware redundancy: the data centers use highly redundant RAID drives and redundant servers. You pay for administration, monitoring and support. This is more than you can ever do yourself. 2) Off-site backup: they have several independent centers spread throughout the world. Even if your whole country is flooded by a tsunami or radioactively contaminated, your cloud data is safe on the other side of the world. Not to speak of burglary. 3) Versioning: if you accidentally overwrote or deleted files, you can just revert them, even weeks and months later, with no local storage wasted. 4) (this is not part of our general plan but unique to cloud) Access: you can back up from anywhere, at any time, on the go. You can also access your files from anywhere and on mobile. And 5) (also not part of our backup strategy) you can easily share and distribute files, albums or whole libraries, or work in collaboration.
There are mainly two types of cloud service: cloud storage and cloud sync. The first is like a disk drive, but remote. You need to copy all your files there manually (or by script); you don’t waste local storage (just cache), but access is slow because you have to download the data first. Cloud sync means you have a local copy of your files in a designated folder and all changes are automatically pushed to a copy on a remote server. So local access is very fast, but you need the same amount of storage locally. Beyond that there are different kinds of services: general sync/storage, backup-only and file/use-case-specific services like those for photos. What you want depends on your workflow, space requirements and budget. Services tend to limit their offering because of a) budget and b) feature focus. Normally, backup- and photo-specific services offer unlimited storage and are cheaper than their general file-based competitors, but are limited in return. Especially if you’re European (or just not US-based), you should also consider the location of their data centers. If they only have one in the US, you typically have very slow connections from Europe or Asia. When it comes to cloud storage, your main selection criteria should be trust, reliability and commitment. Why is the company doing this? Is their business model robust? Do they respect your privacy? Who’s the product?
I’ve been using cloud storage since 2011. I tried a lot of services, some just briefly, some for over 2 years. Here’s a story: I used Bitcasa for a very long time. They once offered unlimited, end-to-end encrypted storage with unlimited versioning for $99/year. And they offered cloud storage (instead of sync only). You had a Bitcasa drive on your desktop, could create folders, copy files there, even mirror a local folder to Bitcasa, all with high upload speed. Great stuff, but slow when accessing the files from the drive. Then one day they cancelled the unlimited plan for new customers because it was just not sustainable. But they grandfathered the unlimited plan for existing users. Then, like nine months later, they moved to an all-new server/data structure and forced every existing customer to subscribe to the new plans ($99/1TB, $999/10TB) within 30 days. I only had 200 GB of data up there and could easily make the transition, but imagine you had 1 or 10 TB up there (it was unlimited previously)! People were forced to pull their data off within less than 30 days. Of course servers went down, people couldn’t get their data in time, some were on holiday … I was just pissed off at how this company treated its customers. But then I noticed that something like >20% of all my files were corrupted junk data, new and old ones alike. And that should never happen. I searched for an alternative, cancelled my account and moved on (fortunately, the company had the balls to send all my money back within 24h). And I learned a lot: I lost no data because I had a local copy of all my files. What’s your exit strategy? As much as we like a cheap service, when it’s not sustainable you’re in trouble. Don’t look at the specs, look behind them.
Think about your workflow! I then tested Picturelife extensively. It’s one of the last remaining photo-dedicated services. They offered unlimited storage for $15/month for up to 3 independent users. And there was much to like: a lovely onboarding, a Lightroom extension, RAW support, very quick support, full archive download (!) … But there were 3 (actually 4) things: 1. One day I was writing to support about some feature requests and suddenly another customer joined in, complaining about seeing my images on his account. They fixed it immediately, but this should never happen. 2. My biggest problem, though, was that it is really a web service. If you just want a dump to throw all your images into, here you go (likewise Flickr, Google Photos, Amazon Prime Photos). If you already manage your photos in folders, copy, move and delete files and use Lightroom, you have to duplicate your photo management on their website too. 3. During my testing I also noticed that their desktop uploader (which watches folders and uploads new and altered files) missed something like 3 out of ~800 files. It said it was done, but it wasn’t. With just a few files you can spot missing ones relatively easily. But with a large collection of tens of thousands of files, there’s no way to even notice. And 4. My final decision against Picturelife had less to do with the service itself than with the category overall: we’ve had similar services before. We had Loom, which was acquired by Dropbox and shut down. We had Everpix, which went out of business … Are photo-only services sustainable at all? Shortly after I decided to move on, Picturelife was acquired by Stream Nation. It’s still there, but … Again I learned: your service of choice must fit your workflow. Mine is a file-based approach. And even if the company is dedicated and has a lovely product, I want a sustainable partner and peace of mind, not a hip start-up.
OK, back to the usual suspects: Dropbox, Google Drive, Microsoft OneDrive, Apple iCloud Drive, Box, Amazon Prime Photos, Flickr, Backblaze and Crashplan. The first five are sync services, Amazon and Flickr are photo-dedicated, and the last two are online-backup solutions. Box is more of a business solution, expensive, with troublesome syncing. We use it at work; I wouldn’t recommend it for personal use. Crashplan has a buggy Java client and is very slow. Backblaze is quite OK if you just want a redundant, end-to-end encrypted, easy off-site backup. You just make a backup of your external hard drive and that’s it. But you need to connect the drive every once in a while, otherwise Backblaze will delete it from its servers. With all backup-only solutions: it’s a backup! You need to have the files locally available. And it’s not meant for sharing, working with the files on mobile or actually accessing single files easily. And upload is quite slow (<200 kb/s). Amazon Prime Photos is very cheap ($12/year) and Amazon is the backend of most other services. But Prime Photos is really a bonus for Prime users. It’s not that Amazon wants to make the best experience for you; it wants you to buy into Prime for $60/year. It’s like Amazon Prime Video vs. Netflix. And the client works like this: you have a kind of drop target on your desktop and push into it any files you’d like to see on Prime Photos. But only supported photos and videos. If your RAW format is not supported, or for PSDs, XMPs (!) or some video files (MKV), it will not work. And forget about “full backup” or management. It’s marketed to end consumers who want to upload their mobile photos and have them all in one place, sorted by date. The same is true for Google Photos or Flickr: no RAW support, and your files will be “optimized” (shrunk and further compressed). As photographers, avoid them.
We’re left with Dropbox, Google Drive, Microsoft OneDrive and Apple iCloud Drive, all with similar functions and features. iCloud is in a beta state and I don’t recommend it to anyone who needs reliability. Also, I don’t give more data to Google than necessary; I avoid them, especially with my private photos. At Google, you’re the product. Other than that: it’s secure from a technical point of view, Google is not going anywhere, it’s fast and stable. But nothing more fancy about it. Microsoft has been putting a lot of effort into OneDrive recently. You get great apps, clients, features (e.g. RAW support) and low pricing ($7/1TB with Office 365). One thing that makes them stand out from the crowd: when you use Windows 8 (or above), you can select which files are stored locally and which ones only online, yet all files appear in your Windows Explorer. But my concern with OneDrive is privacy. It has happened twice in recent years that Microsoft invaded user accounts on its own. While searching for a leak, Microsoft breached the Outlook account of a former employee to gather evidence. That should only be possible for national security authorities (with a court order). The second case was someone who uploaded child porn to his OneDrive and got a visit from the local police. Microsoft runs automated scripts over all user content to match image recognition against lists from the US National Center for Missing and Exploited Children. Fair enough with child abuse. But I don’t want to have to explain to the police why I have photos of my 3-year-old playing naked in the garden. It’s an algorithm, and algorithms are prone to fail.
Dropbox: I’ve used Dropbox from the start and it’s my mobile file system. Dropbox is the most reliable and mature service I know of, fully serves my needs, and with Carousel it also addresses basic photographer needs. Dropbox’s business model is to sell people online storage – and that’s it: no ads, no shop, no ecosystem to buy into. It just works, flawlessly and sustainably. Dropbox offers 1TB for $10. If you need unlimited, you need a business plan at $12/user with at least 5 users (or at least pay for them). So unlimited comes at $60/month for up to 5 users. It then makes sense to share it with friends and family.
Here’s my setup: in my desktop PC I have a second internal 1TB drive which is only Dropbox. This is all synced with Dropbox Pro (as I only need 250 GB). When I come home from a shoot, I copy all photos into one folder inside Dropbox. Dropbox will immediately start syncing. Meanwhile I start to import the files into Lightroom and go through them, sort them, rate them and delete the worst ones. I can easily restore deleted files because Dropbox offers 30 days of file versioning; if you need more, get the optional Packrat add-on (unlimited versioning). Usually by the time I’m done going through them, Dropbox has already finished uploading (1GB/15min). If not, you can leave it on or just continue the next time you start up your PC (unlike Amazon).
The nice thing about Dropbox is its support for block-level sync. This means only the altered bits of a file will be uploaded again. When I mass-tag my files, I don’t need to upload 10 GB of files again, just the couple of bits from the new header. This also helps a lot with fully encrypted archives like Boxcryptor (which I recommend too). I really urge you to make block-level sync a must-have criterion for your decision; Microsoft and Google do not support it yet. On my other computers I enable selective sync for Dropbox and exclude the photos folder, as I don’t need the photos on my business laptop where I only have a 512 GB SSD.
Dropbox’s website and their Carousel app do not support RAW previews (what a shame). But I turned this into an advantage: when I’m done selecting and editing the best shots, I export 2048 px JPEGs into a JPEG sub-folder. With this, my timeline is not crammed with senseless second-choice photos but just my selection. I can then quickly make this sub-folder a photo album for easy access and sharing.
Every once in a while I make two copies of that photos folder to my external USB 3 drives, at different time intervals. I recommend one external, local copy beside your main work/archive drive and cloud storage, just for convenience. If your work drive completely fails, it’s faster to copy hundreds of GBs from an external drive than to download them. And as I learned from the Bitcasa incident, you’d better have a local copy! But you don’t need two externals or a RAID 1; I just have both drives for legacy reasons. And a RAID 1 on its own only protects you from a sudden drive loss. If your RAID controller breaks and corrupts your files on both drives, RAID 1 hasn’t got you covered.
One more tip: if you need to store more data in Dropbox than fits on your local drive (Dropbox doesn’t support external drives), you can a) unselect old folders from being synced (they stay stored online) or b) use the app ExpanDrive for Windows/Mac, which mounts Dropbox as an external drive in your system (it also works with Box, GDrive, OneDrive, S3, SFTP and WebDAV). You can then just copy any file to Dropbox without needing a local copy, but access is slow because you have to download the file first. A mixture of both techniques makes sense: keep your current folders in sync with the Dropbox client and access your archive with ExpanDrive.
Thanks, that’s very comprehensive and actually makes a lot of sense. I obviously didn’t give Dropbox the attention it deserves and will take a careful look at what the company offers.
Philippe, my co-author, quite rightly notes that we don’t need to be super bothered with the bad photographs. So his advice is to delete the less interesting photographs from the “active” locations (and leave them in some less easily accessed archive).
So, some form of low-tech archive (why not ExpanDrive) combined with a more accessible (RAID or Dropbox) selection seems like the way to go.
You should go for something you feel comfortable with. I’d recommend the easiest and most convenient way. I assume you started photography at least a couple of years ago (more likely a couple of decades). So if you are only starting to think about your backup strategy now, it seems you’re more the “lazy” guy when it comes to computer technology and safety, like me. Then I recommend a strategy that just happens, without much input and knowledge required from yourself. Making any kind of NAS really secure and reliable means a lot of research and time spent on technology rather than photography, and in the end you may end up with something that is not “magical” but just work.
As I said previously, you need to think about your workflow, budget and requirements. You state above that you need more than 1 TB of storage, so the regular Dropbox Pro is not for you. Are you willing to spend $60/month for the unlimited business account, or can you share it with others? Do/can you have fast upload speeds (>10 MBit)? Otherwise cloud storage won’t work for you at all. Do you have the time to get into network and data security and set up several redundant NAS boxes and automated scripts that do the work for you? If not, you won’t do it regularly, you won’t notice corruption or failure, it will fail and you will lose precious work. That’s why you should keep it simple and practicable.
I want to learn more about RAID NAS systems myself. Currently using a trio of hard drives for backups.
A NAS alone doesn’t protect you from failure and data loss. A NAS helps you access your files from different devices. You still need three different locations where your data is stored, at best with one of them off-site. That said, your backup trio is very good. You just don’t have protection against burglary, fire or water damage yet.
An interesting topic for us all.
Currently, I store all images on an external HD and Lightroom is linked to this. I have another external HD which mirrors the first HD automatically weekly. I have a third external HD to mirror the first which I sync manually weekly and store in a separate room of the house.
I think back to the days of film. I used to just put the negatives into a dresser drawer. They’re still in a drawer somewhere. If lost, they’re gone forever too. No backups, but less volume.
There are some brilliant and detailed workflows/backup schemes listed above. I’ll keep my comment relatively short and it’s more about a backup strategy than workflow.
First one is a warning: NEVER rely on a SINGLE backup, even if you are lulled into a sense of security with a mirrored and/or striped RAID set-up. I came very close to losing every digital and scanned file I had ever taken after thinking I was safe by backing my main library up to a Drobo drive. My problems occurred when the Drobo became corrupt. After initial investigations I found a couple of companies specialising in recovering files from these drives – but at a cost of thousands of pounds. Being a bit ‘techie’ and even more determined not to lose everything, I found a Linux-based utility which let me mount the drives, and I managed to recover almost everything at no more cost than a few hours’ work.
Since then (and this was around 5 years ago) I now do the following…
In my old tower-type Mac Pro I have the four main drive bays populated with 4 x 1TB drives configured for speed in a striped array. I use this as the ‘working volume’ and copy everything onto it initially, and also run my LR catalogues from it (I have a PCI card with SSDs installed for the Boot/Apps drive).
I then have a WD ‘My Cloud’ drive on the local network, hidden in a cupboard, and use CCC to back up to it. So I now have a ‘live’ copy and a local backup copy; all that is missing is the ‘cloud’ element.
For this I use a company I came across when I had my disaster; they are called Code 42 and have a system called Crashplan. If you use their cloud storage, I’ve found they are one of the most cost-effective out there. Have a look at their ‘Home’ plans: unlimited storage for $5 per month for an individual or $12.50 for a family (5 devices, I think). Their software on your computer is very reliable and very unobtrusive, but the real kicker, if you don’t want to pay, is that the same software allows you to designate any drive on a trusted computer (or computers) elsewhere on the web where you can store encrypted backups. So you carry out an initial backup to this drive locally on your machine, then you move it to your friend’s/relative’s/other computer and do the incremental backups over the web. Whether you choose to do the same for them or not is up to you, and you also have the option to do this to more than one location! Just for the record, I am not (and never have been) associated with the company in any way other than as a happy customer.
But remember NEVER RELY ON A SINGLE BACKUP!!
I agree with what you said and recommend; I’d just point out that Crashplan has some flaws. Their clients are written in Java and are therefore quite memory- and performance-hungry. Furthermore, Java can expose you to some major security issues, as seen in the past. One might try Backblaze, which has almost the same features but with native clients – and the same downside of only having its own servers in the US, and therefore slow connections from Europe or Asia. But both are reliable and have very good reputations.
I’m not sure I agree with the comments about Crashplan and Java. I run Crashplan on both Windows and Macs; none of them have the Java Runtime installed and Crashplan has run perfectly on all machines, never causing any hang-ups or crashes. It does use a bit of system memory (502 MB looking at Task Manager just now) but has very little impact on performance. When I installed the desktop client on the machines, it installed as a native app and at no time asked for Java to be installed.
Java applications are not bad per se, but they could be much better as native apps (performance, security). Take your 500 MB memory footprint: Dropbox and other native clients use 50-100 MB of memory, my backup app just uses 30 MB. The same is true for overall speed and other system resources. The good thing about Java is that you can develop one code base for every system you support. The downside is that you can hardly use any native capabilities of the OS (like services) and everything has to be translated in real time. This eats up resources.
Crashplan doesn’t ask for Java because they embedded the Java runtime some months ago (or last year). Just google for “Crashplan Java” and you’ll find some stories.
I’m not saying Crashplan or their app is bad, but there are better options in my opinion. However, they have a good reputation and service, and therefore loyal customers. It’s a good choice nonetheless.
You’re right, they do embed their own JRT so my apologies. Whilst Googling it I came across this article which is quite interesting and seems to sum things up quite nicely regarding the options for both Mac & PC users.
http://www.macworld.com/article/2915637/why-i-prefer-crashplan-for-online-backups.html