A standard rule for many shops is to keep an offsite backup. But when you have as much data as Facebook, that would mean shipping such a huge quantity of tapes or hard drives that it would be a logistics nightmare. And no WAN connection is big enough to keep up with the flow of data into Facebook.
GigaOm's Katie Fehrenbacher reports that the third new data center in Prineville is actually a deep-storage facility.
The building, which will potentially be 84,000 square feet, will be filled with disc or flash storage and will act as the “backup to the backup to the backup,” storage for the facility’s data, explained Facebook’s Ken Patchett.
This method makes sense, and I actually use it at home and at the office as well. Whenever I touch my Parallels environment on my Mac, the whole VM needs to be backed up, which can be 20-30 GB. That change gets streamed to a Drobo FS from my home to my office, a separate building connected by gigabit Ethernet. Backing up this much data regularly to the cloud over my 5-megabit uplink would be painful, taking all day or more versus an hour or two over the LAN, depending on how well the wireless connection holds up.
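To make that comparison concrete, here's a rough back-of-the-envelope sketch in Python. The 25 GB image size, the gigabit LAN, and the 5-megabit uplink come from the scenario above; the effective-throughput fraction for the wireless hop is just my guess, not a measurement.

```python
# Back-of-the-envelope transfer times for the VM backup scenario above.
# Link speeds are line rates; real throughput, especially over wireless,
# will be lower, which is what the efficiency factor approximates.

def transfer_hours(size_gb: float, link_mbps: float, efficiency: float = 1.0) -> float:
    """Hours to move size_gb (decimal GB) over a link_mbps link
    running at the given fraction of line rate."""
    bits = size_gb * 8 * 1000**3                      # GB -> bits
    seconds = bits / (link_mbps * 1000**2 * efficiency)
    return seconds / 3600

vm_gb = 25  # middle of the 20-30 GB Parallels VM image from the post

# Gigabit Ethernet between the buildings, but assume the wireless hop
# cuts effective throughput to ~5% of line rate (my assumption).
print(f"LAN, 1 Gbps at ~5% effective: {transfer_hours(vm_gb, 1000, 0.05):.1f} h")

# The 5 Mbps cloud uplink, even at full line rate:
print(f"Cloud, 5 Mbps uplink:         {transfer_hours(vm_gb, 5):.1f} h")
```

Running this gives roughly 1.1 hours for the LAN case and about 11 hours for the cloud uplink at full rate, which matches the "hour or two" versus "all day or more" experience.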
Will on-site backup become more of a standard? Google, Facebook, Amazon, Apple, and Microsoft most likely already do this. It makes a lot of sense for hospitals, given the size of medical imaging data. Financial firms, on the other hand, still need to back up offsite for regulatory reasons.