- cross-posted to:
- technology@lemmy.ml
145+ petabytes for a single copy of the archive, and they currently have two copies of everything, for a total of around 290 petabytes.
The largest hard drive I’m aware of is 32 TB, so you’d “only” need over 9,000 (lol) of the largest drives ever made. I can’t even tell you what that would cost, since Seagate doesn’t have a publicly available price for the damn things!
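For anyone who wants to sanity-check that, here’s the back-of-the-envelope arithmetic as a quick script (assuming the 290 PB figure, 32 TB drives, decimal units, and raw capacity with no RAID or filesystem overhead):

```python
# Back-of-the-envelope: drives needed for ~290 PB at 32 TB per drive.
# Assumes decimal units (1 PB = 1000 TB) and raw capacity only --
# no redundancy or filesystem overhead.
total_pb = 290
drive_tb = 32

drives = (total_pb * 1000) / drive_tb
print(f"{drives:.0f} drives")  # ~9063 drives -- "over 9,000"
```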
And it would have to be replicated, so three copies somewhere (granted, proper backups are compressed).
Let’s say they have a proper backup compressed to (a randomly chosen) 60% of original size. That one backup of a 145 PB copy is about 87 PB. Add daily incrementals, so another what, 14 PB to get through two weeks of incrementals (roughly 1 PB per day)? Two live copies (290 PB) plus three replicated backup sets (3 × ~101 PB) puts you somewhere in the range of 600 PB total.
(I’m completely pulling numbers out of my ass; I’m not familiar with how such large datasets are managed from a DR perspective.)
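For anyone who wants to poke at those guesses, here’s the same estimate as a throwaway script. The 60% compression ratio, 1 PB/day incremental rate, and three-replica backup policy are all assumptions from the comment above, not real Internet Archive numbers:

```python
# Hypothetical DR storage estimate -- every parameter here is a guess
# from the comment above, not an actual Internet Archive figure.
copy_pb = 145            # one full copy of the archive
live_copies = 2          # IA reportedly keeps two copies of everything
compression = 0.60       # backup compresses to 60% of original size
incremental_pb_per_day = 1.0
retention_days = 14      # two weeks of daily incrementals
backup_replicas = 3      # backups replicated three ways

full_backup = copy_pb * compression                     # ~87 PB
incrementals = incremental_pb_per_day * retention_days  # ~14 PB
one_backup_set = full_backup + incrementals             # ~101 PB

total = copy_pb * live_copies + one_backup_set * backup_replicas
print(f"~{total:.0f} PB total")  # ~593 PB -- "in the range of 600 PB"
```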