Imagine being an explorer, cracking open a 10,000-year-old tomb, uncovering a priceless ancient artifact – and getting rickrolled. Our deep descendants might just get the pleasure, thanks to a Global Music Vault due to be built in Norway, featuring Microsoft’s Project Silica, a tough new data…
Realistically I think this will only be used for short-term (sub-100-year) storage, or for archives in continuous use, like a microfiche archive.
There are quite a few use cases where a government or company might be obligated to keep data for long periods.
I’m curious about the 10,000-year claim: does it apply to the full plate, or is it the average time to failure per some unit of data?
Since I’m sure error-correcting codes are used, it’s effectively one and the same.
I’d expect it’s something akin to an average half-life or whatnot, such that you can make multiple backups and further improve that number.
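If the figure really does behave like a half-life, the backup math is simple: each extra independent copy multiplies down the probability that everything is lost. A quick sketch (the exponential-decay model and the 10,000-year half-life are my assumptions, not anything Microsoft has stated):

```python
def survival_prob(t_years, half_life):
    """Probability a single copy still decodes at time t,
    assuming exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life)

def any_copy_survives(t_years, half_life, copies):
    """Probability at least one of several independent copies survives."""
    p_one = survival_prob(t_years, half_life)
    return 1 - (1 - p_one) ** copies

# One copy with a 10,000-year half-life, read at t = 10,000 years:
print(survival_prob(10_000, 10_000))         # 0.5
# Three independent copies at the same point in time:
print(any_copy_survives(10_000, 10_000, 3))  # 1 - 0.5**3 = 0.875
```

So three copies turn a coin flip into 87.5% odds, and the gain compounds with each additional copy, provided the failures really are independent (copies in the same vault wouldn’t be).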
Honestly, I’m curious how something could last over a few thousand years and not be, effectively speaking, eternal.
Like at a certain point, if it hasn’t failed by 5,000 years, what on earth would cause it to fail after another 5,000 years? What process is slow enough to “erode” the perfectly preserved object that it can’t get the job done in 5,000 years, but can get it done in 10,000?
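One standard answer: if failures are memoryless (exponential, like radioactive decay), then surviving the first 5,000 years tells you nothing at all — the hazard rate is constant, and the conditional odds of the next 5,000 years equal the unconditional odds of the first. A sketch of that, assuming an exponential model with a 10,000-year half-life (my assumption; real physical degradation is often wear-out, i.e. the hazard *increases* with age, which would make long survival even more reassuring):

```python
def p_survive(t, half_life=10_000):
    # Exponential (memoryless) survival curve: S(t) = 2**(-t / half_life)
    return 0.5 ** (t / half_life)

# Unconditional probability of lasting the first 5,000 years:
p_first = p_survive(5_000)
# Probability of lasting another 5,000, given it made it to 5,000:
p_next = p_survive(10_000) / p_survive(5_000)
print(p_first, p_next)  # both ~0.707 — past survival buys nothing
```

So under this model nothing new “gets the job done” in the second 5,000 years; it’s the same constant per-year risk the whole time, and a 10,000-year half-life just means the odds compound to 50% by then.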