Data Storage Problems?
Here I was, thinking that we next-gen genomics researchers (genomicists?) had a tough time with data management, simply because of the phenomenal number of gigabytes and terabytes of data we're generating. Then I read this article, titled How the Large Hadron Collider Might Change the Web [Scientific American].
When it's up and running, they'll be generating 15 petabytes (15 million gigabytes) per year, compared to the several terabytes per year we work with. Ouch.
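Just to put that gap in numbers, here's a quick back-of-the-envelope sketch; the ~5 TB/year figure for our own output is my rough stand-in for "several terabytes", not a measurement.

# Back-of-the-envelope comparison of LHC vs. next-gen sequencing data volumes.
LHC_PB_PER_YEAR = 15
GENOMICS_TB_PER_YEAR = 5  # assumed stand-in for "several terabytes per year"

lhc_tb_per_year = LHC_PB_PER_YEAR * 1000   # 1 petabyte = 1,000 terabytes
lhc_gb_per_year = lhc_tb_per_year * 1000   # 1 terabyte = 1,000 gigabytes

print(f"LHC: {lhc_gb_per_year:,} GB/year")  # 15,000,000 GB, i.e. 15 million gigabytes
print(f"Roughly {lhc_tb_per_year // GENOMICS_TB_PER_YEAR:,}x our yearly output")  # ~3,000x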
Anyhow, as a model of distributed data analysis, it's definitely of interest. Maybe we'll learn something from the physicists. Or maybe they can lend me a few terabytes of space for my projects...
1 Comment:
Fantastic post. I read this having just gone to a talk on the royal data storage mess a 1000-genomes project will entail. Always trust the physicists to push the envelope before we even know it exists.