IBM rose to CERN's Data Challenge
IBM has risen to a data management challenge presented by CERN,
the European Organization for Nuclear Research. The challenge was set in
anticipation of the needs of the Large Hadron Collider (LHC) Computing
Grid, the world's largest scientific computing grid. The LHC is expected to
produce 15 million gigabytes (15 petabytes) of data per year once it
becomes operational in 2007. Collecting and storing those data will clearly
be a mammoth task.
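To put that annual figure in perspective, a quick back-of-the-envelope
calculation (sketched here in Python; the assumption that production is
spread evenly over a year of continuous running is ours, not CERN's)
converts the volume into an average sustained data rate:

    # Rough scale check, assuming the 15 million GB is spread evenly
    # over a year of continuous running (an illustrative simplification).
    GB_PER_YEAR = 15_000_000            # 15 million GB = 15 petabytes
    SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 s

    petabytes_per_year = GB_PER_YEAR / 1_000_000
    avg_rate_gb_per_s = GB_PER_YEAR / SECONDS_PER_YEAR

    print(f"{petabytes_per_year:.0f} PB per year")     # 15 PB per year
    print(f"{avg_rate_gb_per_s:.2f} GB/s on average")  # ~0.48 GB/s

In other words, the grid must absorb roughly half a gigabyte every
second, around the clock, all year.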
IBM put its "storage virtualisation" software through an internal
read/write storage test that simulated the computing needs of the LHC.
Using the IBM TotalStorage SAN File System, the internal tests
"shattered performance records", according to IBM: more than 300,000
files, each 2 GB in size, were read from and written to disk at rates in
excess of 1 GB/s, for a total input/output of over 1 petabyte (1 million
gigabytes) in a 13-day period.
in a 13-day period. The test also simulated a range of failure scenarios,
such as disconnected storage targets. The system proved robust throughout.
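Those headline figures are internally consistent, as a quick sanity
check shows (again sketched in Python; the factor of two for reads plus
writes is our assumption about how the petabyte total was counted):

    # Sanity check of the reported test figures (illustrative arithmetic;
    # the factor of 2 assumes the petabyte total counts reads plus writes).
    FILES = 300_000
    FILE_SIZE_GB = 2
    RATE_GB_PER_S = 1.0   # "in excess of 1 GB/s"
    DAYS = 13

    one_pass_gb = FILES * FILE_SIZE_GB            # 600,000 GB = 0.6 PB
    total_io_gb = 2 * one_pass_gb                 # ~1.2 PB read + written
    sustained_gb = RATE_GB_PER_S * DAYS * 86_400  # 1,123,200 GB ≈ 1.12 PB

    print(total_io_gb, sustained_gb)  # both comfortably over 1 petabyte

Both routes land just above the quoted petabyte, so the file count, the
sustained rate and the 13-day total all hang together.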
The IBM TotalStorage SAN File System is designed to provide scalable,
high-performance and highly available management of large amounts of
data through a single file namespace, regardless of where the data
reside or which supported operating system hosts them.
For more details, visit the CERN and IBM websites.