Researchers tackle problem of data storage for next-generation supercomputers

September 7, 2006

The U.S. Department of Energy has awarded a five-year, $11 million grant to researchers at three universities and five national laboratories to find new ways of managing the torrent of data that will be produced by the coming generation of supercomputers.

The innovations developed by the new Petascale Data Storage Institute will enable U.S. scientists to fully exploit the power of these new computing systems, which will be capable of performing millions of billions of calculations each second.

The institute combines the talents of computer scientists at Carnegie Mellon University, the University of California at Santa Cruz and the University of Michigan with those of researchers at the DOE's Los Alamos, Sandia, Oak Ridge, Lawrence Berkeley and Pacific Northwest national laboratories.

Increased computational power is necessary because scientists depend on computer modeling to simulate extremely complicated phenomena, such as global warming, earthquake motions, the design of fuel-efficient engines, nuclear fusion and the global spread of disease. Computer simulations provide scientific insights into these processes that are often impossible to obtain through conventional observation or experimentation. This capability is critical to U.S. economic competitiveness, scientific leadership and national security, the President's Information Technology Advisory Committee concluded last year.

But simply building computers with faster processing speeds -- the new target threshold is a quadrillion (a million billion) calculations per second, or a "petaflop" -- will not be sufficient to achieve those goals. Garth Gibson, a Carnegie Mellon computer scientist who will lead the data storage institute, said new methods will be needed to handle the huge amounts of data that computer simulations both use and produce.

"Petaflop computers will achieve their high speeds by adding processors -- hundreds of thousands to millions of processors," said Gibson, an associate professor of computer science. "And they likely will require up to hundreds of thousands of magnetic hard disks to handle the data required to run simulations, provide checkpoint/restart fault tolerance and store the output of these modeling experiments.

"With such a large number of components, it is a given that some component will be failing at all times," he said.

Today's supercomputers, which perform trillions of calculations each second, suffer failures once or twice a day, said Gary Grider, a co-principal investigator at the Los Alamos National Laboratory. Once supercomputers are built out to the scale of multiple petaflops, he said, the failure rate could jump to once every few minutes. Petascale data storage systems will thus require robust designs that can tolerate many failures, mask the effects of those failures and continue to operate reliably.

"It's beyond daunting," Grider said of the challenge facing the new institute. "Imagine failures every minute or two in your PC and you'll have an idea of how a high-performance computer might be crippled. For simulations of phenomena such as global weather or nuclear stockpile safety, we're talking about running for months and months and months to get meaningful results," he explained.

The collaborating members of the Petascale Data Storage Institute bring a breadth of experience and expertise in data storage. "We felt we needed to bring the best and brightest together to address these problems that we don't yet know how to solve," said Grider, leader of Los Alamos' High Performance Computing Systems Integration Group.

Source: Carnegie Mellon University
