End of the line for Roadrunner supercomputer

Mar 31, 2013 by Susan Montoya Bryan
Roadrunner supercomputer puts research at a new scale
Credit: LeRoy N. Sanchez, Records Management, Media Services and Operations

It's the end of the line for Roadrunner, a first-of-its-kind collection of processors that once reigned as the world's fastest supercomputer. The $121 million supercomputer, housed at one of the premier U.S. nuclear weapons research laboratories in northern New Mexico, will be decommissioned Sunday.

The reason? The world of supercomputing is evolving, and Roadrunner has been replaced by something smaller, faster, more energy-efficient and cheaper. Still, officials at Los Alamos National Laboratory say it remains among the 25 fastest supercomputers in the world.

"Roadrunner got everyone thinking in new ways about how to build and use a supercomputer," said Gary Grider, who works in the lab's high performance computing division. "Specialized processors are being included in new ways on new systems and being used in novel ways. Our demonstration with Roadrunner caused everyone to pay attention."

In 2008, Roadrunner became the first supercomputer to break the elusive petaflop barrier, processing just over a quadrillion calculations per second.

Los Alamos teamed up with IBM to build Roadrunner from commercially available parts. They ended up with 278 refrigerator-size racks filled with two different types of processors, all linked together by 55 miles (89 kilometers) of fiber-optic cable. It took nearly two dozen tractor-trailer trucks to deliver the supercomputer from New York to northern New Mexico.

The supercomputer has been used over the last five years to model viruses and unseen parts of the universe, to better understand lasers and for nuclear weapons work. That includes simulations aimed at ensuring the safety and reliability of the aging U.S. arsenal.

As part of the U.S. nuclear stockpile stewardship program, researchers used Roadrunner's high-speed calculation capabilities to unravel some of the mysteries of energy flow in weapons.

Los Alamos has been helping pioneer novel computer systems for decades. In 1976, the lab helped with the development of the Cray-1. In 1993, the lab held the title of world's fastest with the Thinking Machines CM-5.

"And to think of where we're going to be in the next 10 to 15 years, it's just mindboggling," said lab spokesman Kevin Roark.

Right now, Los Alamos—along with scientists at Sandia National Laboratories in Albuquerque and Lawrence Livermore National Laboratory in California—is using a supercomputer dubbed Cielo. Installed in 2010, it's slightly faster than Roadrunner, takes up less space and came in at just under $54 million.

Roark said in the next 10 to 20 years, it's expected that the world's supercomputers will be capable of breaking the exascale barrier, or one quintillion calculations per second.
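For scale, the two thresholds the article mentions can be checked with a quick back-of-the-envelope sketch (illustrative only; the constants below are just the standard SI prefixes, not figures from the lab):

```python
# SI prefixes behind the article's numbers (illustrative sketch).
PETAFLOP = 10**15  # one quadrillion calculations per second (Roadrunner, 2008)
EXAFLOP = 10**18   # one quintillion calculations per second (the "exascale barrier")

# An exascale machine is a thousand times faster than a 1-petaflop machine.
speedup = EXAFLOP // PETAFLOP
print(speedup)  # 1000
```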

There will be no ceremony when Roadrunner is switched off Sunday, but lab officials said researchers will spend the next month experimenting with its operating system and techniques for compressing memory before dismantling begins. They say the work could help guide the design of future supercomputers.




User comments : 8


MR166
2.9 / 5 (8) Mar 31, 2013
One would think that this could be used by some other branch of government, a university or for some sort of group computational services. Ours is a disposable society that is living way beyond its means.
TheGhostofOtto1923
2.6 / 5 (5) Mar 31, 2013
"One would think that this could be used by some other branch of government, a university or for some sort of group computational services. Ours is a disposable society that is living way beyond its means."
Why? It would cost them more and take more time to do it on this than on newer computers. Evolution requires death in order to function. Maintaining obsolescence is wasteful and inefficient. Progress is essential.
MR166
3.7 / 5 (6) Mar 31, 2013
It is still faster than 90% of the computers out there. One dedicated fiber-optic link to the outside world and you have a great, paid-for research computer available for use by corporations and universities.
weezilla
5 / 5 (2) Mar 31, 2013
It's expensive to run, takes up a lot of space, and only gets comparatively slower with time. Further, super-processing facilities need special housing, which are expensive to build in themselves. Anyhow, I wouldn't be surprised if most of the parts are donated or sold.
TheGhostofOtto1923
1 / 5 (2) Mar 31, 2013
PLUS aging supercomputers take more man-hours to operate and maintain, and cannot run on newer software developed for newer computers. And who would use them if they could get cheaper time on faster computers? Who said there was a shortage of time on newer computers?
VendicarE
not rated yet Mar 31, 2013
In terms of peak theoretical throughput it is fast. In terms of real-world application speed, it only reaches 10 to 20 percent of its peak.

This is typical for heterogeneous computing environments.

Bob_Kob
4 / 5 (1) Mar 31, 2013
I doubt they will simply be throwing it in the garbage. One of the 25 fastest in the world won't be wasted; I am sure there will be many bidders for it.
comendant
not rated yet Apr 01, 2013
I hope all of its parts are recycled or reused.
