IBM To Build Supercomputer For U.S. Government

Feb 03, 2009 by John Messina weblog

(PhysOrg.com) -- The U.S. Government has contracted IBM to build a massive supercomputer, bigger than any system in operation today. The supercomputer, called Sequoia, will be capable of delivering 20 petaflops (one petaflop is 1,000 trillion sustained floating-point operations per second) and is being built for the U.S. Department of Energy.

The U.S. Department of Energy will use the supercomputer in its nuclear stockpile research. The fastest system it has today delivers up to 1 petaflop. Sequoia will be located at the Lawrence Livermore National Laboratory in Livermore, Calif., and is expected to be up and running in 2012.
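
As a rough back-of-envelope illustration of those figures (a minimal Python sketch using only the numbers quoted above, not official specifications):

```python
# Compare Sequoia's planned performance with the DOE's current fastest system,
# using the figures quoted in the article.

PETAFLOP = 1_000_000_000_000_000  # 1 petaflop = 1,000 trillion floating-point ops/second

sequoia_flops = 20 * PETAFLOP   # Sequoia's planned sustained performance
current_flops = 1 * PETAFLOP    # DOE's fastest system today, per the article

print(f"Sequoia: {sequoia_flops:.2e} operations per second")
print(f"Speedup over today's fastest DOE system: {sequoia_flops / current_flops:.0f}x")
```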

The Sequoia system will also require a massive power upgrade at Lawrence Livermore, which is increasing the amount of electricity available for all of its computing systems from 12.5 megawatts to 30 megawatts. The upgrade will require running additional power lines into the facility; Sequoia alone is expected to use approximately 6 megawatts.
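
The power figures give a rough sense of Sequoia's expected efficiency. A minimal sketch, again based only on the article's approximate numbers:

```python
# Rough power arithmetic from the figures in the article (illustrative only).

site_power_after_mw = 30.0    # megawatts available after the upgrade
sequoia_power_mw = 6.0        # Sequoia's expected draw
sequoia_flops = 20e15         # 20 petaflops

share_of_budget = sequoia_power_mw / site_power_after_mw
flops_per_watt = sequoia_flops / (sequoia_power_mw * 1e6)

print(f"Sequoia's share of the upgraded power budget: {share_of_budget:.0%}")
print(f"Implied efficiency: {flops_per_watt / 1e9:.1f} gigaflops per watt")
```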

The Sequoia system is so massive that IBM is first building a 500-teraflop system, called Dawn, to help researchers prepare for the larger 20-petaflop machine.

The Sequoia system will use IBM Power chips throughout, deploy approximately 1.6 million processing cores, and run the Linux operating system. IBM is still developing a 45-nanometer chip for the system that may contain 8, 16, or more cores. The final chip configuration has not been determined, but the system will have 1.6TB of memory when completed.
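
Since the chip configuration is still open, here is a quick sketch of how the quoted core count would map onto 8-core versus 16-core chips (illustrative arithmetic only; the actual processor count is IBM's to decide):

```python
# How many processors would be needed to reach ~1.6 million cores,
# for the two chip options mentioned in the article.

total_cores = 1_600_000   # approximate core count quoted above

for cores_per_chip in (8, 16):
    chips = total_cores // cores_per_chip
    print(f"{cores_per_chip}-core chips: roughly {chips:,} processors needed")
```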

IBM plans to build the supercomputer at its Rochester, Minn., plant. The cost of the system has not been disclosed.

© 2009 PhysOrg.com

User comments: 14

LuckyBrandon
1.3 / 5 (7) Feb 03, 2009
oh crap...have the manufacturer with the absolute worst hardware failure record (next to Sun anyways) build the largest supercomputer ever. Yea...not a smart idea...take it from a person who has worked EXTENSIVELY with all major manufacturers' servers.
moj85
2.5 / 5 (2) Feb 03, 2009
1.6TB of memory? That will be nothing in 2012. ;D
OregonWind
1.5 / 5 (2) Feb 03, 2009
moj85

1.6TB of memory and you think that is nothing?! Even for 2012 (only 3 years away) that is huge, and it really takes a supercomputer to deal with memory that big.
Sonhouse
3.8 / 5 (4) Feb 03, 2009
I think the 1.6 TB of memory is for each node, the whole system would have petabytes. As to the reliability issue, IBM big mainframes are a different order of fish than the little servers they mass produce. When IBM builds a mainframe, they are reliable, not like the server world. Does ANYONE make a reliable server? Just like the phones nowadays, remember the old AT&T bricks? You could throw them across the room and they would still work. The so-called phones you get at Best Buy and Target and such (home phones, not cells) suck so much, they seem to be designed by freshmen or high school students with no concept of either reliability or usability.
OregonWind
1 / 5 (1) Feb 03, 2009
The system will have 1.6TB of memory.
Chey
5 / 5 (3) Feb 03, 2009
Unreliable? Give me a break.... IBM Mainframes and System i Power Servers have MTBF measured in decades. Their mainframes don't break... period!
Bob_Kob
1 / 5 (4) Feb 03, 2009
Let's put the world's top nuclear calculations in the hands of a computer that is unreliable lol. Oops! The simulations never mentioned that half the world would be obliterated..
columbiaman
5 / 5 (3) Feb 03, 2009
LOL 1.6 million processing cores!? I would love having just one of those new 16 core processors that IBM is developing.
Szkeptik
not rated yet Feb 04, 2009
It would be a lot cooler if they named it Skynet :D
Soylent
not rated yet Feb 04, 2009
1.6TB of memory and you think that is nothing?!

Yes! Even the Earth Simulator (a mere 0.036 petaflops) had 10 TB of memory.

Blue Gene/L had 32 TB of RAM and 900 TB of disc space.
Palli
not rated yet Feb 04, 2009
1.6TB for 1.6 million cores? Is that like 1MB of on-chip cache per core, not counting external RAM?
denijane
not rated yet Feb 05, 2009
I wonder what they would do with that power. I have my doubts about it :(
moj85
not rated yet Feb 05, 2009
make the most delicious blueberry muffin recipes ever. Also, maybe create the infinite improbability drive?
Soylent
not rated yet Feb 07, 2009
The article is off by a factor of 1024; Sequoia will have 1.6 _petabytes_ of RAM.

I wonder what they would do with that power. I have my doubts about it :(

Precisely what they're claiming they'll use it for, probably. They're a signatory of the Comprehensive Nuclear Test-Ban Treaty, so to "make sure the nation's stockpile of nuclear weapons is safe and effective" they're going to keep modeling the different aspects of those weapons on ever larger computers as they become available. Also known as "stockpile stewardship".