Big Computers For Big Science

August 23, 2004

A visiting neutron scattering scientist at ORNL sends data from her experiment to a San Diego supercomputer for analysis. The calculation results are sent to Argonne National Laboratory, where they are turned into "pictures." These visualizations are sent to a collaborating scientist's workstation at North Carolina State University, one of the core universities of UT-Battelle, which manages ORNL for DOE.

To make their discoveries, scientists must interact with supercomputers to generate, examine, and archive huge datasets. To turn data into insight, that interaction must happen on human time scales: minutes, not days or weeks.

Big science requires big computers that are more than scaled-up desktop PCs. Big computers differ fundamentally from PCs in their ability to model enormous systems, generate immense volumes of data, and, as the payoff, solve uniquely difficult scientific problems. To put the difference in perspective, next-generation science datasets will approach or exceed a petabyte. If one of today's desktop PCs had a disk able to hold a petabyte-sized file, it would take the PC more than three years to read that file.
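
A quick back-of-the-envelope calculation makes that figure concrete. Assuming (the article does not specify) a 2004-era desktop disk sustaining roughly 10 megabytes per second of sequential reads, one petabyte works out to a little over three years of nonstop reading:

# Rough check of the petabyte claim, assuming ~10 MB/s sustained
# sequential read throughput for a 2004-era desktop disk.
PETABYTE = 10**15          # bytes
READ_RATE = 10 * 10**6     # bytes per second (assumed)

seconds = PETABYTE / READ_RATE
years = seconds / (365 * 24 * 3600)
print(f"Time to read 1 PB at 10 MB/s: {years:.1f} years")  # ~3.2 years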

The Center for Computational Sciences at ORNL has been tasked by DOE with developing the next generation of scientific networks to address the challenges of large science applications. The techniques developed at Oak Ridge will eventually filter into the high end of the business world. Just as yesterday's scientific supercomputers became the heart of today's business and engineering computing, this network, called the DOE UltraScience Net, is expected to become the core of tomorrow's commercial networks.

Source: ORNL
