DOE documents challenges in reaching exascale computing

May 21, 2014 by David Goddard

The US Department of Energy's Office of Science recently released a report detailing the top ten research challenges in reaching exascale computing, once again calling on UT's Jack Dongarra for input.

Dongarra, a Distinguished Professor in the College of Engineering's Department of Electrical Engineering and Computer Science, director of the Innovative Computing Laboratory and one of five National Academy of Engineering members at UT, has long been at the forefront of exascale computing, or computing at roughly a thousand times the capability of recent supercomputers.
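
For scale, a back-of-the-envelope illustration (not a figure from the report): "peta" denotes 10^15 and "exa" denotes 10^18 operations per second, which is where the "roughly a thousand times" comes from.

    # Illustrative scale comparison: exa (10^18) vs. peta (10^15) FLOP/s.
    petascale_flops = 1e15   # sustained rate of a petascale system
    exascale_flops = 1e18    # target sustained rate of an exascale system
    print(f"Exascale / petascale: {exascale_flops / petascale_flops:,.0f}x")
    # -> Exascale / petascale: 1,000x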

An icon in his field, Dongarra is a lead author of the Top500, a list he helped start in 1993 to rank the world's fastest computers, and of numerous software packages for high-performance computing.
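
The Top500 ranks machines by the High-Performance Linpack benchmark, which times the solution of a large dense linear system. The following is a minimal single-node sketch of that idea; the real benchmark is a tuned, distributed-memory code, and production runs use far larger problem sizes than the one assumed here.

    # Time a dense solve Ax = b and convert elapsed time to a FLOP rate,
    # the same idea behind the LINPACK numbers that rank Top500 systems.
    import time
    import numpy as np

    n = 4000                              # problem size (HPL runs are far larger)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)             # LU factorization + triangular solves
    elapsed = time.perf_counter() - t0

    flops = (2 / 3) * n**3 + 2 * n**2     # standard LINPACK operation count
    print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s on one node")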

As such, he is uniquely positioned to help current and future scientists achieve the exascale goal and has been appointed to the Department of Energy subcommittee studying how to reach it.

Related: View the report and the ten research challenges it identifies

"Numerous reports have documented the technical challenges and nonviability of simply scaling existing designs to reach exascale," said Dongarra. "Drawing from these reports and experience, our subcommittee has identified the top 10 computing technology advancements that are critical to making a productive, economically viable exascale system."

Dongarra explained that the technical challenges typically fall into one of five areas: energy consumption, memory performance, resilience, extreme parallelism (coordinating an enormous number of simultaneous calculations), and the use of big data.
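
The energy item is easy to make concrete. Assuming the roughly 20-megawatt power envelope often cited for a practical exascale machine (an illustrative assumption, not a number quoted in this article), the budget per operation is tiny:

    # Rough energy-per-operation budget under an assumed ~20 MW envelope.
    target_flops = 1e18          # exascale: 10^18 operations per second
    power_budget_watts = 20e6    # assumed power envelope (~20 MW)
    joules_per_flop = power_budget_watts / target_flops
    print(f"Budget: {joules_per_flop * 1e12:.0f} pJ per operation")  # -> 20 pJ

Moving a word in from off-chip memory can cost well over that per-operation budget, which is why energy consumption and memory performance sit together at the top of such lists.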

"Exascale is just the next major milestone in a process of exponential improvement that has continued for over half a century," said Dongarra. "The need to advance our understanding of the universe is without bounds, as is the need for modeling and computing the phenomena around us. For example, everyone is concerned about climate change and we need computers to help in modeling the climate."

"We will probably reach exascale computing in the United States around 2022," said Dongarra. "The computational challenge for doing oceanic clouds, ice, and topography are all tremendously important. And today we need at least two orders of magnitude improvement on that problem alone."

More information: The DOE report, "Top Ten Exascale Research Challenges," is available online: science.energy.gov/~/media/asc… Top10reportFEB14.pdf
