DOE documents challenges in reaching exascale computing

May 21, 2014 by David Goddard

The US Department of Energy's Office of Science recently released a report detailing the top ten research challenges in reaching exascale computing, once again calling on UT's Jack Dongarra for input.

Dongarra, a Distinguished Professor in the College of Engineering's Department of Electrical Engineering and Computer Science, director of the Innovative Computing Laboratory, and one of five National Academy of Engineering members at UT, has long been at the forefront of exascale computing, or computing at roughly a thousand times the capability of recent supercomputers.
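
To put that in concrete terms (this is just the prefix arithmetic, not a figure from the report):

    1 exaflop/s = 10^18 floating-point operations per second
                = 1,000 x 1 petaflop/s (10^15 flop/s, the "peta" scale
                  first crossed by supercomputers in 2008)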

An icon in his field, Dongarra is a lead author of the Top500, a list he helped start in 1993 to rank the world's fastest computers, as well as of numerous software packages for high-performance computing.
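
The Top500 ranks machines by how fast they solve a large dense linear system, the LINPACK (HPL) benchmark. The Python sketch below is only a single-node toy illustration of that idea, not the actual distributed HPL code; the problem size n is arbitrary.

    import time
    import numpy as np

    # Toy illustration of the idea behind the LINPACK (HPL) benchmark:
    # solve a dense linear system Ax = b and report the achieved
    # floating-point rate. NOT the real HPL code.
    n = 2000
    rng = np.random.default_rng(0)
    A = rng.random((n, n))
    b = rng.random(n)

    start = time.perf_counter()
    x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
    elapsed = time.perf_counter() - start

    # Standard HPL operation count for an n x n dense solve.
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s on an n={n} dense solve")

A real Top500 run does the same accounting, but with the system distributed across the whole machine and the problem sized to fill its memory.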

That experience makes him uniquely positioned to help current and future scientists reach the exascale goal, and he has been named to the Department of Energy subcommittee studying how to achieve it.

"Numerous reports have documented the technical challenges and nonviability of simply scaling existing designs to reach exascale," said Dongarra. "Drawing from these reports and experience, our subcommittee has identified the top 10 computing technology advancements that are critical to making a productive, economically viable exascale system."

Dongarra explained that the technical challenges typically fall into one of five areas: energy consumption, memory performance, resilience, extreme parallelism (managing an enormous number of simultaneous calculations), and the use of big data.
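
Resilience, for example, means keeping a long computation alive when individual parts fail; at exascale component counts, failures become routine rather than exceptional. A common mitigation in high-performance computing is checkpoint/restart: periodically save enough state to resume after a crash. The single-process Python sketch below is purely illustrative (the file name, interval, and stand-in work loop are invented); exascale resilience demands far more scalable techniques.

    import os
    import pickle

    CKPT = "state.pkl"  # hypothetical checkpoint file name

    def load_state():
        # Resume from the last checkpoint if one exists; else start fresh.
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                return pickle.load(f)
        return {"step": 0, "total": 0.0}

    def save_state(state):
        # Write to a temp file, then rename atomically, so a crash
        # mid-write cannot leave a corrupt checkpoint behind.
        tmp = CKPT + ".tmp"
        with open(tmp, "wb") as f:
            pickle.dump(state, f)
        os.replace(tmp, CKPT)

    state = load_state()
    for step in range(state["step"], 1_000_000):
        state["total"] += step * 0.5      # stand-in for real numerical work
        state["step"] = step + 1
        if state["step"] % 100_000 == 0:  # checkpoint periodically
            save_state(state)

    print(f"done: total = {state['total']}")

If the process is killed partway through, rerunning it picks up from the last saved step instead of starting over.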

"Exascale is just the next major milestone in a process of exponential improvement that has continued for over half a century," said Dongarra. "The need to advance our understanding of the universe is without bounds, as is the need for modeling and computing the phenomena around us. For example, everyone is concerned about climate change and we need computers to help in modeling the climate."

"We will probably reach exascale computing in the United States around 2022," said Dongarra. "The computational challenge for doing oceanic clouds, ice, and topography are all tremendously important. And today we need at least two orders of magnitude improvement on that problem alone."

More information: The DOE report, "Top Ten Exascale Research Challenges," is available online: science.energy.gov/~/media/asc… Top10reportFEB14.pdf
