How to improve data management in the supercomputers of the future

February 1, 2017, Carlos III University of Madrid
Credit: UC3M

Researchers at Universidad Carlos III de Madrid (UC3M) are establishing new foundations for data management in the supercomputing systems of the future. In recent decades, many scientific discoveries have depended on the analysis of an enormous volume of data, which is done essentially through computational simulations performed on a large scale in supercomputers. This type of machine is used to study climate models, the development of new materials, research into the origin of the universe, the study of the human genome and new applications in bioengineering.

At present, as an ever-increasing amount of information is collected and stored, science confronts a problem: The software that manages the latest generation of supercomputers was not designed for the scalability requirements expected in the coming years. In fact, in less than a decade, these infrastructures are going to be two orders of magnitude faster than current supercomputers.

"Today, these applications are encountering big problems of performance and scalability due to the exponential increase of data as a result of better instruments, the growing ubiquity of sensors and greater connectivity between devices," explained professor Florin Isaila, from the group ARCOS in the UC3M Department of Computer Science. "These days, a radical redesign of the computational infrastructures and management software is necessary to adapt them to the new model of science, which is based on the massive processing of data."

The objective of the project, "Cross-Layer Abstractions and Run-time for I/O Software Stack of Extreme-scale systems" (CLARISSE), is to increase the performance, scalability, programmability and robustness of the data management of scientific applications to underpin the design of next-generation supercomputers.

Historically, the software that manages data in supercomputers has been developed in layers, with little coordination in the global management of resources. "Nowadays, this lack of coordination is one of the biggest obstacles to increasing the scalability of current systems. With CLARISSE, we research solutions to these problems through the design of new mechanisms for coordinating the data management of the different layers," said Professor Isaila.
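The cross-layer coordination Isaila describes can be pictured as a shared control channel through which the otherwise independent layers of the I/O stack (application, middleware, storage) exchange state and adapt to each other. The following minimal sketch is purely illustrative: the class and topic names are hypothetical and do not come from CLARISSE's actual design.

```python
# Hypothetical sketch of cross-layer coordination via a publish/subscribe
# control plane. In a traditionally layered I/O stack, the storage layer
# has no channel like this and cannot react to middleware congestion.
class ControlPlane:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        # Register a layer's interest in events on a given topic.
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, event):
        # Deliver an event to every layer subscribed to the topic.
        for callback in self.subscribers.get(topic, []):
            callback(event)


plane = ControlPlane()
decisions = []

# Storage layer: throttle writes when the middleware reports a deep queue.
plane.subscribe(
    "io/congestion",
    lambda event: decisions.append(
        "throttle" if event["queue_depth"] > 100 else "normal"
    ),
)

# Middleware layer: publish its current request-queue depth.
plane.publish("io/congestion", {"queue_depth": 250})

print(decisions[0])  # → throttle
```

The point of the sketch is only that a shared coordination mechanism lets each layer make globally informed decisions instead of locally optimal ones, which is the class of problem the project targets.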

Jesús Carretero, the project's main researcher, UC3M full professor and head of ARCOS, explained, "At present, ARCOS is actively involved in several initiatives around the world to remodel the management software of future supercomputers, including the coordination of the CLARISSE project and the research collaboration network NESUS. The resulting synergies of these efforts are going to contribute substantially to accelerating scientific discovery in the coming decades."


More information: arcos.inf.uc3m.es/~florin/clarisse

