'Charliecloud' simplifies Big Data supercomputing

At Los Alamos National Laboratory, home to more than 100 supercomputers since the dawn of the computing era, elegance and simplicity of programming are highly valued but not always achieved. In the case of a new product, ...

Roofline model boosts manycore code optimization efforts

A software toolkit developed at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) to better understand supercomputer performance is now being used to boost application performance for researchers ...

Data from the LHC converted to piano music

For almost a decade, the Large Hadron Collider (LHC) has been enabling scientists to develop a greater understanding of – and, in some cases, rewrite – the laws of physics.

Reaching for the stormy cloud with Chameleon

Some scientists dream about big data. The dream bridges two divided realms. One realm holds lofty peaks of number-crunching scientific computation. Endless waves of big data analysis line the other realm. A deep chasm separates ...

Supercharging the computers that will save the world

Computer scientist Gonzalo Rodrigo at Umeå University in Sweden has developed new techniques and tools to manage high performance computing systems more efficiently. This is part of an effort to meet the increasing demand ...
