The thermodynamics of computing

April 11, 2018 by Felix Würsten, ETH Zurich
An integrated cooler: Heat production is now the limiting factor in information processing. Credit: Colourbox

Information processing requires a lot of energy. Energy-saving computer systems could make computing more efficient, but the efficiency of these systems can't be increased indefinitely, as ETH physicists show.

As steam engines became increasingly widespread in the 19th century, the question soon arose as to how to optimise them. Thermodynamics, the physical theory that resulted from the study of these machines, proved to be an extremely fruitful approach; it is still a central concept in the optimisation of energy use in heat engines.

Heat is a critical factor

Even in today's digital age, physicists and engineers hope to make use of this theory; it is becoming ever clearer that neither the clock rate nor the number of chips is the limiting factor for a computer's performance, but rather its energy turnover. "The performance of a computing centre depends primarily on how much heat can be dissipated," says Renato Renner, Professor of Theoretical Physics and head of the research group for Quantum Information Theory.

Renner's statement can be illustrated by the Bitcoin boom: it is not computing capacity itself, but the exorbitant energy use – which produces a huge amount of heat – and the associated costs that have become the deciding factors for the future of the cryptocurrency. Computers' energy consumption has also become a significant cost driver in other areas.

For information processing, the question of completing computing operations as efficiently as possible in thermodynamic terms is becoming increasingly urgent – or to put it another way: how can we conduct the greatest number of computing operations with the least amount of energy? As with steam engines, fridges and gas turbines, a fundamental principle is at stake here: can the efficiency be increased indefinitely, or is there a physical limit that fundamentally cannot be exceeded?
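One well-known physical limit of this kind, though not named in the article itself, is Landauer's principle: erasing a single bit of information must dissipate at least k_B·T·ln 2 of heat, where k_B is the Boltzmann constant and T the temperature. A minimal sketch of what that bound works out to:

```python
import math

# Boltzmann constant in joules per kelvin (2019 SI exact value)
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated when erasing one bit of information,
    per Landauer's principle: E >= k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K), erasing one bit costs at least ~2.87e-21 J
print(f"{landauer_limit(300.0):.3e} J per bit at 300 K")
```

The number is roughly seven orders of magnitude below what today's transistors dissipate per switching event, which is why the bound is of fundamental rather than immediate engineering interest.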

Combining two theories

For ETH professor Renner, the answer is clear: there is such a limit. Together with his doctoral student Philippe Faist, who is now a postdoc at Caltech, he showed in a study soon to appear in Physical Review X that the efficiency of information processing cannot be increased indefinitely – and not only in computing centres used to calculate weather forecasts or process payments, but also in biology, for example when converting images in the brain or reproducing genetic information in cells. The two physicists also identified the deciding factors that determine the limit.

"Our work combines two theories that, at first glance, have nothing to do with one another: thermodynamics, which describes the conversion of heat in mechanical processes, and information theory, which is concerned with the principles of information processing," explains Renner.

The connection between the two theories is hinted at by a formal curiosity: information theory uses a mathematical term that formally resembles the definition of entropy in thermodynamics. This is why the term entropy is also used in information theory. Renner and Faist have now shown that this formal similarity goes deeper than would be assumed at first glance.
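The formal resemblance is easy to make concrete: the Shannon entropy of a probability distribution, H = -Σ p log2 p, has the same shape as the Gibbs entropy of statistical mechanics, S = -k_B Σ p ln p, differing only by the constant factor k_B·ln 2. A small illustration (the function names here are ours, not from the study):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy in J/K: S = -k_B * sum(p * ln(p))."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# A fair coin (or an unknown bit) carries exactly one bit of entropy ...
fair = [0.5, 0.5]
print(shannon_entropy(fair))  # 1.0

# ... and the two entropies differ only by the constant factor k_B * ln(2),
# which is what ties the information-theoretic quantity to physical heat.
print(gibbs_entropy(fair) / shannon_entropy(fair))
```

Note that the ratio between the two entropies is exactly the Landauer cost per bit divided by the temperature, which is one way of seeing why the similarity "goes deeper" than notation.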

No fixed limits

Notably, the efficiency limit for the processing of information is not fixed, but can be influenced: the better you understand a system, the more precisely you can tailor the software to the chip design, and the more efficiently the information will be processed. That is exactly what is done today in high-performance computing. "In future, programmers will also have to take the thermodynamics of computing into account," says Renner. "The decisive factor is not minimising the number of computing operations, but implementing algorithms that use as little energy as possible."

Developers could also use biological systems as a benchmark here: "Various studies have shown that our muscles function very efficiently in thermodynamic terms," explains Renner. "It would now be interesting to know how well our brain performs in processing signals."

As close to the optimum as possible

It is no coincidence that Renner, a quantum physicist, is focusing on this question: with quantum thermodynamics, a new research field has emerged in recent years that has particular relevance for the construction of quantum computers. "It is known that qubits, which will be used by future quantum computers to perform calculations, must work close to the thermodynamic optimum to delay decoherence," says Renner. "This phenomenon is a huge problem when constructing quantum computers, because it prevents quantum mechanical superposition states from being maintained long enough to be used for computing operations."


More information: Philippe Faist et al. Fundamental Work Cost of Quantum Processes, Physical Review X (2018). DOI: 10.1103/PhysRevX.8.021011

