The thermodynamics of learning

February 6, 2017 by Lisa Zyga, feature

In this model of a neuron, the neuron learns by adjusting the weights of its connections with other neurons. Credit: Goldt et al. ©2017 American Physical Society
While investigating how efficiently the brain can learn new information, physicists have found that, at the neuronal level, learning efficiency is ultimately limited by the laws of thermodynamics, the same principles that limit the efficiency of many other familiar processes.

"The greatest significance of our work is that we bring the second law of thermodynamics to the analysis of neural networks," said Sebastian Goldt of the University of Stuttgart, Germany. "The second law is a very powerful statement about which transformations are possible, and learning is just a transformation of a neural network at the expense of energy. This makes our results quite general and takes us one step towards understanding the ultimate limits of the efficiency of neural networks."

Goldt and coauthor Udo Seifert have published a paper on their work in a recent issue of Physical Review Letters.

Since all brain activity is tied to the firing of billions of neurons, at the neuronal level the question of "how efficiently can we learn?" becomes the question of "how efficiently can a neuron adjust its output signal in response to the patterns of input signals it receives from other neurons?" As neurons get better at firing in response to certain patterns, the corresponding thoughts are reinforced in our brains, as captured by the adage "neurons that fire together, wire together."
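
The "fire together, wire together" idea can be sketched as a simple Hebbian weight update. This is a toy illustration with assumed parameters (initial weights, threshold, learning rate, input pattern), not the model analyzed in the paper:

```python
import numpy as np

# Toy single neuron: it receives a binary input pattern x through weights w
# and fires when the weighted input sum crosses a threshold. A Hebbian-style
# update then strengthens the weights of inputs that were active while the
# neuron fired ("neurons that fire together, wire together").

def neuron_output(w, x, threshold=0.1):
    """Binary output: 1 if the weighted input sum exceeds the threshold."""
    return 1 if w @ x > threshold else 0

def hebbian_update(w, x, y, learning_rate=0.1):
    """Strengthen weights of inputs that are active together with the output."""
    return w + learning_rate * y * x

w = np.full(4, 0.1)               # small uniform initial weights (assumed)
pattern = np.array([1, 1, 0, 0])  # a recurring input pattern (assumed)

for _ in range(50):
    y = neuron_output(w, pattern)
    w = hebbian_update(w, pattern, y)

# After repeated exposure, the weights on the active inputs have grown,
# so the neuron fires ever more reliably for this pattern. (Real models
# add normalization or decay to keep this growth bounded.)
print(w)
```

In this sketch the weights on the two active inputs grow with each presentation while the others stay put, which is the reinforcement the article describes.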

In the new study, the scientists showed that learning efficiency is bounded by the total entropy production of a neural network. They demonstrated that the slower a neuron learns, the less heat and entropy it produces, which increases its efficiency. In light of this finding, the scientists introduced a new measure of learning efficiency based on energy requirements and thermodynamics.
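
The slow-learning effect has a familiar heuristic analogue in overdamped stochastic dynamics, where dragging a variable a fixed distance dissipates heat proportional to the square of the step size at each step. The following sketch uses that heuristic (with an assumed per-step dissipation formula, not the paper's calculation) to show why spreading the same total weight change over more, smaller steps produces less heat:

```python
# Heuristic toy model: moving a weight a total distance D in N equal steps,
# with per-step dissipated heat assumed to be friction * (step size)**2,
# gives total dissipation ~ friction * D**2 / N. More steps, less heat.

def total_dissipation(distance, n_steps, friction=1.0):
    """Total dissipated heat for covering `distance` in `n_steps` equal steps."""
    step = distance / n_steps
    return n_steps * friction * step ** 2

fast = total_dissipation(distance=1.0, n_steps=10)    # rapid learning
slow = total_dissipation(distance=1.0, n_steps=1000)  # slow learning

print(fast, slow)  # the slow schedule dissipates 100x less heat
```

Under this assumed scaling, making the learning 100 times slower cuts the dissipated heat by a factor of 100 for the same overall change, mirroring the qualitative tradeoff the study identifies.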

As the results are very general, they can be applied to any learning algorithm that does not use feedback, such as those used in artificial neural networks.

"Having a thermodynamic perspective on neural networks gives us a new tool to think about their efficiency and gives us a new way to rate their performance," Goldt said. "Finding the optimal artificial network with respect to that rating is an exciting possibility, and also quite a challenge."

In the future, the researchers plan to analyze the efficiency of learning algorithms that do employ feedback, as well as to investigate the possibility of experimentally testing the new model.

"On the one hand, we are currently researching what thermodynamics can teach us about other learning problems," Goldt said. "At the same time, we are looking at ways to make our models and hence our results more general. It's an exciting time to work on neural networks!"


More information: Sebastian Goldt and Udo Seifert. "Stochastic Thermodynamics of Learning." Physical Review Letters. DOI: 10.1103/PhysRevLett.118.010601. Also at arXiv:1611.09428 [cond-mat.stat-mech]



Comments

Feb 06, 2017
A good direction - we evidently process information, performing basic problem-solving at a fundamental level, via an entropy-reduction process. Informational entropy in our innate form (as opposed to Shannon or whatever metric) must align with thermodynamic, if not thermo-geometric, equilibria. Network entropies and efficiencies must be the very stuff of information and its processing.
Feb 07, 2017
When someone works their brain intensively, they feel hungry because a lot of glucose is needed. Glucose is fuel for the brain. If you give the brain fuel and cool it effectively, it should work better. It's like a processor in a PC.
Feb 07, 2017
"They demonstrated that, the slower a neuron learns, the less heat and entropy it produces, increasing its efficiency."

This suggests an interesting tradeoff between speed of adaptability to the environment and efficiency. Maybe it's not even a rigid tradeoff - perhaps we can get faster plasticity at the expense of decreased efficiency (e.g., by altering the chemical balance using stimulants like caffeine?).
Feb 08, 2017
Slow learning means low intellectual ability. Which is better: a fast processor with high heat generation and a powerful cooler, or a slow processor without a cooler? It depends on the particular application.
A processor works best on clean electrical power, free of interference (caffeine, etc.).
Feb 12, 2017
A study investigating to what extent learning is an irreversible process would be an interesting aspect of the thermodynamic approach to learning theory.
