Researchers prove Landauer was right in saying heat is dissipated when memory is erased

March 8, 2012 by Bob Yirka, report

The erasure protocol used in the experiment. Image: Nature 483, 187–189 (08 March 2012), doi:10.1038/nature10872
For over half a century, physicists and computer scientists have been troubled by a theoretical concept set forth by Rolf Landauer. He suggested that the very act of erasing a bit of memory in a digital system necessarily dissipates a minimum amount of heat. This little idea has bothered researchers for two reasons. First, if true, it means that within the next couple of decades fabricated memories will reach a point where they cannot be made any smaller, because of the heat that must be dissipated whenever memory is erased. Second, until now no one had been able to prove whether it was really true; now all that has changed, much to the dismay of computer engineers. Eric Lutz and his colleagues at the University of Augsburg in Germany have devised an experiment that confirms Landauer was right. They have published their findings in the journal Nature.

To settle the matter once and for all, Lutz and his team constructed a one-bit memory device using a very tiny glass bead and two equally tiny side-by-side wells for the bead to sit in. If the bead sat in the left well, that represented a "0" state; if in the right one, a "1". The wells were in fact optical traps, holding the bead in place with lasers. To change the state, a small barrier between the two wells was lowered and the potential in which they sat was tilted, allowing the bead to roll into the opposite well. The memory was erased by resetting the bead to its reference state. Because the device was too small to record directly how much heat, if any, was dissipated when the memory was reset, the team instead measured the speed at which the bead moved between the two wells, which allowed them to calculate the heat given off. And lo and behold, it turned out to match Landauer's original prediction.
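Landauer's bound itself is simple to evaluate: erasing one bit must dissipate at least k·T·ln 2 of heat, where k is Boltzmann's constant and T the temperature. A minimal sketch of that arithmetic (the constants below are standard values, not numbers from the experiment):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
EV = 1.602176634e-19  # one electronvolt in joules

def landauer_bound(temperature_k: float) -> float:
    """Minimum heat (in joules) dissipated when one bit is erased."""
    return K_B * temperature_k * math.log(2)

# At room temperature (~300 K) the bound works out to roughly
# 2.9e-21 J, i.e. about 0.018 eV per erased bit.
q = landauer_bound(300.0)
print(f"{q:.3e} J  ({q / EV:.4f} eV)")
```

Tiny per bit, but it is a hard floor: no memory technology, however cleverly engineered, can erase a bit at 300 K for less.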

Landauer came to make his prediction while grappling with Maxwell's demon, named for the 19th-century scientist James Clerk Maxwell, who dreamed up the idea of a tiny demon that could separate hot molecules from cold ones and then run a machine on the heat flowing from one to the other: in essence, a perpetual motion machine. Such a system would of course violate the second law of thermodynamics because, as we all know, you can't get something from nothing. Landauer suggested that energy would have to be expended whenever the demon's memory was reset (that is, erased), which would balance out the energy extracted, proving what everyone knew intuitively all along.

Explore further: Could Maxwell's Demon Exist in Nanoscale Systems?

More information: Experimental verification of Landauer’s principle linking information and thermodynamics, Nature 483, 187–189 (08 March 2012) doi:10.1038/nature10872

In 1961, Rolf Landauer argued that the erasure of information is a dissipative process. A minimal quantity of heat, proportional to the thermal energy and called the Landauer bound, is necessarily produced when a classical bit of information is deleted. A direct consequence of this logically irreversible transformation is that the entropy of the environment increases by a finite amount. Despite its fundamental importance for information theory and computer science, the erasure principle has not been verified experimentally so far, the main obstacle being the difficulty of doing single-particle experiments in the low-dissipation regime. Here we experimentally show the existence of the Landauer bound in a generic model of a one-bit memory. Using a system of a single colloidal particle trapped in a modulated double-well potential, we establish that the mean dissipated heat saturates at the Landauer bound in the limit of long erasure cycles. This result demonstrates the intimate link between information theory and thermodynamics. It further highlights the ultimate physical limit of irreversible computation.
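The abstract's key claim, that the mean dissipated heat saturates at the bound only in the limit of long erasure cycles, can be pictured with a finite-time correction of the form ⟨Q⟩ ≈ kT·ln 2 + B/τ for cycle duration τ. The functional form is the standard slow-driving scaling; the constant B below is a made-up value purely for illustration, not the device's measured one:

```python
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K
BOUND = K_B * T * math.log(2)  # Landauer bound per bit, J
B = 1e-19                   # hypothetical device constant, J*s (illustrative)

# As the erasure cycle time tau grows, the excess heat B/tau vanishes
# and <Q> approaches the Landauer bound from above.
for tau in (0.1, 1.0, 10.0, 100.0):  # cycle durations, s
    q = BOUND + B / tau
    print(f"tau = {tau:6.1f} s   <Q>/bound = {q / BOUND:7.2f}")
```

Fast erasure is far more dissipative than the bound; only slow, quasi-static erasure approaches it, which is exactly the regime the experiment had to reach.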





3.7 / 5 (6) Mar 08, 2012

If by "troubled" you mean "blown away with a really cool result vital for understanding thermodynamically reversible computers and quantum computers" then yes.
3.7 / 5 (3) Mar 08, 2012
Couldn't agree more, ppnlppnl. It is really neat to see this confirmed. Landauer's work is a cornerstone of quantum computing theory and has been instrumental in demystifying the QM "wave collapse", i.e. understanding decoherence.
not rated yet Mar 08, 2012
This article, as written here, and the summary do not distinguish between erasing memory and changing a single bit.

I haven't read the original article yet, though. But frankly, I'm not holding my breath here.
1 / 5 (1) Mar 08, 2012
He suggested that the very act of erasing a bit of memory in a digital system causes heat to be dissipated. .. if true, it will mean there will come a time within the next couple of decades when fabricated memories will reach a point where they cannot be made any smaller due to the heat that will be dissipated when memory is erased.
The uncertainty principle will limit them way before the heat-flux limit becomes dominant. This is one of the reasons why I don't believe that quantum computers could beat classical ones in information-processing density. In general, information is an abstract concept of formal theorists - in reality, every transfer of information is connected with an energy transfer, which is intrinsically dissipative. And the reset of memory is an information transfer like any other.
3 / 5 (2) Mar 09, 2012
What are we going to do about Moore's law, then?
not rated yet Mar 13, 2012
While wearing my freshly-waxed tinfoil hat this morning, it occurred to me that a memory device will fail and spontaneously erase if it gets too hot, which would release more heat that would in turn increase the temp of the memory device next to it and at some point this could become a self-sustaining process, sort of like getting a fire started. So, if a critical mass of sufficiently-dense data is erased it could start a digital 'forest fire' raging at nearly the speed of light through the internet until all the servers and computers attached to it melt (or maybe vaporize in a brilliant flash). This won't be a problem until memory reaches a critical density, but thanks to Moore's Law that could be next Tuesday.
not rated yet Mar 13, 2012
that a memory device will fail and spontaneously erase if it gets too hot
It will not erase to all zeroes: it will become random, and it may occasionally, spontaneously form the bitmap of the Mona Lisa or the source code of the Windows OS (albeit the latter seems a bit more probable to me). In AWT the natural state of the Universe is not the zero state or some other particular state, but the random state. Such a perspective indeed has deep cosmological and gnoseological consequences.
1 / 5 (1) Mar 17, 2012
Eventually Moore's Law will fail us regardless. Once the size of a transistor reaches one atom, it CAN'T get any smaller. I doubt that one could build a single-atom transistor, and certainly couldn't connect it to anything else without using more atoms for the connections, but that is an absolute limit.

Who knows, perhaps one (VERY far-off) day we will be making gravity circuits that use singularities, strings, etc. in place of atoms. Of course such a thing will require a mind-boggling understanding of reality and even more mind-boggling tech!!!
