New limit to the Church-Turing thesis accounts for noisy systems

September 10, 2015, by Lisa Zyga

The question of what a computer is capable of, and what it is not, has intrigued computer scientists since the 1930s, when Alonzo Church and Alan Turing began investigating the capabilities and limits of computers. In a new study, researchers have explored the computing abilities of noisy physical systems—those that are disturbed by random fluctuations.

While previously it has seemed that physical systems may violate the Church-Turing thesis—a conjecture that in a sense defines a computer—here the researchers show that this is not the case for noisy systems due to a new computing limit. The results could have implications for designing super-Turing computers, or "hypercomputers," which are hypothetical devices that outperform all existing computers.

The researchers, Professor Mark Braverman and graduate student Jonathan Schneider at Princeton University, along with Assistant Professor Cristóbal Rojas at the Andrés Bello National University in Santiago, Chile, have published a paper on noise in computing in a recent issue of Physical Review Letters.

"The greatest significance of the study is introducing a new quantitative way of thinking about noise as it affects a dynamical system when that system is being viewed or used as a computer," Braverman said.

As the researchers explain, the presence of noise in a physical system can lead to unpredictable behavior that seems to be incomputable. If this is the case, it would violate the Church-Turing thesis.

"The physical interpretation of the 1930s Church-Turing thesis asserts that a physical system cannot be harnessed to perform computations that cannot be performed (in principle) by a standard computer," the scientists explained. "Yet, it is well-known that even basic physical laws can give rise to chaotic and unpredictable behavior. In some cases, the long-term or steady-state features of such systems can be mathematically proven to not be computable. This seemingly defies the Church-Turing thesis, which implies that all physically realizable systems can be simulated by a computer."
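The chaotic unpredictability the researchers mention can be seen even in toy systems. As an illustration (my example, not one from the paper), the logistic map with parameter r = 4 is chaotic: two trajectories that start a trillionth apart soon bear no resemblance to each other, which is why long-term prediction by direct simulation breaks down.

```python
# Illustrative only: the logistic map x_{n+1} = r * x_n * (1 - x_n)
# is chaotic at r = 4. A perturbation of one part in 10^12 in the
# initial condition grows until the two runs are completely different.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

a = trajectory(0.2, 50)
b = trajectory(0.2 + 1e-12, 50)  # tiny initial perturbation

# The maximum separation between the two runs grows to order 1,
# even though both stay confined to the interval [0, 1].
gap = max(abs(x - y) for x, y in zip(a, b))
```

This sensitivity to initial conditions is what makes the long-term behavior of deterministic chaotic systems seem to defy simulation.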

In the new study, however, the researchers show that computing the behavior of noisy systems is actually relatively straightforward, and therefore noisy systems are not capable of performing very complex computations.

"In this paper, we show that the introduction of a small amount of noise into physical systems allows their equilibrium behavior not only to be computed, but even computed relatively efficiently," the scientists wrote.

To explain why this works, the researchers presented a new limitation on the ability of physical systems to perform computation: their memory. To formulate this so-called "space-bounded Church-Turing thesis" (SBCT), the researchers presented a new definition of memory for physical systems as "the maximum amount of information the evolving system can preserve per time step." Using this definition, they showed that a noisy physical system with a given amount of memory can be simulated by a computer that is limited to roughly the same amount of working memory as that of the physical system.
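The intuition behind the bounded-memory simulation can be sketched as follows. This is a minimal illustration under my own assumptions—a uniform noise model of scale `EPS`, a logistic update rule, and a grid of `1/EPS` cells—not the paper's actual construction: once noise of scale `EPS` is present, states closer together than `EPS` are effectively indistinguishable, so the system's equilibrium behavior can be tracked on a coarse grid using only about log2(1/EPS) bits of working memory per dimension.

```python
# Hypothetical sketch: EPS, CELLS, and the update rule are illustrative
# choices, not taken from the paper. Noise of scale EPS blurs states,
# so a histogram over ~1/EPS grid cells (bounded working memory)
# suffices to capture the long-run, equilibrium behavior.

import random

EPS = 1e-3            # noise scale, which sets the useful resolution
CELLS = int(1 / EPS)  # ~1000 cells, i.e. ~10 bits of effective state

def step(x, r=3.8):
    """One noisy iteration of a logistic map, clamped to [0, 1]."""
    x = r * x * (1.0 - x) + random.uniform(-EPS, EPS)
    return min(max(x, 0.0), 1.0)

def equilibrium_histogram(samples=20000, burn_in=1000):
    """Estimate the equilibrium distribution on a fixed, finite grid."""
    random.seed(0)
    counts = [0] * CELLS  # the bounded memory: one cell count per bin
    x = 0.5
    for i in range(burn_in + samples):
        x = step(x)
        if i >= burn_in:
            counts[min(int(x * CELLS), CELLS - 1)] += 1
    return counts

hist = equilibrium_histogram()
```

The point of the sketch is that the simulator's working memory is fixed by the grid size, which the noise level dictates—mirroring the paper's claim that a noisy system can be simulated with roughly the same amount of memory as the system itself preserves.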

The researchers also showed that the SBCT holds even if a computer has an infinite amount of time to try to solve a problem. The new limitation is not the first variation of the original Church-Turing thesis, but it is different from most other variations in that time is not a limiting factor.

The memory-based limit is of course not the only limit on computing ability—for instance, the complexity of a system is another limit, and often a more stringent one than memory. One advantage of the memory-based limit, however, is that it is generally much easier to estimate a system's amount of memory than other properties, such as complexity.

"The main usefulness of the results is in giving us a relatively simple estimate on the computational power of systems," Braverman said. "Such an estimate is useful when thinking about proposals for super-Turing computation devices."

Super-Turing computation devices, also called "hypercomputers," are hypothetical devices that compute functions that a Turing machine—or any computer available today—cannot. Since these functions are considered incomputable in the Church-Turing sense, super-Turing devices would either need to have an unbounded memory or otherwise circumvent the thesis.

Along with exploring super-Turing computers, the researchers also plan to further investigate the SBCT at a deeper level.

"The most interesting direction is further refining and understanding the notion of 'memory' of a general dynamical system," Braverman said. "Another obvious direction is finding further evidence for or against the SBCT. Evidence against would be particularly exciting, since it would point at devices that can overcome their apparent information-theoretic limitations."

More information: Mark Braverman, Jonathan Schneider, and Cristóbal Rojas. "Space-Bounded Church-Turing Thesis and Computational Tractability of Closed Systems." Physical Review Letters. DOI: 10.1103/PhysRevLett.115.098701


Sep 10, 2015
"the maximum amount of information the evolving system can preserve per time step."
and since Conservation of Information disallows loss of information, the maximum is also the minimum.
the only trick is making that information available.
Sep 10, 2015
I'm not clear on why anyone expected different results. Even in the realm of efficient computation rather than effective computation (e.g., complexity theory instead of computability theory), randomness isn't thought to add anything; this is known as the derandomization assumption. So random noise isn't believed to even gain you time or space efficiency, it would be the most shocking discovery in the history of computer science if it turned out randomness could give you hypercomputation.
Sep 10, 2015
"the maximum amount of information the evolving system can preserve per time step."
and since Conservation of Information disallows loss of information, the maximum is also the minimum.
the only trick is making that information available.

Ogg, conservation of information doesn't apply here. Since a computer is not a closed system, information can leave it. That information isn't lost to the universe, but it's no longer accessible to the computer.
Spaced out Engineer
Sep 10, 2015
What if there only exists data? Would we then all exist as mere pervasive perturbating pontificators? Perhaps PPP's can go pee pee and redefine the remaining P as principle.

So are we lucky to exist in a supposedly open universe?

What if noise is relative to the system that is referencing it, or to the number of anticipatory nodes?

How can they say this when there really is no end to the tuning of neural networks for hidden variables? Stochastic processes can always change at a deeper level than the current sequence can fetch. There does not exist enough certainty.
Sep 11, 2015
This is timely, since I just discovered Gibbs' H-theorem. [ https://en.wikipe...-theorem ] It would prevent Poincaré recurrences and likely Boltzmann Brains, but it needs a way to lose explicit microscale reversibility. Deterministic chaos would be one such way.

And if the physics of the observable universe is limited by its equivalent CS memory, it will effectively experience the requisite blurring sooner or later. (Analogous to how deterministic chaos also runs out of CS resources sooner or later.)
