The physicists, Terence E. Stuart, et al., from the University of Calgary in Alberta, Canada; ETH Zurich in Switzerland; and the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, have published their paper on the predictive power of quantum theory and alternative theories in a recent issue of *Physical Review Letters*.

“The fact that certain outcomes can only be predicted with probability 50% by quantum theory could in principle be explained in two very different ways,” coauthor Renato Renner of ETH Zurich told *Phys.org*. “One would be that quantum theory is an incomplete theory whose predictions are only random because we have not yet discovered the parameters that are relevant for determining the outcomes (and that another yet-to-be-discovered theory would therefore allow for better predictions). The other explanation would be that there is ‘inherent’ randomness in Nature. Our work excludes the first possibility. In other words, it is not only quantum theory that predicts randomness, but there is ‘real’ randomness in Nature.”

The physicists began by asking whether it might be possible to improve quantum theory’s predictive power by supplementing it with some additional information (i.e., a local hidden variable). With complete information about a scenario, classical theories can predict an outcome with 100% accuracy. But in the 1960s, physicist John Bell proved that no local hidden variable exists that would enable quantum theory to predict an outcome with complete certainty.
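To make Bell’s obstruction concrete, here is a minimal numerical sketch (illustrative only, not taken from the paper): the correlations quantum theory predicts for an entangled “singlet” state exceed the CHSH bound that every local hidden variable model must satisfy, so no such supplement can reproduce them.

```python
import numpy as np

# CHSH test: any local hidden variable model obeys |S| <= 2, where
# S = E(a,b) - E(a,b') + E(a',b) + E(a',b') and E is the correlation
# of the two measurement outcomes for angles chosen on each side.

def singlet_correlation(a: float, b: float) -> float:
    """Quantum prediction for a singlet state: E(a, b) = -cos(a - b)."""
    return -np.cos(a - b)

# Angle choices that maximize the quantum value.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (singlet_correlation(a, b) - singlet_correlation(a, b_prime)
     + singlet_correlation(a_prime, b) + singlet_correlation(a_prime, b_prime))

print(f"quantum CHSH value |S| = {abs(S):.4f}")  # 2.8284 = 2*sqrt(2)
print("local hidden variable bound: |S| <= 2")   # quantum exceeds it
```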

However, Bell’s work didn’t rule out the possibility that quantum theory’s predictive power could be improved a little bit, nor did it refute the existence of any alternative probabilistic theory that has more predictive power than quantum theory.

One recent proposal for improving quantum mechanical prediction was suggested by physicist Tony Leggett in 2003. In this model, a hidden spin vector could increase the predictive probability of quantum theory by 0.25, from 0.5 to 0.75 (with 1.0 being complete certainty). Although Leggett showed that this model is incompatible with quantum theory, there has been no reason to assume that other models don’t exist.

However, in the new paper, the physicists have experimentally demonstrated that there cannot exist any alternative theory that increases the predictive probability of quantum theory by more than 0.165, under the single assumption that measurement parameters can be chosen independently of the other parameters of the theory (the “free choice” assumption). In other words, any current or future theory that improves upon quantum theory by more than 0.165 would either be falsified by the physicists’ experimental observations (as Leggett’s model is) or be incompatible with the free choice assumption (one example being the de Broglie-Bohm theory).
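In numbers, using only the figures quoted in this article, the survival test for a candidate theory reads roughly as follows (a sketch; the paper’s actual criterion is an inequality derived from the measured correlations):

```python
# Figures quoted in the article: quantum theory predicts the relevant
# outcomes with probability 0.5, and the experiment caps any improvement
# in predictive probability at 0.165 (for theories respecting free choice).
QUANTUM = 0.5
MAX_IMPROVEMENT = 0.165

def ruled_out(predictive_probability: float) -> bool:
    """True if a free-choice-respecting theory claims too much predictive power."""
    return predictive_probability > QUANTUM + MAX_IMPROVEMENT

print(ruled_out(0.75))  # Leggett-type model: True (falsified by the data)
print(ruled_out(0.60))  # a more modest improvement: False (not excluded here)
```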

As Renner explained, it is impossible to know exactly how many alternative theories there are because most are small variations of others. Some, such as the de Broglie-Bohm theory, date back to the early days of quantum theory while others were proposed more recently, partially motivated by information-theoretic considerations. He also added that giving up the free choice assumption has its own complications.

“Our work excludes any such theory,” he said. “But there is one way to circumvent this conclusion: One may give up the assumption of ‘free choice’ on which our result is based and replace it by a weaker notion. This is possible in principle, but would, for instance, necessarily lead to an incompatibility with relativistic space-time structure.”

The researchers’ experiments involved sending pairs of entangled photons through an apparatus and measuring at which of two detectors each photon arrives. The scientists explain that the 0.165 bound measured here could be tightened by improving the fidelity of the photon-pair sources and the quality of the measurement apparatuses. However, decreasing this bound by more than a factor of two would require improvements beyond current state-of-the-art technology.
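A toy model (ours, not the authors’ analysis) of why source fidelity matters: assuming singlet-like correlations washed out by a finite visibility V, any Bell-type quantity estimated from the data shrinks with V, which loosens the achievable bound.

```python
import numpy as np

# Toy model: an imperfect photon-pair source with visibility V produces
# correlations E(theta) = -V * cos(theta) instead of the ideal -cos(theta).
# A CHSH-like quantity then scales linearly with V, so lower fidelity
# means a weaker measured violation and a looser experimental bound.

def chsh_value(visibility: float) -> float:
    """CHSH value at the optimal angles for a source with given visibility."""
    return 2 * np.sqrt(2) * visibility

for V in (1.00, 0.99, 0.95, 0.90):
    print(f"V = {V:.2f}: |S| = {chsh_value(V):.3f}  (classical bound: 2)")
```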

Nevertheless, the experimental results provide the tightest constraints yet on alternatives to quantum theory. The findings imply that quantum theory is close to optimal in terms of its predictive power, even when the predictions are completely random. In the future, the physicists plan to further investigate the implications of these results.

“On the theory side, this work opens a number of interesting questions related to the nature of randomness,” Renner said. “One of them is whether randomness can be ‘amplified,’ i.e., whether there are processes that start with low-quality randomness and produce virtually perfect randomness.”
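As a classical cousin of that idea (and only a cousin: the quantum protocols the authors have in mind work under far weaker assumptions), von Neumann’s trick turns independent but biased coin flips into perfectly unbiased bits:

```python
import random

def biased_flip(p_heads: float) -> int:
    """A low-quality randomness source: a coin biased towards heads."""
    return 1 if random.random() < p_heads else 0

def von_neumann_extract(n_pairs: int, p_heads: float = 0.8) -> list[int]:
    """Pair up flips; 10 -> 1, 01 -> 0, discard 00 and 11. The two kept
    patterns are equally likely for any fixed bias, so the output is
    exactly unbiased (though this requires independent flips)."""
    bits = []
    for _ in range(n_pairs):
        a, b = biased_flip(p_heads), biased_flip(p_heads)
        if a != b:
            bits.append(a)
    return bits

bits = von_neumann_extract(100_000)
print(f"kept {len(bits)} bits, fraction of ones: {sum(bits) / len(bits):.3f}")
```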

The scientists already have a first result in this direction, which was published earlier this year in *Nature Physics*.

**More information:**
Terence E. Stuart, et al. “Experimental Bound on the Maximum Predictive Power of Physical Theories.” *PRL* 109, 020402 (2012). DOI: 10.1103/PhysRevLett.109.020402

## antónio354

Antonio Saraiva

## holoman

http://www.coloss...gled.htm

## TheGhostofOtto1923

http://www.youtub...Dwm4PMJY

## yyz

You might want to check out "An experimental test of all theories with predictive power beyond quantum theory" by Stuart, Slater, Colbeck, Renner & Tittel (the same authors as the PRL paper) here:

http://arxiv.org/abs/1105.0133

## Torbjorn_Larsson_OM

I don't grok Portuguese, but both classical and quantum mechanics have reality built in, according to Dr. Johnson's maxim: 'if I hit a stone and the stone hits back, it is reality' (apocryphal history). In other words, reality is defined as "constrained reaction to constrained action": not everything goes, and we expect robustness of real phenomena.

In classical mechanics that is action and reaction in Newton's terms; in quantum mechanics, observation and observables.

So yes, they discuss theories of reality. More specifically, over the last decades decoherence and the wavefunction (amplitude and phase) have seemed amenable to testing as real objects, and the idea of quantum mechanics as a mere statistical theory about "what we can know" has started to be excluded. Now the latter is taking another hit.

## vacuum-mechanics

It seems that the weak point of quantum mechanics is that physicists do not understand the mechanism (i.e., how and why the theory works) behind what is observed. Indeed, knowing its mechanism (explained below), we would find that its working principle is something like tossing a coin.

http://www.vacuum...id=19=en

## alfie_null

Suggests payment in the form of an envelope that may or may not contain a check for $50.

## antialias_physorg

These are not exclusive statements. We expect robustness. But robustness can be something like "we expect a million coin tosses to be close to 50% heads and tails". This is a very robust occurrence, but a million heads isn't a "no go" result, just one so unlikely that you'd need universes full of coin tossers to ever have a reasonable chance of seeing it.

Everyday phenomena like "a rock hitting back" comprise way, way, WAY more events than a million 50/50 chances. So they are VERY robust, but need not exclude anything.
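A quick back-of-the-envelope sketch of just how lopsided those numbers are:

```python
from math import log10, sqrt

n = 1_000_000

# P(all heads) = 0.5**n underflows a float, so work with its logarithm.
log10_p = n * log10(0.5)
print(f"P(a million heads) ~ 10^{log10_p:.0f}")  # about 10^-301030

# Typical fluctuation: the heads count has standard deviation sqrt(n)/2,
# i.e. about 500 tosses, or 0.05% of n - "close to 50%" indeed.
sigma = sqrt(n) / 2
print(f"typical deviation: {sigma:.0f} tosses ({100 * sigma / n:.2f}% of n)")
```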

## Claudius

Of course, in a suitably robust virtual reality simulation, Dr. Johnson would be convinced of the reality of what he was experiencing. So what is reality anyway?

It seems we may have the ability to construct such simulations some day with direct to brain interfaces. The question of what is real and what is not real will become more and more relevant as we approach that ability.

David Deutsch in his "The Fabric of Reality" postulates that we are even now living within a vast virtual reality simulation. How could we tell that we are not? Stones kicking back won't be a good test.

## rebelclause