George C. Knee, a theoretical physicist at the University of Oxford and the University of Warwick, has created an algorithm for designing optimal experiments that could provide the strongest evidence yet that the quantum state is an ontic state (a state of reality) and not an epistemic state (a state of knowledge). Knee has published a paper on the new strategy in a recent issue of the *New Journal of Physics*.

While physicists have debated the nature of the quantum state since the early days of quantum theory (with, most famously, Bohr favoring the ontic interpretation and Einstein arguing for the epistemic one), most modern evidence has supported the view that the quantum state does indeed represent reality.

Philosophically, this interpretation can be hard to swallow, as it means that the many counterintuitive features of quantum theory are properties of reality, and not due to limitations of theory. One of the most notable of these features is superposition. Before a quantum object is measured, quantum theory says that the object simultaneously exists in more than one state, each with a particular probability. If these states are ontic, it means that a particle really does occupy two states at once, not merely that it appears that way due to our limited ability to prepare particles, as in the epistemic view.

What exactly is meant by a limited ability to prepare particles? To understand this, Knee explains that different quantum states must be thought of as distributions over the possible true states of reality. If there is some overlap between these distributions, then the states of reality in which a particle can be prepared are limited.

Currently it's not clear if there actually is any overlap between quantum state distributions. If there is zero overlap, then the particle must really be occupying two states at once, which is the ontic view. On the other hand, if there is some overlap, then it's possible that the particle exists in a state in the overlapping area, and we just can't tell the difference between the two possibilities due to the overlap. This is the epistemic view, and it removes some of the oddness of superposition by explaining that the indistinguishability of two states is a result of overlap (and human limitation) rather than of reality.
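The overlap idea can be made concrete with a toy calculation. In the sketch below (an illustration of the general concept, not a model from Knee's paper), two preparations are treated as Gaussian distributions over a hypothetical one-dimensional "true state" lambda, and their classical overlap is the integral of the pointwise minimum of the two densities:

```python
import numpy as np

def distribution_overlap(p, q, dx):
    """Classical overlap of two probability densities sampled on a grid:
    the integral of min(p, q), i.e. 1 minus the total variation distance."""
    return np.sum(np.minimum(p, q)) * dx

# Hypothetical example: two preparations modelled as unit-width Gaussians
# over a one-dimensional "true state of reality" lambda, centred 2 apart.
lam = np.linspace(-10, 10, 10001)
dx = lam[1] - lam[0]
p = np.exp(-0.5 * (lam - 0) ** 2) / np.sqrt(2 * np.pi)   # preparation A
q = np.exp(-0.5 * (lam - 2) ** 2) / np.sqrt(2 * np.pi)   # preparation B

omega = distribution_overlap(p, q, dx)
print(f"overlap = {omega:.3f}")  # 0 => purely ontic; > 0 => room for an epistemic reading
```

Zero overlap means no state of reality is shared between the two preparations; any positive overlap leaves room for the epistemic story.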

Framing the question in terms of overlap offers a way to test the two perspectives. If physicists can show that the indistinguishability of quantum states can somehow be explained by reality and not overlap, then that places tighter restrictions on the epistemic view and makes the ontic view more plausible.

A key to such tests is that the task of discriminating between two states always has a small error involved. Having complete, omniscient knowledge about reality should improve state discrimination. But by how much? This is the big question, and physicists are trying to show that the value of this "improvement due to the increased reality of the quantum states" is very large. This would mean that the overlap plays very little, if any, role in explaining why states are indistinguishable. It's not simply that physicists cannot accurately prepare the true state of reality, it's that the indistinguishability must be thought of as a fundamental property of the quantum states themselves.
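The minimum possible discrimination error referred to here is a standard result known as the Helstrom bound. The sketch below computes it for two hypothetical non-orthogonal qubit states (example states chosen for illustration, not taken from the paper):

```python
import numpy as np

def helstrom_error(rho0, rho1, p0=0.5):
    """Minimum average error probability for discriminating two quantum
    states rho0 and rho1, prepared with priors p0 and 1 - p0 (Helstrom bound):
    P_err = (1 - ||p0*rho0 - (1-p0)*rho1||_1) / 2."""
    M = p0 * rho0 - (1 - p0) * rho1
    trace_norm = np.sum(np.abs(np.linalg.eigvalsh(M)))  # M is Hermitian
    return 0.5 * (1 - trace_norm)

# Hypothetical example: the non-orthogonal pure states |0> and |+>.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(ket0, ket0)
rho1 = np.outer(ketp, ketp)

print(f"P_err = {helstrom_error(rho0, rho1):.4f}")  # ~0.1464: error is unavoidable
```

Even with the best possible measurement, the error stays above zero whenever the states are non-orthogonal; the experimental question is how much of that residual error overlap can account for.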

Currently, the best experimental data shows that the amount of error improvement that can be attributed to overlap is about 69%. In the new paper, Knee has proposed a way to reduce this value to less than 50% with current technology. As he explains, this would mean that "overlap is doing less than half of the necessary work in explaining the indistinguishability of non-orthogonal quantum states."

"The greatest significance of the work is the new knowledge about how to conduct experiments that can show the reality of the quantum state," Knee told *Phys.org*. "The big bonuses are that experimentalists will now be able to do more with less: that is, make tighter and tighter restrictions on the possible interpretations of quantum mechanics with fewer experimental resources. These experiments typically require heroic efforts, but the theoretical progress should mean that they are now possible with cheaper equipment and in less time."

To achieve such an improvement, Knee's work addresses one of the biggest challenges in this type of test, which is to identify the types of states and measurements that optimize the error improvement. This is a very high-dimensional optimization problem—with at least 72 variables, it is extremely difficult to solve using conventional optimization methods.

Knee showed that a much better approach to this type of optimization problem is to convert it into a problem that can be studied with convex programming methods. To search for the best combinations of variables, he applied techniques from convex optimization theory, alternately optimizing one variable and then the other until the optimal values of both converge. This strategy ensures that the results are "partially optimal," meaning that no change in just one of the variables could provide a better solution. And no matter how optimal a result is, Knee explains that it may never be possible to rule out the epistemic view entirely.

"There will always be wriggle room!" he said. "Certainly with the techniques known to us at the present time, a small amount of epistemic overlap can always be maintained, because experiments must be finished in a finite amount of time, and always suffer from a little bit of noise. That is to say nothing of the more wacky loopholes that a staunch epistemicist could try and jump through: for example, one can usually appeal to retrocausality or unfair sampling to get around the results of any 'experimental metaphysics.' Nevertheless, I believe that showing the quantum state must be at least 50% real is an achievable goal that most reasonable people would not be able to wriggle out of accepting."

One especially surprising and encouraging result of the new approach is that it shows that mixed states can work better than pure states for supporting the ontic view. Typically, mixed states are considered more epistemic and lower-performing than pure states in many quantum information processing applications. Knee's work shows that one advantage of mixed states is that they are extremely robust to noise, which suggests that experiments need far less precision than previously thought to demonstrate the reality of the quantum state.

"I very much hope that experimentalists will be able to use the recipes that I have found in the near future," Knee said. "It is likely that the general technique that I developed would benefit from some tweaking to tailor it to a particular experimental setup (for example, ions in traps, photons or superconducting systems). There is also scope for further theoretical improvements to the technique, such as combining it with other known theoretical approaches and introducing extra constraints to learn something of the general structure of the epistemic interpretation. The holy grail from a theoretical point of view would be to find the best possible experimental recipes and prove that they are as much! That is something I will continue to work on."

**More information:**
George C. Knee. "Towards optimal experimental tests on the reality of the quantum state." *New Journal of Physics*. DOI: 10.1088/1367-2630/aa54ab

## Whydening Gyre

A blank sheet of paper is a perfect quantum example - all possibilities (for that piece of paper) exist...

Ergo - quantum state.

However, those possibilities lead to other possibilities in other interacting quantum entities (you for example).

Simplify.

## Whydening Gyre

They could exist because you use a wrong interpretation of "epistemic" (knowledge).

Quit trying to use words you aren't fully adept in using.

Otherwise, you're just feeding your own ego.

The trick is to simplify, not complicate.

## TheGhostofOtto1923

"George Knee obtained an M.Sc. in theoretical physics from Imperial College London in 2010, a D.Phil. from the University of Oxford in 2014"

-D.Phil. Which explains the nonsense words ontic, epistemic, and omniscient. And also existential.

Why would someone screw up a perfectly good science education? A shocking number of them even believe in god you know.

There always has to be something more than what is. There IS no correct interpretation of epistemic. It's bilge.

## peter_bilski

Scientists are trying to establish the position or state of experimental subjects at a specific given time.

This may not be the right approach if Time itself is not constant and linear at this minute level. If Time, for example, is granular (consisting of NOW, PAST

## Homebrook

A particle simply cannot occupy two states at the same time. This violates the basic law of non-contradiction. This is the problem with "orthodox" Quantum Mechanics. It is simply impossible. It is absurd.

If such a thing were possible we have lost the basis of all science - logic/rationality. And science itself becomes impossible.

Bohmian mechanics does not have this fundamental problem and therefore wins as a theory.

## Da Schneib

On the other hand, I have to say that I'm not sure that ontic and epistemic understanding really are different when it comes to quantum reality. This also might be merely a matter of one's chosen viewpoint.

## Da Schneib

What is proposed is a third state: uncertainty. It is this or it is that is classical logic; quantum logic adds a third, classically unobservable state, thisthat if you like. This state collapses, if you like Bohr collapse, if observed; by classical logic once observed it must be either this or that. But if unobserved, it can be thisthat. This state is not possible in classical logic, or in classical physics. It is a purely quantum phenomenon. The Fluctuation Theorem suggests that this becomes more true the shorter times and smaller systems you observe.

[contd]

## antialias_physorg

Well, the problem here is our everyday definition of what a 'particle' is. We think of a particle as a 'small solid ball' (or somesuch). But that is just a convenience based on extrapolation from observed, macroscopic object behavior - and not based on any fundamental understanding/measurement of what stuff really is.

The contradiction you see here is not based on some impossibility, but probably on a faulty idea of what a particle should be vs. what it actually is.

Not really. It just shifts it from a deterministic to a probabilistic view. Probabilities are still logical/rational. Remember that our intuition (what you erroneously call 'logic' in your post) is a development based on (macroscopic) observations. That this may be at odds with realms we have never experienced directly is not surprising.

## Da Schneib

This makes your statement a category error; you are attempting to apply classical reasoning to quantum phenomena, and quantum states do not obey classical logic.

Accept the Born Rule; accept Feynman's statement that if you are not flabbergasted by quantum reality, you have not understood it. Accept that if classical reality is to be as we see it, quantum reality must also be as we see it. Accept that in the quantum reality uncertainty is a state. Do not impose your classical expectations on quantum reality; they do not work there. If it's small or fast or both, your prejudices are violated, and until you accept that you will not and cannot understand.

Bohm was naive; Wheeler, Feynman, and Cramer are sophisticated. I strongly recommend you review Wheeler-Feynman absorber theory and the Cramer Transactional Interpretation of Quantum Mechanics.

## shavera

From Bell's theorem, if we assume that there's some hidden ontic state, then that state *must* transmit information about itself to the entangled partner at faster than c (which means the information can travel backwards in time for some observers). If we assume that no such information transfer happens (because information going backwards in time is problematic), then there can't be a hidden ontic state determining behaviour.

## akvadrako

Relational QM is usually taken to imply the entire quantum state is real (ontic). If this experiment succeeds, it will support that view.

## akvadrako

It's not a *must*; that's just one of the possible solutions to the puzzle. The others are super-determinism and many worlds, both of which can be formulated as local and ontic.

## Whydening Gyre

However, quantum reasoning CAN be applied to classical phenomena, because (as proven by Murphy's Law)

classical states WILL obey quantum logic...

That said, there is no knowledge we CAN'T know, just knowledge we DON'T know - yet...

## Whydening Gyre

Except for the "can't" part, agreed. (you sound like Noumenon with that one)

Not hidden, un-extrapolated.

No. C IS the rate of info exchange(plus or minus a tiny bit). You are forgetting the AMOUNT of info in an exchange - is increasing.

Guess what that means...:-)

## Da Schneib

Hmmmm, no, there are things we CAN'T know. For example, we CAN'T know the position and momentum of a single particle to unlimited precision at the same time.

Heisenberg uncertainty, don'cha know.

## Da Schneib

No, actually @shavera is using a very specific technical definition of "hidden." It has a very specific meaning in quantum mechanics.

I know you feel like this stuff is philosophy, @Whyde, but you're missing a lot if you don't find out the specific technical meanings here.

[contd]

## Da Schneib

I'd be really, really careful arguing with @shavera about quantum mechanics. He's very good.

I'll do a bit of arguing in just a minute, but you want to really know what you're talking about before you engage him.

His point is (my argument in a moment) pretty solid. Bell's Theorem results in a couple of contradictory interpretations of quantum mechanics, and no one has yet figured out an experiment that will differentiate between them. More in a couple minutes when I get another brewski.

## Da Schneib

Bell's Theorem results in the conclusion that either local variables are real even though they're uncertain, or that variables aren't local and can be shared faster than the speed of light. It's not even clear that this is not a dichotomy, viz., you can make an experiment that shows that uncertain variables have real values, or that variables are shared across spacelike intervals, but not an experiment that shows both.

[contd]

## Da Schneib

The actual standard statement of the conclusions drawn from Bell's Theorem is that there are no local hidden variables. What this means is that there isn't some variable on a particle that gets carried along with it to a remote location and then results in a predetermined outcome of a measurement. So, for example, you can either conclude that when you measure the positions of two entangled particles exactly as they emerge from the generation part of your experiment, their momenta were exact at that time but you couldn't measure them, and then later when you measured the momenta they agreed but had varied in flight, or you can conclude that their momenta were uncertain but were correlated faster than light when you measured one of them. But you can't conclude BOTH.

At least I hope I got that right. @shavera will no doubt correct me if not.

## Da Schneib

Gave @shavera a 5 anyway; whoever gave him a 1 is probably an EUdiot.

## Whydening Gyre

DS,

Truly appreciate your input. You and others have training/background discipline that I don't. You see it in terms of the training, I see it in terms of untrained visualization. Not "philosophy", more like - Quantum for Dummies.

So,

not trying to argue. I just get excited at the ramifications of what/how I interpret...

IE; they don't entangle, they "combine". (It's an art thing...:-)

BTW, Crown is better...:-)

## Whydening Gyre

it doesn't have to transmit. It is part of, therefore instantly privy to, all information contained within ONE quantum entity, without speed of light lag. The SOL (interesting that we call our own source of light by that name) is only applicable to info passing separate entities.

IOW - SCALE.

I guess scientifically it is most correct to not accept what you haven't observed (until it is observed).

My way is to intuit based on observations you all have provided.

Not as EXACT, but still useful....:-)

## Da Schneib

When physicists say a particle is "entangled," they mean that it and another particle have generally one common property that is uncertain (and I mean Heisenberg uncertain), and these properties are dependent upon one another. If you measure it for one particle, you know it for the other. This doesn't mean all the properties are entangled; it's one or more, and generally only one. The quantities are dependent because of a conservation law.

So two particles emerge from a situation where a conservation law forces a property of both of them to be correlated, and this property is Heisenberg uncertain. If we measure this property consistently on one of the two particles, for an ensemble of pairs that emerge from this situation, we get a probabilistic outcome: a probability distribution. This property is said to follow the Born Rule.

[contd]

## Da Schneib

But if we consistently measure this property for *both* particles, despite the fact that we get a random value, *they always come out correlated*. So how can a property that has a random value (which we can tell because we get a probability distribution) come out correlated?

This is weird.

With a classical situation like this, we can show that the values will always come out correlated too, but in that case there is no probability distribution (because it's classical, there's no uncertainty). It's the *combination* of the Born Rule and the correlation that is different between the classical and quantum situations.

Now, what Bell showed in his eponymous Theorem is that this can't be due to a local hidden property that the two particles share that we can't measure, that determines the outcomes of these measurements.

And this makes the situation even weirder.

[contd]
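The quantum-versus-classical gap described in this comment is usually quantified with the CHSH form of Bell's theorem: any local-hidden-variable account obeys |S| <= 2, while the textbook singlet-state correlation E(a, b) = -cos(a - b) reaches 2*sqrt(2). A minimal sketch with the standard measurement angles (a generic illustration, not tied to any particular experiment):

```python
import numpy as np

def E(a, b):
    """Quantum prediction for the spin correlation of a singlet pair,
    measured along directions at angles a and b."""
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two measurement angles
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement angles

# CHSH combination: local hidden variables force |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828 > 2: no local-hidden-variable model fits
```

The violation is exactly the "weirder" part: each outcome looks random, yet the correlations exceed what any shared local recipe could produce.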

## Da Schneib

This leads inevitably to two possible solutions:

1. There are hidden properties, but they are not properties of the particles, but of the universe, that is, they are not *local* hidden variables but *global* hidden variables, or

2. There is a way that the two particles' entangled property gets "communicated" somehow faster than light between them.

Classical systems simply don't behave this way. It's weird, and it's something only quantum systems can do.

Some interpretations of quantum mechanics assert the first (Bohm and the TIQM) and some assert the second (Copenhagen with collapse). Lots of experiments have been done to determine which is right and which is wrong, but the problem there is, they *all* succeed! And that's the weirdest of all.

As Bohr famously said, "If you are not shocked by quantum mechanics, you haven't understood it."

## antialias_physorg

It gets even weirder because such communication does not constitute information interchange.

Definition of information interchange is: you set a priori values at the source, transmit and then measure a-posteriori values at the receiver. Correlation between the a priori state and the a posteriori state is information transmission.

Now here's the weird part: you can't *set* a priori values at the receiver that are entangled. You can make two entities entangled, but you can't *set* which will have which value when you measure it a posteriori (because such 'setting' would constitute an observation and break entanglement and you'd be back to a classical state)

So this quirk cannot be used for faster than light information transmission (it can be used for encryption, though, because encryption isn't information)
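The no-signalling point above can be checked numerically: for a Bell pair, Bob's reduced state is the same whether or not Alice measures, so her choice carries no message. A minimal numpy sketch (the partial-trace helper is written for the two-qubit case only):

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>) / sqrt(2)
rho = np.outer(bell, bell.conj())

def bob_marginal(rho):
    """Partial trace over Alice's qubit, leaving Bob's 2x2 reduced state."""
    r = rho.reshape(2, 2, 2, 2)                 # indices: (A, B, A', B')
    return np.einsum('abad->bd', r)             # sum over Alice's index

# Bob's state before Alice does anything:
rho_B = bob_marginal(rho)

# If Alice measures in the computational basis, the pair collapses to |00>
# or |11> with probability 1/2 each; Bob's averaged state is unchanged:
rho_after = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

print(np.allclose(rho_B, rho_after))  # True: Alice's choice is invisible to Bob
```

Both reduced states come out as the maximally mixed state I/2, which is why the correlation cannot be used as a faster-than-light channel.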

## Da Schneib

The reason it can be used for encryption is because the entanglement means that two separated particles have correlated values, but only if no one has measured them. This allows secure transmission of a one-time pad cipher key that cannot be viewed by any third party.

[contd]

## Da Schneib

One must note, however, that the entangled particles must be distributed to the two parties who will communicate, and that this can only happen at the speed of light at maximum. The particles themselves contain the elements of the one-time pad; without each party receiving one of the two entangled particles for each bit of the key, they do not share a key. The key is information, and is subject to the limitations of locality. What entanglement provides in this case is security.

Lots of entanglement communication scheme advocates fail to understand this. Thus we get various proposals for ansibles and such, and improper use of terms like "teleportation."
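The one-time-pad mechanics described above are easy to sketch. The snippet abstracts the quantum part away entirely: a locally generated random key stands in for the bits the two parties would obtain from their entangled pairs, which is exactly the piece that entanglement distributes securely.

```python
import secrets

def xor_bytes(data, key):
    """One-time pad: XOR each message byte with the matching key byte.
    Applying the same key a second time recovers the original message."""
    assert len(key) >= len(data), "one-time pad key must cover the message"
    return bytes(d ^ k for d, k in zip(data, key))

# Stand-in for the key the two parties would share via entangled pairs:
message = b"quantum state is real"
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)    # what a third party could intercept
recovered = xor_bytes(ciphertext, key)  # only a key holder can do this
print(recovered == message)  # True
```

Note that the key itself still has to be distributed, particle by particle, at no more than light speed, which is the comment's point about locality.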

## antialias_physorg

It's also a mathematically beautiful thing because it shows that if you encrypt a signal you aren't adding information.

On top of that it supports the idea that the speed of light is really the universal ruler (something that underpins relativity) - thereby making a neat connection between quantum mechanics and relativity.

## Da Schneib

The deep perception I see here is that there is a pattern in quantum mechanics and relativity in which classical alternatives turn out to be not as alternative as they at first seem. When physicists were dealing with the FitzGerald solutions, to explain the Michelson-Morley experiment, these solutions were described by quoting a line from a Lewis Carroll poem:

"But I was thinking of a plan:

To dye one's whiskers green,

And always use so large a fan

That they could not be seen."

I have found this pattern over and over again as I dug deeper into both relativity and quantum mechanics. Whenever I see it I know I've seen a deep principle of how the universe works.

## Whydening Gyre

The pattern is simple - for everything we observe up or down in scale, there is an opposite "scale" to balance it out.

"Find the fulcrum" is the game...

You never really will find it, because every change on one side causes a change on the other.

The only way to "win" at it, is to not play...

And where's the fun in that..:-)?
