Getting around the 'uncertainty principle': Physicists make first direct measurements of polarization states of light

Mar 03, 2013
Weak measurement: as light passes through a birefringent crystal, the horizontally and vertically polarized components of the light spread out in space, but an overlap between the two components remains when they emerge. In a "strong" measurement the two components would be fully separated. Credit: Jonathan Leach

Researchers at the University of Rochester and the University of Ottawa have applied a recently developed technique to directly measure, for the first time, the polarization states of light. Their work both gets around an important constraint of Heisenberg's famous uncertainty principle and applies to qubits, the building blocks of quantum information theory.

They report their results in a paper published this week in Nature Photonics.

The direct measurement technique was first developed in 2011 by scientists at the National Research Council, Canada, to measure the wavefunction – a way of determining the state of a quantum system.

Such direct measurements of the wavefunction had long seemed impossible because of a key tenet of the uncertainty principle – the idea that certain properties of a quantum system could be known only poorly if certain other related properties were known with precision. The ability to make these measurements directly challenges the idea that a full understanding of a quantum system could never come from direct observation.

The Rochester/Ottawa researchers, led by Robert Boyd, who has appointments at both universities, measured the polarization states of light – the directions in which the electric and magnetic fields of the light oscillate. Their key result, like that of the team that pioneered direct measurement, is that it is possible to measure key related variables, known as "conjugate" variables, of a quantum state directly. The polarization states of light can be used to encode information, which is why they can be the basis of qubits in quantum information applications.

"The ability to perform direct measurement of the quantum wavefunction has important future implications for ," explained Boyd, Canada Excellence Research Chair in Quantum Nonlinear Optics at the University of Ottawa and Professor of Optics and Physics at the University of Rochester. "Ongoing work in our group involves applying this technique to other systems, for example, measuring the form of a "mixed" (as opposed to a pure) quantum state."

Previously, a technique called quantum tomography has allowed researchers to measure the information contained in these quantum states, but only indirectly. Quantum tomography requires intensive post-processing of the data, a time-consuming step that the direct measurement technique does not need. Thus, in principle, the new technique provides the same information as quantum tomography but in significantly less time.

"The key to characterizing any is gathering information about conjugate variables," said co-author Jonathan Leach, who is now a lecturer at Heriot-Watt University, UK. "The reason it wasn't thought possible to measure two conjugate variables directly was because measuring one would destroy the before the other one could be measured."

The direct measurement technique employs a "trick" to measure the first property in such a way that the system is not disturbed significantly and information about the second property can still be obtained. It relies on a "weak" measurement of the first property followed by a "strong" measurement of the second.

First described 25 years ago, weak measurement requires that the coupling between the system and what is used to measure it be, as its name suggests, "weak", so that the system is barely disturbed in the measurement process. The downside of this type of measurement is that a single measurement provides only a small amount of information; to get an accurate readout, the process has to be repeated many times and the results averaged.
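To make the averaging concrete, here is a minimal numerical sketch in Python (illustrative only, not the authors' code; the state, coupling strength, and pointer width are hypothetical). It simulates repeated weak measurements of the horizontal-polarization component: any single pointer reading is swamped by noise, but the average of many readings converges to the underlying expectation value.

    import numpy as np

    # Weakly measure the |H><H| projector on |psi> = alpha|H> + beta|V>.
    # The "pointer" is a Gaussian beam position whose width sigma is much
    # larger than the coupling g, so one shot reveals almost nothing and
    # barely disturbs the state, but the mean of many shots -> g*|alpha|^2.
    rng = np.random.default_rng(1)

    alpha2 = 0.7                  # |alpha|^2, weight of the |H> component
    g, sigma = 0.1, 1.0           # weak regime: g << sigma
    n_shots = 100_000

    # Each shot: the H component shifts the pointer by g, the V component by 0.
    is_H = rng.random(n_shots) < alpha2
    pointer = rng.normal(loc=np.where(is_H, g, 0.0), scale=sigma)

    print(pointer.mean() / g)     # ~0.7, i.e. |alpha|^2, despite noisy shots
    print(pointer.std())          # ~1.0: any single shot is dominated by noise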

Boyd and his colleagues used the position and momentum of the light as the indicator of the polarization state. To couple the polarization to the spatial degree of freedom they used birefringent crystals: when light goes through such a crystal, a spatial separation is introduced between the different polarizations. For example, if the light is made of a combination of horizontally and vertically polarized components, the two components spread apart in position as the light passes through the crystal. The thickness of the crystal sets the degree of separation, correspondingly small or large, and thereby controls the strength of the measurement, weak or strong.
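As a rough illustration of how crystal thickness sets the measurement strength, the sketch below assumes a walk-off angle of about 6 degrees (typical of calcite); the thicknesses and beam width are hypothetical and are not taken from the experiment.

    import numpy as np

    # Lateral walk-off between polarization components grows linearly with
    # crystal thickness; comparing it with the beam width indicates whether
    # the polarization measurement is "weak" or "strong".
    walkoff = np.deg2rad(6.0)     # assumed walk-off angle (roughly calcite)
    beam_waist = 300e-6           # assumed 300-micron beam width

    for thickness in (0.2e-3, 1e-3, 10e-3):      # 0.2 mm, 1 mm, 10 mm
        sep = thickness * np.tan(walkoff)        # H/V separation in metres
        regime = "weak (overlapping)" if sep < beam_waist else "strong (separated)"
        print(f"{thickness*1e3:5.1f} mm crystal -> {sep*1e6:7.1f} um  {regime}")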

In this experiment, Boyd and his colleagues passed polarized light through two crystals of differing thicknesses: the first, a very thin crystal that "weakly" measures the horizontal and vertical polarization state; the second, a much thicker crystal that "strongly" measures the diagonal and anti-diagonal polarization state. Because the first measurement is performed weakly, the system is not significantly disturbed, and the information gained from the second measurement is therefore still valid. This process is repeated several times to build up accurate statistics. Putting all of this together gives a full, direct characterization of the polarization states of the light.
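The arithmetic behind the reconstruction can be sketched with the standard weak-value relation (a textbook identity, not the authors' analysis code; the example state below is made up). Weakly measuring the projectors onto the horizontal and vertical states and post-selecting on the diagonal state yields weak values proportional to the wavefunction amplitudes themselves, so normalizing them recovers the state up to a global phase.

    import numpy as np

    # Weak value of the projector |a><a| with pre-selected state psi and
    # post-selected state f:  <f|a><a|psi> / <f|psi>.
    def weak_value(a, f, psi):
        return (f.conj() @ a) * (a.conj() @ psi) / (f.conj() @ psi)

    H = np.array([1, 0], dtype=complex)
    V = np.array([0, 1], dtype=complex)
    D = (H + V) / np.sqrt(2)            # diagonal post-selection state

    psi = np.array([0.8, 0.6j])         # made-up polarization state to recover

    # Both weak values share the common factor <D|H>/<D|psi>, so the pair
    # is proportional to (psi_H, psi_V); renormalizing removes the factor.
    wv = np.array([weak_value(H, D, psi), weak_value(V, D, psi)])
    reconstructed = wv / np.linalg.norm(wv)

    # Strip the arbitrary global phase to compare with psi directly.
    reconstructed *= np.exp(-1j * np.angle(reconstructed[0]))
    print(reconstructed)                # -> [0.8, 0.6j], matching psi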


User comments: 18


policetac
1 / 5 (2) Mar 03, 2013
David LaPoint's "The Primer Fields Part 3", submitted by the discoverer, clearly describes this in detail.
Alphonso
1.7 / 5 (3) Mar 03, 2013
"It's not nice to fool Mother Nature.."
ValeriaT
3.7 / 5 (9) Mar 03, 2013
It's actually only a fooling of stupid laws, proposed by schematically thinking people in the past. The Heisenberg uncertainty principle is relevant for a single observation only – it was never postulated for a mutually averaged sequence of repeated measurements. In this sense the interpretation in this article is just sensationalist journalism – the existing laws remain perfectly valid; we have just better realized the conditions for which they were developed. BTW the principle of the above experiment was already proposed in 1993 (Elitzur & Vaidman), so this progress is not even as fast as it may appear to unqualified readers.
Infinum
1 / 5 (3) Mar 03, 2013
The Heisenberg uncertainty principle is relevant for single observation


Exactly. This experiment does not really "save" any time, as the "weak" measurement has to be repeated many times to build up certainty about the "weak" result.

With a single measurement one gets one "strong", i.e. certain, result and one "weak", i.e. uncertain, result, which is 100% in accordance with Heisenberg's uncertainty principle.
vacuum-mechanics
1.4 / 5 (9) Mar 03, 2013
… Such direct measurements of the wavefunction had long seemed impossible because of a key tenet of the uncertainty principle – the idea that certain properties of a quantum system could be known only poorly if certain other related properties were known with precision. The ability to make these measurements directly challenges the idea that full understanding of a quantum system could never come from direct observation.

This is interesting; unfortunately the problem is that we cannot visualize what the 'abstract' wave function looks like. It is also interesting to note that even though we know we cannot measure momentum and position with greater accuracy than the uncertainty principle allows, the question remains why this is so, and what its mechanism is. Maybe this physical view could help us understand it.
http://www.vacuum...19&lang=en
LarryD
1 / 5 (1) Mar 03, 2013
Yes, I agree with ValeriaT, because that's how I (as a layman) always interpreted the uncertainty principle.
'...This process is repeated several times to build up accurate statistics...'
The way I interpret the article is that they have found a way to get more accurate figures from perhaps fewer trials. Would that be correct?
rubberman
1 / 5 (2) Mar 03, 2013
1st comment. Yes. The point is that the realization of how the energy waveform organizes itself directly correlates to its properties. Now we can manipulate them based on this in a way we couldn't before.
LarryD
1 / 5 (1) Mar 04, 2013
1st comment. Yes. The point is that the realization of how the energy waveform organizes itself directly correlates to its properties. Now we can manipulate them based on this in a way we couldn't before.

rubberman, thanks for the comment. However, since phase and group velocities are '...properties...' of the waveform, surely they would affect any result... or perhaps the other way round: the method of determination would alter the properties...
I am still tied to the 'classical' uncertainty principle, in that it would be impossible to get beyond a certain limit of accuracy because we wouldn't know how the system was affected by the determination. Surely the system under investigation would also affect the properties of the 'determinant' wave, thus adding complication to any method.
rubberman
1 / 5 (3) Mar 04, 2013
LarryD, a photon has many properties associated with wavelength. As a self-sustaining quantum of energy it has to generate an EM field to keep its own energy contained. The correlation is between the field intensity/configuration, spin and wavelength. With this understanding we can alter portions of the spectrum's frequencies on the quantum level without the entire waveform "collapsing"... and measure the changes (the weak measurement followed by the strong one). Hence each photon is capable of conveying information once the wavelengths have been isolated. The Heisenberg uncertainty principle's prime flaw is that it allows for a probability distribution of zero as a valid "position" for a waveform... this is impossible. Kind of like having us not existing as a valid aspect of a theory of our evolution.
LarryD
1 / 5 (1) Mar 04, 2013
LarryD, a photon has many properties associated with wavelength. As a self-sustaining quantum of energy it has to generate an EM field to keep its own energy contained. The correlation is between the field intensity/configuration, spin ....

Thanks again rubberman. The above quote does concern me a bit: 'self-sustaining', isn't that like something being 'Perpetual'?
'...The Heisenberg uncertainty principle's prime flaw is that it allows for a probability distribution of zero as a valid "position" for a waveform...'
Are we talking about the Heisenberg Picture here and the Energy State, delta E? This would not be 0 at the extremes.

SethD
1 / 5 (5) Mar 05, 2013
So the annoying quantum mumbo-jumbo was just an illusion all along.
Just as Einstein said.
rubberman
1 / 5 (2) Mar 06, 2013
"Thanks again rubberman. The above quote does concern me a bit 'self sustaining', isn't that like someything being 'Perpetual'?"

Well... perpetual is a pretty strong word. I believe the oldest ones we can detect are in the 13-billion-year-old range...
policetac
1 / 5 (2) Mar 09, 2013
Understanding the principal concept: Imagine a steel ball on a magnet. It rolls to the attracted side. Drill a dip into the magnet. The ball holds even if turned upside down. Make it all bigger in your mind and keep drilling until you break through. Put the ball back. In your mind's microscope you should see the ball hanging in mid-space. That is a representation of the fields moving. Visualize them moving toward each other, as they then repel. Imagine if you had two and had one pull to the other. Then quickly switch your magnet around so it now repels. If one position is held by triangular point compression, it becomes fixed. This can then create a "flip point" that can/will pull to itself, then with (possibly a true) "zero point" position where the flip of polarity occurs. The fields are created by the structure of the fields themselves, and the particle comes into existence through natural mutual attraction. But then again, what do I know. I'm banned here for life. :)
policetac
1 / 5 (2) Mar 09, 2013
I would go on to assert that my understanding so far suggests the "uncertainty principle" is in fact invalid on its face – that there would in fact be a certain level of predictive "existences" that could be intentionally created and measured. The "creation" of these resulting structures would be the "proof" of the experiment.
ValeriaT
1 / 5 (2) Mar 16, 2013
Weak measurement is nothing less, nothing more than the application of classical physics across the time scale. Repetitive measurements maintained across time dimensions make the quantum system classical and temporal in a similar way that increasing the number of qubits during one measurement makes it more spatial and delocalized. With repetition of measurements in time or space frames (stroboscopic or tomographic measurement) the measured system becomes gradually entangled not only with the observer but with its expanding memory, and it becomes classical in this way.
ValeriaT
1 / 5 (2) Mar 16, 2013
Instead of "direct measurement" I'd rather talk about "stroboscopic measurement", because 1) this denomination gives intuitive insight, how the "direct measurement" actually proceeds 2) it explains, why the "direct measurement" represents the dual time-scale counterpart to the space-scale "tomographic measurement" 3) it's better related semantically to its dual counterpart (tomography is a instrumentation method in similar way, like the stroboscopic technique).
LarryD
1 / 5 (1) Mar 16, 2013
In my simplistic view, I think any attempt at a physical prediction on the quantum level is bound to give a mixed result; that is, the 'attempt' will cause some disruption of a waveform simply because 'size' and 'energies' have similar characteristics. Rather like throwing, say, a tennis ball at a large rock, which would cause no reaction, whereas a small stone of similar weight to the ball might be moved.
I would agree that the HUP might only be an approximation, but surely in the end we would still be left with probabilities.
CQT
not rated yet Mar 25, 2013
The duration of any process is a measurement problem or a problem of measurement.

The way humans measure at a point where humans measure puts a stop to any or all probabilities in play before or after this point.

"The wavefunction in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states, but actual measurements always find the physical system in a definite state."
http://en.wikiped..._problem

(This "definite state" is the absence of probability.)

"If observers and their measuring apparatus are themselves described by a deterministic wave function, why can we not predict precise results for measurements, but only probabilities?"
http://en.wikiped..._problem

We have no absolute unit of measure. An arbitrarily chosen unit cannot be deterministic. We undermine a deterministic wave, and what that wave describes, with our measure – our choice of measure is random.
