Quantum measurement precision approaches Heisenberg limit

Feb 26, 2010 By Lisa Zyga feature
This illustration shows an adaptive feedback scheme being used to measure an unknown phase difference between the two red arms in the interferometer. A photon (qubit) is sent through the interferometer, and detected by either c1 or c0, depending on which arm it traveled through. Feedback is sent to the processing unit, which controls the phase shifter in one arm so that, when the next photon is sent, the device can more precisely measure the unknown phase in the other arm, and calculate a precise phase difference. Image credit: Hentschel and Sanders.

(PhysOrg.com) -- In the classical world, scientists can make measurements with a degree of accuracy that is restricted only by technical limitations. At the fundamental level, however, measurement precision is limited by Heisenberg’s uncertainty principle. But even reaching a precision close to the Heisenberg limit is far beyond existing technology due to source and detector limitations.

Now, using techniques from machine learning, physicists Alexander Hentschel and Barry Sanders from the University of Calgary have recently shown how to generate measurement procedures that can outperform the best previous strategy in achieving highly precise quantum measurements. The new level of precision approaches the Heisenberg limit, which is an important goal of quantum measurement. Such quantum-enhanced measurements are useful in several areas, such as atomic clocks, gravitational wave detection, and measuring the optical properties of materials.

“The precision that any measurement can possibly achieve is limited by the so-called Heisenberg limit, which results from Heisenberg's uncertainty principle,” Hentschel told PhysOrg.com. “However, classical measurements cannot achieve a precision close to the Heisenberg limit. Only quantum measurements can approach the Heisenberg limit. Yet devising quantum measurement procedures is highly challenging.”

Heisenberg's uncertainty principle sets the ultimate limit on achievable precision, and that limit depends on how many quantum resources are used for the measurement. For example, gravitational waves are detected with laser interferometers, whose precision is limited by the number of photons available within the duration of the gravitational wave pulse.
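
For reference, the two benchmark scalings here are standard results in quantum metrology (stated for context; the article does not spell them out). They relate the uncertainty of an estimated phase to the number of photons N:

```latex
% Phase uncertainty vs. photon number N:
% the standard quantum limit (independent photons) and the Heisenberg limit.
\Delta\varphi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}},
\qquad
\Delta\varphi_{\mathrm{HL}} \sim \frac{1}{N}.
```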

In their study, Hentschel and Sanders used a computer simulation of a two-channel interferometer with a random phase difference between the two arms. Their goal was to estimate the relative phase difference between the two channels. In the simulated system, photons were sent into the interferometer one at a time. Which input port the photon entered was unknown, so that the photon (serving as a qubit) was in a superposition of two states, corresponding to the two channels. When exiting the interferometer, the photon was detected as leaving one of the two output ports, or not detected at all if it was lost. Since photons were fed into the interferometer one at a time, no more than one bit of information could be extracted at once. In this scenario, the achievable precision is limited by the number of photons used for the measurement.
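
As a rough illustration of the simulated setup, the sketch below models single-photon statistics in an idealized two-channel interferometer. The function name send_photon, the cos² detection law, and the loss parameter are textbook assumptions chosen for this example; this is not code from the paper.

```python
import math
import random

def send_photon(phi, theta, loss=0.0):
    """Send one photon (qubit) through a two-channel interferometer.

    phi   -- the unknown phase difference between the two arms
    theta -- the controllable phase shift set by the processing unit
    loss  -- probability that the photon is lost before detection

    Returns 0 or 1 (detection at output port c0 or c1), or None if lost.
    """
    if random.random() < loss:
        return None  # lost photon: no information gained from this trial
    # Idealized interference statistics: the detection probability depends
    # only on the difference between the unknown and controlled phases.
    p0 = math.cos((phi - theta) / 2.0) ** 2
    return 0 if random.random() < p0 else 1
```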

As previous research has shown, the most effective quantum measurement schemes are those that incorporate adaptive feedback. These schemes accumulate information from measurements and then exploit it to maximize the information gain in subsequent measurements. In an interferometer with feedback, a sequence of photons is successively sent through the interferometer in order to measure the unknown phase difference. Detectors at the two output ports measure which way each of the photons exits, and then transmit this information to a processing unit. The processing unit adapts the value of a controllable phase shifter after each photon according to a given policy.
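
In code, such a feedback loop might look like the sketch below. It assumes one common parameterization from adaptive phase estimation - the policy is simply a list of phase increments, and each detection nudges the controllable shifter up or down - and it reuses send_photon() from the previous sketch. The actual update rule and estimator used in the paper may differ.

```python
import math

def run_policy(policy, phi):
    """One adaptive measurement run, using send_photon() from the sketch above.

    policy -- sequence of phase increments (delta_1, ..., delta_N), one per photon
    phi    -- the unknown phase to be estimated

    Returns the final controllable phase, taken here as the phase estimate.
    """
    theta = 0.0
    for delta in policy:
        outcome = send_photon(phi, theta)
        if outcome is None:
            continue  # lost photon contributes no information
        # The two outcomes push theta in opposite directions; the learning
        # algorithm's job is to tune the increments so theta homes in on phi.
        theta += delta if outcome == 1 else -delta
    return theta % (2.0 * math.pi)
```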

However, devising an optimal policy is difficult, and usually requires guesswork. In their study, Hentschel and Sanders adapted a technique from the field of artificial intelligence. Their algorithm autonomously learns an optimal policy based on trial and error - replacing guesswork with a logical, fully automatic, and programmable procedure.

Specifically, the new method uses a machine learning algorithm called particle swarm optimization (PSO). PSO is a “collective intelligence” optimization strategy inspired by the social behavior of birds flocking or fish schooling to locate feeding sites. In this case, the physicists show that a PSO algorithm can also autonomously learn a policy for adjusting the controllable phase shift.
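
A bare-bones PSO over such policy vectors could look like the following sketch, which builds on run_policy() above. The swarm size, inertia and attraction coefficients, and the "sharpness" fitness (which rewards estimates that cluster around the true phase) are illustrative choices for this example, not the settings reported in the paper.

```python
import math
import random

def sharpness(policy, trials=200):
    """Fitness of a policy: |<exp(i*(estimate - phi))>| averaged over random
    unknown phases; 1.0 for perfect estimates, near 0 for random ones."""
    acc = complex(0.0, 0.0)
    for _ in range(trials):
        phi = random.uniform(0.0, 2.0 * math.pi)
        est = run_policy(policy, phi)
        acc += complex(math.cos(est - phi), math.sin(est - phi))
    return abs(acc) / trials

def pso(dim, n_particles=20, iters=100, w=0.7, c1=1.4, c2=1.4):
    """Minimal particle swarm optimization: each particle is a candidate
    policy (a vector of dim phase increments); the swarm seeks maximum
    sharpness by blending each particle's memory with the group's best."""
    pos = [[random.uniform(0.0, math.pi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best policy so far
    pbest_f = [sharpness(p) for p in pbest]  # ...and its fitness
    gbest_f = max(pbest_f)
    gbest = pbest[pbest_f.index(gbest_f)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = sharpness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest
```

Calling, say, pso(dim=10) would learn increments for a ten-photon policy; the paper's result is that policies learned this way approach Heisenberg scaling as the photon number grows.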

As Hentschel and Sanders show, after a sequence of input qubits has been sent through the interferometer, the measurement procedure learned by the PSO algorithm delivers an estimate of the unknown phase whose precision scales close to the Heisenberg limit, setting a new precedent for quantum measurement precision. The new level of precision could have important implications for gravitational wave detection.

“Einstein’s theory of General Relativity predicts gravitational waves,” Hentschel said. “However, a direct detection of gravitational waves has not been achieved. Gravitational wave detection will open up a new field of astronomy that augments electromagnetic wave and neutrino observations. For example, gravitational wave detectors can spot merging black holes or binary star systems composed of two neutron stars, which are mostly hidden from conventional telescopes.”

More information: Alexander Hentschel and Barry C. Sanders. “Machine Learning for Precise Quantum Measurement.” Physical Review Letters 104, 063603 (2010). DOI: 10.1103/PhysRevLett.104.063603

User comments: 16

broglia
1.8 / 5 (5) Feb 26, 2010
If gravitational waves form longitudinal waves of the vacuum environment, then they cannot be detected, simply because of the uncertainty principle - they would form just the longitudinal portion of the CMB noise, which prevents us from observing objects more exactly.
axemaster
5 / 5 (7) Feb 26, 2010
Broglia, I'm taking quantum in college right now, and I have no idea what you're talking about...

The CMB is microwave radiation emitted a few hundred thousand years after the Big Bang, when neutral hydrogen formed and the universe became transparent to EM radiation. Gravitational waves and the CMB shouldn't be related at all.

Moreover, any gravity waves generated around that time would have stretched to very long wavelengths by now, making them pretty much impossible to detect.

The gravitational waves they are hoping to detect are those emitted by black holes, particularly when they coalesce (they make a big pulse). The CMB is irrelevant.
seneca
2 / 5 (4) Feb 26, 2010
..gravitational waves and the CMB shouldn't be related at all.
You can use the water-surface model to understand the relation between light and gravitational waves. The water surface corresponds to space-time, formed by vacuum foam: it appears so large just because energy spreads around so slowly in transversal waves, which correspond to light waves in vacuum. Gravitational waves are then density (scalar) waves of sound, spreading through the underwater. They're so weak just because they spread so fast. But waves faster than the surface waves cannot behave like true waves at the water surface; instead, they would appear as noise. This model doesn't exclude the existence of harmonic waves, though, in analogy to sound spreading through the SOFAR channel.

This model is supported by the holographic model of the Universe, too. In this model the Universe could not behave like a holographic projection unless the speed of gravitational waves were highly superluminal.
seneca
2.3 / 5 (3) Feb 26, 2010
..any gravity waves generated around that time would have stretched to very long wavelengths by now..
This stance can also be countered by using the surface-water-wave model of space-time, where the speed of transversal waves decreases with increasing frequency, while the speed of longitudinal waves (which form the analogy of GWs here) behaves in the opposite way.

Again, this model is supported, for example, by prof. Turok and Steinhardt's cyclic cosmology, in which gravitational waves are blue-shifted due to the propagation of gravity through the dimensional manifold.

http://arxiv.org/...59v2.pdf
http://arxiv.org/...53v1.pdf
seneca
2.3 / 5 (3) Feb 26, 2010
Although this model of gravitational waves has a robust meaning in dense aether theory, there is only subtle experimental evidence for it. The passage of gravitational waves would appear as a more or less temporal change in CMB intensity, as some events from the GEO600 interferometer indicate:

http://startswith...mg35.gif

We can compare it to our experience with underwater nuclear explosions, where the passage of a sound wave from underwater appears as a wave of noise at the water surface.

http://www.youtub...XJuv8tDM
baudrunner
5 / 5 (1) Feb 26, 2010
The problem with detecting gravitational waves is that no wave, as such, is actually produced. Rather, a shift in the gravitational field equilibrium, which is not reflected (very important), will occur when massive bodies interact in space. The process preserves that equilibrium. It's like an ongoing, web-like action. You pull the web in a number of directions, but no oscillating function is produced, ergo - no wave.
seneca
5 / 5 (1) Feb 26, 2010
..shift in the gravitational field equilibrium, which is not reflected, will occur when massive bodies interact in space..
Maybe you're right, maybe not - but I don't see any logic in this sentence. What is the "gravitational field equilibrium" supposed to be, for example?

In general, you should explain/predict unknown phenomena by using known phenomena, not by introducing new unknown concepts - because that just introduces tautology into the explanation.
copernicus
Feb 26, 2010
This comment has been removed by a moderator.
seneca
5 / 5 (1) Feb 26, 2010
Why not - the derivation of the number of Kaluza spheres in our Universe by using well-known constants is nice - but I don't see any way such a result could be tested.

This approach is somewhat analogous to the previous post of baudrunner: you're deriving untestable numbers from well-testable physical constants, thus increasing the entropy of human understanding. After all, this is the reason why string theory is considered a fringe theory: it predicts a lot of numbers, but these numbers cannot be tested so easily, being quite abstract. It lacks robust predictions at the logical level.
Question
1 / 5 (3) Feb 27, 2010
It would appear they are using a computer simulation in which the photon (pulse) may or may not be detected. What this tells me is that they would be using such a small quantity of light that, when it enters the splitter and is reduced by 1/2, it may register at either detector or at NO detector at all. It is such a small pulse that it would not be detected by both detectors, because if it were, they would automatically assume two or more photons were used and the pulse would be reduced in strength.

The experiment is flawed from start to finish.
Thrasymachus
not rated yet Feb 27, 2010
Seneca, in an earlier post, you write "You can use the water-surface model to understand the relation between light and gravitational waves. The water surface corresponds to space-time, formed by vacuum foam: it appears so large just because energy spreads around so slowly in transversal waves, which correspond to light waves in vacuum."

Later, you write, "...where the speed of transversal waves decreases with increasing frequency, while the speed of longitudinal waves (which form the analogy of GWs here) behaves in the opposite way."

However, light waves travel at the same speed c, regardless of their frequency or wavelength. This has been measured to be true to a very high degree of precision. Please explain. (dance crazy monkey, dance!)
seneca
Feb 27, 2010
This comment has been removed by a moderator.
seneca
2.3 / 5 (3) Feb 27, 2010
The relativistic model of light speed was entrenched in 1983 by the introduction of the SI meter unit based on the wavelength of krypton-86 light. Since the time unit is defined by the frequency of light, this effectively means the light speed is invariant by its very definition (in fact it isn't, because the second is based on the radiation of cesium-133 atoms, so dispersion of the vacuum foam could still apply here - but the differences would be really small).

But the moment we use, for example, the iridium prototype, changes in the speed of light in vacuum emerge, because the space-time inside matter expands faster than the rest of the Universe (space-time inside matter is condensed). This behavior manifests itself in the dilatation and evaporation of the iridium kilogram and meter prototypes, for example.

http://www.physor...s64.html
http://www.physor...759.html

These subtle phenomena render our Universe a much more dynamic environment than it appears at first sight.
seneca
3 / 5 (4) Feb 28, 2010
..This has been measured to be true to a very high degree of precision..
Well, only locally: light speed measured locally is really invariant. But the observation of gravitational lensing means the light propagates at a different speed through the lens. A relativist would say it's because the space-time is deformed there. But strictly speaking, such a claim is just a conjecture, because measuring the space-time deformation would require placing a clock inside this gravitational lens. In other words, we are assuming a measurement which was never done.

Gravitational lensing and time dilatation can never be observed at the same moment: if we visited a galactic cluster with a clock to prove the space-time deformation, we would observe the relativistic aberration disappear, because space-time is homogeneous at the center of the cluster.

This principal inconsistency in measurements brings a conceptual problem into relativity. A quite similar, dual problem exists with quantum mechanics.
KBK
1 / 5 (1) Mar 01, 2010
The problem with detecting gravitational waves is that no wave, as such, is actually produced. Rather, a shift in the gravitational field equilibrium, which is not reflected (very important),..........an ongoing, web-like action. You pull the web in a number of directions, but no oscillating function is produced, ergo - no wave.


Massive bodies in space are a secondary function, not primary. This leads to a situation where the effects of the interactions are at minimum a full logarithmically calculated magnitude less (and spherically progressive from all individual wave interactions).

Makes for interesting results, but the Newtonian gross calcs that Einstein used are known, even by him, to have been incorrect - corrected by him in 1927, via his proper inclusion of Maxwell's full works (not the edited ones), to achieve a proper and working Unified Field theory, which since 1927 has slowly been buried/removed due to its extreme ramifications.
KBK
1 / 5 (1) Mar 01, 2010
As far as I can tell, though, this article is not about precise measurement, but rather about enforced determination from the superposition dual-wave state... to a historical point of the past.

Time is linear; it moves, according to our observation, in one direction only. The past is locked and permanent, but the future has a minimum of two indeterminate states, which are fused into a single one upon observation... thus, in the moment, CREATING the flow of time and the past from the 'prior' moments of 'indeterminacy'.

The Hitachi experiment in the 1980s proved this. So to utilize a system of measurement that attempts to do a loopback and remove the open possibility of the future, fusing it into the locked character of the past, results in nothing but some bizarre form of attempting to force potentiality into a permanent state that reflects the past - i.e., to erase possibility and/or potential from the future.

And THAT is very seriously messed up.
broglia
1 / 5 (1) Mar 01, 2010
All these seemingly miraculous phenomena have a very trivial analogy at the water surface, where transversal waves are always affected by Brownian noise. We can arrange a 2D analogy of observation by using a pair of surface-wave detectors - the precision in detecting the location and speed of the observed objects would be affected by this background noise in a similar way as observations in vacuum are affected by CMB noise.

Such a model lets us think about quantum phenomena from a perspective that is otherwise unachievable for us, because in vacuum we can use only transversal waves for observation - gravitational waves are too weak and short-ranged to be available for more objective observations.

http://www.physor...511.html
VerGreeneyes
not rated yet Mar 01, 2010
Particle Swarm Optimization, eh? Milkyway@Home also uses that - they're also trying Differential Evolution, and have tried a Genetic search in the past. I wonder if these scientists have considered those alternatives as well?