(PhysOrg.com) -- In the classical world, scientists can make measurements with a degree of accuracy that is restricted only by technical limitations. At the fundamental level, however, measurement precision is limited by Heisenberg’s uncertainty principle. But even reaching a precision close to the Heisenberg limit is far beyond existing technology due to source and detector limitations.

Now, using techniques from machine learning, physicists Alexander Hentschel and Barry Sanders from the University of Calgary have recently shown how to generate measurement procedures that can outperform the best previous strategy in achieving highly precise quantum measurements. The new level of precision approaches the Heisenberg limit, which is an important goal of quantum measurement. Such quantum-enhanced measurements are useful in several areas, such as atomic clocks, gravitational wave detection, and measuring the optical properties of materials.

“The precision that any measurement can possibly achieve is limited by the so-called Heisenberg limit, which results from Heisenberg's uncertainty principle,” Hentschel told *PhysOrg.com*. “However, classical measurements cannot achieve a precision close to the Heisenberg limit. Only quantum measurements that use quantum correlations can approach the Heisenberg limit. Yet, devising quantum measurement procedures is highly challenging.”

Heisenberg's uncertainty principle ultimately limits the achievable precision depending on how many quantum resources are used for the measurement. For example, gravitational waves are detected with laser interferometers, whose precision is limited by the number of photons available to the interferometer within the duration of the gravitational wave pulse.
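The scaling at stake here is the standard one from quantum metrology (textbook background, not spelled out in the article): with N photons, independent classical-style measurements are bounded by the shot-noise (standard quantum) limit, whereas quantum correlations permit Heisenberg scaling:

```latex
\Delta\varphi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}
\qquad \text{vs.} \qquad
\Delta\varphi_{\mathrm{HL}} \sim \frac{1}{N}
```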

In their study, Hentschel and Sanders used a computer simulation of a two-channel interferometer with a random phase difference between the two arms. Their goal was to estimate the relative phase difference between the two channels. In the simulated system, photons were sent into the interferometer one at a time. Which input port the photon entered was unknown, so that the photon (serving as a qubit) was in a superposition of two states, corresponding to the two channels. When exiting the interferometer, the photon was detected as leaving one of the two output ports, or not detected at all if it was lost. Since photons were fed into the interferometer one at a time, no more than one bit of information could be extracted at once. In this scenario, the achievable precision is limited by the number of photons used for the measurement.
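The simulated setup can be sketched in a few lines. This is a minimal illustration under standard assumptions (the two-port interference law and a simple loss model); the function names and specific form are mine, not taken from the paper:

```python
import math
import random

def exit_port0_prob(phase_diff, control_phase):
    """Probability that a single photon exits output port 0,
    following the standard two-port interference law."""
    return math.cos((phase_diff - control_phase) / 2.0) ** 2

def send_photon(phase_diff, control_phase, loss_prob=0.0, rng=random):
    """Simulate one photon through the interferometer.

    Returns 0 or 1 for the exit port, or None if the photon is lost
    before detection. Each call yields at most one bit of information."""
    if rng.random() < loss_prob:
        return None  # photon lost: no detection event
    return 0 if rng.random() < exit_port0_prob(phase_diff, control_phase) else 1
```

Because each photon yields a single binary outcome (or nothing, on loss), the precision of any estimate built from these events is bounded by the number of photons used.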

As previous research has shown, the most effective quantum measurement schemes are those that incorporate adaptive feedback. These schemes accumulate information from measurements and then exploit it to maximize the information gain in subsequent measurements. In an interferometer with feedback, a sequence of photons is successively sent through the interferometer in order to measure the unknown phase difference. Detectors at the two output ports measure which way each of the photons exits, and then transmit this information to a processing unit. The processing unit adapts the value of a controllable phase shifter after each photon according to a given policy.
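A sketch of such a feedback loop, assuming one common update rule from adaptive phase estimation (step the controllable phase up or down by a policy-determined increment depending on which detector fired); the paper's exact rule may differ:

```python
import math
import random

def detect_photon(phase_diff, control_phase, rng=random):
    """Exit port (0 or 1) of one photon, via the two-port interference law."""
    p0 = math.cos((phase_diff - control_phase) / 2.0) ** 2
    return 0 if rng.random() < p0 else 1

def adaptive_estimate(policy, phase_diff, rng=random):
    """Send photons one at a time; after each detection, step the
    controllable phase up or down by the next policy increment,
    with the sign set by the outcome. The final controllable phase
    is taken as the estimate of the unknown phase difference."""
    control = 0.0
    for delta in policy:
        port = detect_photon(phase_diff, control, rng=rng)
        control += delta if port == 1 else -delta
    return control % (2 * math.pi)
```

The "policy" here is just the list of phase increments; finding a good list is exactly the hard design problem the next paragraph describes.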

However, devising an optimal policy is difficult and usually requires guesswork. In their study, Hentschel and Sanders adapted a technique from the field of artificial intelligence. Their algorithm autonomously learns an optimal policy through trial and error, replacing guesswork with a logical, fully automatic, and programmable procedure.

Specifically, the new method uses a machine learning algorithm called particle swarm optimization (PSO). PSO is a “collective intelligence” optimization strategy inspired by the social behavior of birds flocking or fish schooling to locate feeding sites. In this case, the physicists show that a PSO algorithm can also autonomously learn a policy for adjusting the controllable phase shift.
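A minimal, generic PSO sketch (standard textbook form with common default weights, not the authors' code; in the paper the "position" being optimized would be the vector of feedback-policy increments, while here it is demonstrated on a simple test function):

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100,
                 bounds=(-3.14, 3.14), rng=random):
    """Minimal particle swarm optimization: each particle remembers its
    own best position, and the whole swarm is attracted toward the
    global best found so far."""
    w, c1, c2 = 0.7, 1.4, 1.4  # inertia and attraction weights (common defaults)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + pull toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Smoke test: minimize a simple quadratic bowl
best, val = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=2)
```

In the phase-estimation setting, the objective `f` would score a candidate policy by the sharpness of the phase estimates it produces, which is what lets the swarm "learn" a measurement procedure rather than being handed one.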

As Hentschel and Sanders show, after a sequence of input qubits has been sent through the interferometer, the measurement procedure learned by the PSO algorithm delivers a measurement of the unknown phase shift whose precision scales close to the Heisenberg limit, setting a new precedent for quantum measurement precision. This new level of precision could have important implications for gravitational wave detection.

“Einstein’s theory of General Relativity predicts gravitational waves,” Hentschel said. “However, a direct detection of gravitational waves has not been achieved. Gravitational wave detection will open up a new field of astronomy that augments electromagnetic wave and neutrino observations. For example, gravitational wave detectors can spot merging black holes or binary star systems composed of two neutron stars, which are mostly hidden to conventional telescopes.”

**Explore further:**
Japanese Underground Gravitational Wave Detector

**More information:**
Alexander Hentschel and Barry C. Sanders. “Machine Learning for Precise Quantum Measurement.” *Physical Review Letters* 104, 063603 (2010). DOI:10.1103/PhysRevLett.104.063603

## axemaster

The CMB is microwave radiation emitted quite a while after the Big Bang, at recombination, when the universe first became transparent to EM radiation. Gravitational waves and the CMB shouldn't be related at all.

Moreover, any gravity waves generated around that time would have been stretched to very long wavelengths by now, making them pretty much impossible to detect.

The gravitational waves they are hoping to detect are those emitted by black holes, particularly when they coalesce (they make a big pulse). The CMB is irrelevant.

## seneca

This model is supported by the holographic model of the Universe, too. In that model, the Universe could not behave like a holographic projection unless the speed of gravitational waves were highly superluminal.

## seneca

Again, this model is supported, for example, by Prof. Turok and Steinhardt's cyclic cosmology, in which gravitational waves are blue-shifted due to the propagation of gravity through the dimensional manifold.

http://arxiv.org/...59v2.pdf

http://arxiv.org/...53v1.pdf

## seneca

http://startswith...mg35.gif

We can compare it to our experience with underwater nuclear explosions, where the passage of a sound wave underwater appears as a wave of noise at the water surface.

http://www.youtub...XJuv8tDM

## baudrunner

## seneca

In general, you should explain or predict unknown phenomena by using known phenomena, not by introducing new unknown concepts, because that just introduces tautology into the explanation.

## copernicus

## seneca

Feb 26, 2010

This approach is somewhat analogous to baudrunner's previous post: you're deriving untestable numbers from well-testable physical constants, thus increasing the entropy of human understanding. After all, this is the reason why string theory is considered a fringe theory: it predicts a lot of numbers, but these numbers cannot be tested so easily, being quite abstract. It lacks robust predictions at the logical level.

## Question

The experiment is flawed from start to finish.

## Thrasymachus

Later, you write, " ...where speed of transversal waves decreases with increasing frequency, while speed of longitudinal waves (which are forming analogy of GW here) behaves in the opposite way."

However, light waves travel at the same speed c, regardless of their frequency or wavelength. This has been measured to be true to an arbitrary degree of precision. Please explain. (dance crazy monkey, dance!)

## seneca

## seneca

Feb 27, 2010

But the moment we use, for example, the iridium prototype, changes in the speed of light in vacuum emerge, because the space-time inside matter expands faster than the rest of the Universe (space-time inside matter is condensed). This behavior manifests itself in the dilatation and evaporation of the iridium kilogram and meter prototypes, for example.

http://www.physor...s64.html

http://www.physor...759.html

These subtle phenomena render our Universe a much more dynamic environment than it appears at first sight.

## seneca

Gravitational lensing and time dilatation can never occur at the same moment: if we visited a galactic cluster with a clock to prove the space-time deformation, we would observe that the relativistic aberration disappears, because space-time is homogeneous at the center of the cluster.

This principal inconsistency in measurements brings a conceptual problem into relativity. A quite similar, just dual, problem exists with quantum mechanics.

## KBK

Massive bodies in space are a secondary function, not a primary one. This leads to a situation where the effects of the interactions are at minimum a full logarithmically calculated magnitude less (and spherically progressive from all the individual wave interactions).

Makes for interesting results, but the gross Newtonian calculations that Einstein used are known, even by him, to have been incorrect (corrected by him in 1927 via his proper inclusion of Maxwell's full works, not the edited ones) to achieve a proper and working Unified Field theory, which since 1927 has slowly been buried/removed, due to its extreme ramifications.

## KBK

Time is linear; it moves, according to our observation, in one direction only. The past is locked and permanent, but the future has a minimum of two indeterminate states, which are fused into a single one upon observation, thus, in the moment, CREATING the flow of time and the past from the 'prior' moments of 'indeterminacy'.

The Hitachi experiment in the 1980s proved this. So to utilize a system of measurement that attempts a loopback, removing the open possibility of the future and fusing it into the locked character of the past, results in nothing but some bizarre attempt to force potentiality into a permanent state that reflects the past, i.e., to erase possibility and/or potential from the future.

And THAT, is very seriously messed up.

## broglia

Such a model enables thinking about quantum phenomena from a perspective that is unachievable for us, because in vacuum we can use only transverse waves for observation; gravitational waves are too weak and too short-ranged to be available for more objective observations.

http://www.physor...511.html

## VerGreeneyes