Independent research group testing D-Wave Two finds no quantum speedup

Pictured is D-Wave’s current 512-qubit version. Credit: Courtesy of D-Wave Systems Inc.
(Phys.org) —An independent research team with members affiliated with several universities in the U.S. and Switzerland has concluded that the D-Wave Two computer shows no signs of quantum speedup. The team tested one of the machines purchased by Lockheed Martin and has published a paper in the journal Science describing the tests and the results they found.

Scientists would very much like to build a truly quantum computer; the benefits it would offer would almost certainly be groundbreaking, leading to new discoveries in many areas, and it would likely speed up processor-intensive applications such as weather forecasting. Unfortunately, such a computer is still decades away, and that's assuming building one is possible at all. In the meantime, researchers have made progress in building machines that are partially quantum, and one company, D-Wave Systems, a startup in Burnaby, Canada, has put one such machine up for sale. Because of the high price, only a few have been sold: to Lockheed Martin, Google and likely some entities that have not been made public.

Prime applications for such a machine are those analogous to seeking the lowest point, or deepest valley, in hilly terrain. Conventional machines must traverse the hills and valleys to find a solution, while a machine such as the D-Wave (a quantum annealer that takes advantage of quantum tunneling) should be able to burrow through the hills to gain direct access to the valleys.

Since announcing its first machine, D-Wave representatives have maintained that their machine is capable of quantum speedup (running an application faster than a conventional computer) for certain applications. After running some tests on the machine it purchased, a team at Google backed up the claim.

Others, particularly in the physics community, have been skeptical, suggesting the Google team's results came about through unfair comparisons: problems designed to run well on the D-Wave system were run without optimization on conventional machines. In this new effort, the team set out to test the latest version of the D-Wave machine to determine whether it is actually capable of quantum speedup.

The team ran 1,000 random optimization problems (random spin-glass instances) on the 503-qubit D-Wave Two device, measuring how long it took to solve them compared with a classical PC. They report that while some problems ran somewhat faster on the D-Wave machine, others ran a lot slower, and that overall they found no evidence of quantum speedup.
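The comparison behind such a benchmark can be sketched simply: measure the time-to-solution for each problem size on both machines and look at how the ratio behaves as the problem grows. A genuine quantum speedup would require the ratio to grow with size, not just to favor one machine at a single size. The timings below are invented for illustration only; they are not the paper's measurements.

```python
# Sketch of a speedup test: compare how time-to-solution scales with
# problem size on two solvers. All numbers here are hypothetical.

def speedup(classical_times, quantum_times):
    """Instance-by-instance speedup ratio T_classical / T_quantum."""
    return [tc / tq for tc, tq in zip(classical_times, quantum_times)]

# Hypothetical median times-to-solution (seconds) at growing qubit counts.
sizes = [128, 256, 384, 503]
t_classical = [0.01, 0.08, 0.60, 4.0]
t_quantum = [0.02, 0.15, 1.10, 7.5]

ratios = speedup(t_classical, t_quantum)
# A quantum speedup would require this ratio to GROW with size;
# in this made-up data it stays roughly constant, i.e. no speedup.
print([round(r, 2) for r in ratios])
```

The subtlety the paper highlights is exactly this scaling question: a machine can be faster on every tested instance and still show no asymptotic speedup.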

To be fair, the team is not claiming the machine is a fraud. Rather, they note that the machine they tested failed to show quantum speedup under the conditions in which it was tested, and they are not ruling out the possibility that it could show quantum speedup under other conditions.

The results found by the independent group aren't likely to spell doom for the D-Wave machines, though the company that makes them is likely to face more skepticism going forward. It appears the doubts will be erased only if further research, covering many more conditions, is able to validate the initial claims.



More information: 1. Paper: Defining and detecting quantum speedup, Science (2014). DOI: 10.1126/science.1252319

ABSTRACT
The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here we show how to define and measure quantum speedup, and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. Using random spin glass instances as a benchmark, we find no evidence of quantum speedup when the entire data set is considered, and obtain inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question.

2. ETH Zurich Q&A: www.ethz.ch/en/news-and-events … /2014/06/d-wave.html

Journal information: Science

© 2014 Phys.org

Citation: Independent research group testing D-Wave Two finds no quantum speedup (2014, June 20) retrieved 26 June 2019 from https://phys.org/news/2014-06-independent-group-d-wave-quantum-speedup.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

User comments

Jun 20, 2014
Pffft... Hocus pocus. If and when someone truly achieves this it will be obvious- just like cold fusion, dark matter, etc... Marketing gimmicks, IMO.

Jun 20, 2014
h20dr, it seems you overestimate the power of quantum speed-up. There aren't really that many useful algorithms for universal quantum computers that show the kind of significant speed-up that for instance Shor's algorithm can achieve for factorization.

The D-Wave machine performs quantum annealing and operates with qubits (another Troyer et al. paper already showed that). But this doesn't automatically translate into magic super-performance. On the other hand, this is new technology, with chip structures much larger than those of our mature CMOS technology.


Jun 20, 2014
Considering that the quantum machine is inherently probabilistic, one has to repeat the calculation many times to achieve a certain level of certainty over the result.

The question then becomes: why not design a conventional algorithm that uses a Monte Carlo principle, guessing where the optimum point is with a reduced number of pseudo-random searches over the problem space, and simply allowing a certain degree of error in the solution? Reduce the number of calculations until the margin of error matches the quantum computer's, and you achieve the same speedup over the quantum algorithm.

It's like finding the lowest point of the valley not by plumbing each and every point, but by dropping the line every mile and then searching a bit around each point. The quantum algorithm might be confident there's a deep well somewhere if you ran the calculation a million times, but the chances of finding it over a million random drops could be equal.
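The random-drop idea described above is easy to sketch: scatter probes over the landscape and keep the lowest one found, trading certainty for speed. The cost function below is a toy invented for illustration, not anything from the paper.

```python
import math
import random

def landscape(x):
    # Toy "hilly terrain": a quadratic bowl with bumps, so there are
    # many local dips; the deepest valley sits a little past x = 2.
    return (x - 2) ** 2 + math.sin(8 * x)

def random_search(trials, seed=0):
    # Drop probes at random over the terrain and keep the lowest point.
    rng = random.Random(seed)
    best_x = min((rng.uniform(-5, 5) for _ in range(trials)),
                 key=landscape)
    return best_x, landscape(best_x)

x, y = random_search(10_000)
print(f"best point ~ {x:.2f}, value ~ {y:.2f}")
```

With enough probes this finds a deep valley with high probability, but never with certainty, which is exactly the margin-of-error trade the comment describes.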

Jun 20, 2014
@Eikka There is such an algorithm - it is called simulated annealing.
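For reference, simulated annealing improves on pure random drops by walking the landscape and occasionally accepting uphill moves with a probability that shrinks as a "temperature" is lowered, which lets it climb out of local dips early on. A minimal sketch on a toy cost function (the function and all parameters are made up for illustration):

```python
import math
import random

def cost(x):
    # Toy landscape with local minima; deepest valley a little past x = 2.
    return (x - 2) ** 2 + math.sin(8 * x)

def simulated_annealing(steps=20_000, t0=2.0, seed=1):
    rng = random.Random(seed)
    x = rng.uniform(-5, 5)
    best_x = x
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9    # linear cooling schedule
        cand = x + rng.gauss(0, 0.5)          # propose a nearby move
        delta = cost(cand) - cost(x)
        # Always accept downhill moves; accept uphill moves with
        # probability exp(-delta/T), which shrinks as T cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand
        if cost(x) < cost(best_x):
            best_x = x
    return best_x, cost(best_x)

x, y = simulated_annealing()
print(f"best point ~ {x:.2f}, value ~ {y:.2f}")
```

This is also the classical baseline the researchers compare quantum annealers against: the D-Wave approach is, in effect, a physical analogue of this procedure where thermal hops are replaced by quantum tunneling.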

Jun 21, 2014
@Eikka There is such an algorithm - it is called simulated annealing.


So, how does it stack up against real quantum algorithms?
