Stronger benchmarks needed to fully quantify quantum speedup, physicist says

February 5, 2014 by John German
Credit: Texas Advanced Computing Center.

Texas A&M University physicist Helmut G. Katzgraber's research takes him to the crossroads of physics, computer science, quantum information theory and statistical mechanics. For more than a decade, he has been pushing the frontiers of computational physics to study hard optimization problems and disordered materials, applying his expertise to problems in the fascinating and fast-evolving field of quantum computing.

This past week, his work caught the attention of the global research community because of a study related to a particular commercial quantum computing device, the $10 million D-Wave Two—more specifically, its documented failure to outperform traditional computers in head-to-head speed tests by Ronnow et al.

Not so fast, says Katzgraber, whose own National Science Foundation-funded research points to an intriguing possible explanation: The benchmarks used by D-Wave and research teams alike to detect the elusive quantum speedup might not be well suited to that task and, therefore, not up to the test.

In a paper submitted earlier this month, Katzgraber details his team's innovative results on quantum speedup. Among other findings, he proposes potentially hard benchmark problems to detect and quantify such an elusive target as quantum speedup, which Katzgraber says depends strongly on the combination of the chosen benchmark and optimization algorithm. In particular, his results suggest that the current benchmarks might not be best suited to truly showcase the potential of quantum annealing—a quantum version of thermal simulated annealing and the technology upon which the D-Wave machine is based.

Simulated annealing borrows its name from a type of heat treatment that involves altering a material's properties by heating it to above its critical temperature, maintaining the temperature and then cooling it slowly with the hope of improving its ductility, Katzgraber explains. In using simulated annealing as an optimization method, the system is heated to a high temperature and then gradually cooled in the hope of finding the optimal solution to a particular problem. Similarly, in quantum annealing, quantum fluctuations are applied to a problem and then slowly quenched again in the hope of finding the optimum of the problem.
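The optimization method described above can be sketched in a few lines. This is a minimal illustrative implementation of simulated annealing with a Metropolis acceptance rule and a geometric cooling schedule; the test function, schedule, and parameters are assumptions for illustration, not the spin-glass benchmarks used in the study.

```python
import math
import random

def simulated_anneal(energy, x0, steps=20000, t_start=2.0, t_end=1e-3,
                     step_size=0.5, seed=0):
    """Minimize `energy` by proposing random moves while slowly cooling."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (k / (steps - 1))
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = energy(x_new)
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with probability exp(-dE/T), so the walker can cross
        # barriers while the temperature is still high.
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# A rugged 1-D "landscape" with a global minimum at x = 0.
landscape = lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2

x_min, e_min = simulated_anneal(landscape, x0=4.0)
```

Early in the run the high temperature lets the walker hop between valleys; as the temperature drops, it settles into the deepest basin it has found.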

Katzgraber's work, primarily done by simulating spin-glass-like systems (disordered magnetic systems) on D-Wave's chip topology using the facilities at the Texas A&M Supercomputing Facility and the Stampede Cluster at the Texas Advanced Computing Center (TACC), shows that the energy landscape of these particular benchmark instances might often be simple, with one dominant basin of attraction. Optimization algorithms such as simulated annealing excel at these types of problems. Not surprisingly, he advocates for additional testing and better benchmark design prior to proclaiming either defeat or victory for the D-Wave Two machine.

"Simulated annealing works well when the system has one large basin in the energy landscape," Katzgraber said. "Think of a beach ball on a golf course with only one sand pit. You let it go, and it will just roll downhill to the lowest part of the pit without really getting stuck on the way. But if you have something with one dominant pit embedded in a landscape with many other hills and valleys, then the ball might get stuck on its way to the deepest pit and therefore miss the true minimum of the problem.

"My results seem to indicate that the current benchmarks might not have the complex landscape needed for quantum annealing to clearly excel over simulated annealing; i.e., a landscape with deep valleys and large barriers where the quantum effects can help the system tunnel through these barriers to find the optimum (i.e., the deepest pit) efficiently. This, of course, does not mean that quantum annealing does not perform well in the current benchmarks, but the signal over simulated annealing could be stronger by using better benchmarks. I am merely proposing benchmark problems that have an energy landscape more reminiscent of the Texas Hill Country versus the comparatively flat terrain in the College Station area—benchmarks where we know that simulated annealing will fail quickly."
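The landscape distinction in the quote can be demonstrated with a toy zero-temperature descent, which accepts only downhill moves. This is a hypothetical sketch with made-up test functions, not the study's benchmarks: on a single basin the greedy walker reaches the bottom, while on a rugged landscape it stalls in the first local minimum it cannot escape.

```python
import random

def greedy_descent(energy, x0, steps=5000, step_size=0.05, seed=1):
    """Zero-temperature dynamics: only downhill moves are accepted,
    so the walker stops at the first local minimum it reaches."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = energy(x_new)
        if e_new <= e:
            x, e = x_new, e_new
    return x, e

import math
single_basin = lambda x: x * x                           # one sand pit
rugged = lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2  # hills and valleys

# On the single basin, greedy descent reaches the bottom...
x1, e1 = greedy_descent(single_basin, x0=4.0)
# ...but on the rugged landscape it gets trapped far from the optimum.
x2, e2 = greedy_descent(rugged, x0=4.0)
```

A benchmark dominated by a single basin rewards this kind of greedy behavior, which is why a harder, more rugged benchmark is needed to separate quantum annealing from its classical counterpart.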

The D-Wave machine currently in use by Google and NASA was benchmarked by a team of scientists from the University of Southern California, ETH Zurich, Google, the University of California at Santa Barbara and Microsoft Research in work that was independent of Katzgraber's but submitted near-simultaneously. The two teams do agree on one important point: The jury's still out because better benchmarks need to be developed.

"While on the one hand, D-Wave wants to dismiss the tests, and on the other, scientists have shown the machine is only faster for certain instances, I am proposing potentially harder tests before any concrete conclusions are drawn," Katzgraber said.
