Complexity, fidelity, application: Researchers outline plan for quantum supremacy

April 12, 2018 by Sonia Fernandez, University of California - Santa Barbara
This superconducting chip, with a total area of one square centimeter, consists of nine qubits in a 1D array. Microwave pulses are applied to control their states and their interactions, and consequently the dynamics of the system. Such Josephson-junction-based superconducting systems are a leading physical implementation for quantum computation and simulation. Credit: Eric Lucero/Google, Inc.

Things are getting real for researchers in the UC Santa Barbara John Martinis/Google group. They are making good on their intention to demonstrate supremacy in a tight global race to build the first quantum machine to outperform the world's best classical supercomputers.

But what is supremacy in a field where horizons are being widened on a regular basis, in which teams of the brightest quantum computing minds in the world routinely up the ante on the number and type of quantum bits ("qubits") they can build, each with their own range of qualities?

"Let's define that, because it's kind of vague," said Google researcher Charles Neill. Simply put, he continued, "we would like to perform an algorithm or computation that couldn't be done otherwise. That's what we actually mean."

Neill is lead author of the group's new paper, "A blueprint for demonstrating quantum supremacy with superconducting qubits," now published in the journal Science.

Fortunately, nature offers up many such complex situations, in which the variables are so numerous and interdependent that classical computers can't hold all the values and perform the operations. Think chemical reactions, fluid interactions, even quantum phase changes in solids and a host of other problems that have daunted researchers in the past. Something on the order of at least 49 qubits—roughly equivalent to a petabyte (one million gigabytes) of classical random access memory—could put a quantum computer on equal footing with the world's supercomputers. Just recently, Neill's Google/Martinis colleagues announced an effort toward quantum supremacy with a 72-qubit chip possessing a "bristlecone" architecture that has yet to be put through its paces.
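The petabyte comparison can be sanity-checked with a little arithmetic: simulating n qubits classically requires storing 2^n complex amplitudes. The bytes-per-amplitude figure below is an assumption for illustration; even with aggressively low-precision storage, 49 qubits lands in petabyte territory.

```python
n_qubits = 49
amplitudes = 2 ** n_qubits        # one complex amplitude per basis state
bytes_per_amplitude = 2           # assumption: very low-precision storage
total_bytes = amplitudes * bytes_per_amplitude
print(f"{amplitudes:,} amplitudes, ~{total_bytes / 1e15:.1f} petabytes")
```

Standard double-precision complex numbers (16 bytes each) push this closer to nine petabytes, which is why roughly 49 qubits is where classical simulation becomes untenable.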

But according to Neill, it's more than the number of qubits on hand.

"You have to generate some sort of evolution in the system which leads you to use every state that has a name associated with it," he said. The power of quantum computing lies in, among other things, the superpositioning of states. In classical computing, each bit can exist in one of two states—zero or one, off or on, true or false—but qubits can also exist in a superposition of both zero and one, raising exponentially the number of possible states a quantum system can explore.
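That exponential blow-up can be sketched in a few lines of plain Python (illustrative only, not the group's software): a Hadamard gate puts a single qubit into an equal superposition, and an n-qubit register needs 2^n amplitudes to describe.

```python
import math

# A single qubit's state: amplitudes for |0> and |1>.
zero = [1.0, 0.0]

def hadamard(state):
    """Apply the Hadamard gate, creating an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

plus = hadamard(zero)
# Measurement probabilities are |amplitude|^2 for each basis state.
probs = [abs(x) ** 2 for x in plus]
print(probs)  # roughly [0.5, 0.5]

# An n-qubit register requires 2**n amplitudes -- exponential growth.
for n in (1, 10, 49):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```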

Additionally, say the researchers, fidelity is important, because massive processing power is not worth much if it's not accurate. Decoherence is a major challenge for anyone building a quantum computer—perturb the system, the information changes. Wait a few hundredths of a second too long, the information changes again.

"People might build 50 qubit systems, but you have to ask how well it computed what you wanted it to compute," Neill said. "That's a critical question. It's the hardest part of the field." Experiments with their superconducting qubits have demonstrated an error rate of one percent per qubit with three- and nine-qubit systems, which, they say, can be reduced as they scale up, via improvements in hardware, calibration, materials, architecture and machine learning.
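As a rough back-of-the-envelope model (an assumption for illustration, not the paper's error analysis): if each qubit contributes about one percent error, overall fidelity decays roughly geometrically with qubit count, which is why error rates must keep falling as systems scale up.

```python
# Assumed toy model: fidelity ~ (1 - error)^n for n qubits.
error_per_qubit = 0.01
for n in (9, 49, 72):
    fidelity = (1 - error_per_qubit) ** n
    print(f"{n} qubits: ~{fidelity:.0%} fidelity")
```

Under this crude model a nine-qubit device keeps about 91% fidelity, while a 72-qubit device would fall below 50% without hardware improvements.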

Building a qubit system complete with error correction components—the researchers estimate a range of 100,000 to a million qubits—is doable and part of the plan, though still years away. But that doesn't mean their system isn't already capable of doing some heavy lifting. Just recently it was deployed, with spectroscopy, on the issue of many-body localization in a quantum phase change—a quantum computer solving a quantum statistical mechanics problem. In that experiment, the nine-qubit system became a quantum simulator, using photons bouncing around in their array to map the evolution of electrons in a system of increasing, yet highly controlled, disorder.

"A good reason why our fidelity was so high is because we're able to reach complex states in very little time," Neill explained. The more quickly a system can explore all possible states, the better the prediction of how a system will evolve, he said.

If all goes smoothly, the world should be seeing a practicable UCSB/Google quantum computer soon. The researchers are eager to put it through its paces, gaining answers to questions that were once accessible only through theory, extrapolation and highly educated guessing—and opening up a whole new level of experiments and research.

"It's definitely very exciting," said Google researcher Pedram Roushan, who led the many-body quantum simulation work published in Science in 2017. They expect their early work to stay close to home, such as research in condensed matter physics and quantum statistical mechanics, but they plan to branch out to other areas, including chemistry and materials, as the technology becomes more refined and accessible.

"For instance, knowing whether or not a molecule would form a bond or react in some other way with another molecule for some new technology... there are some important problems that you can't roughly estimate; they really depend on details and very strong computational power," Roushan said, hinting that a few years down the line they may be able to provide wider access to this computing power. "So you can get an account, log in and explore the quantum world."


More information: C. Neill et al, A blueprint for demonstrating quantum supremacy with superconducting qubits, Science (2018). DOI: 10.1126/science.aao4309




Apr 12, 2018
The author of the article doesn't seem to be aware of NP (nondeterministic polynomial time) problems, and of what an efficient quantum algorithm for an NP-complete problem would mean. If a quantum algorithm solves any NP-complete problem in polynomial time, then every problem in NP can be solved in polynomial time. More important, a proposed solution to any NP problem can be verified in polynomial time, and often very quickly.

Say your company has hundreds of warehouses and millions of orders to be filled. Not all warehouses have sufficient supplies to meet all (local) requirements, but the company as a whole does. If the goods you are selling are, say, gasoline and other petroleum products, the problem can be solved by linear programming, and in fact oil companies all do that. But what if you have warehouses full of goods that can't be divided, such as computers or rubber ducks? Now you have an integer programming problem.
Apr 12, 2018
(Yes, I'm getting into what is very hairy math which doesn't need calculus.)
Integer programming problems have cases that can't be solved in polynomial time. You can hope the solution doesn't require splitting several ducks, and try linear programming. If the solution does require splitting ducks, extract the split deliveries from the rest and solve those as an integer programming problem. That won't necessarily give you a least-cost solution, but it should be close.

What if the LP solution splits all deliveries? You don't have a few million years to solve the problem, so you use heuristics and hope they give a decent result. Or, once quantum computers are available, you use one to solve the problem. Verifying that all deliveries are made and no warehouse is left with a negative balance of anything is easy. (If you want, you can use the Kuhn-Tucker conditions to test whether the solution is optimal.)

What if there are additional constraints? All logic problems are in NP, including yours.
