Quantum computing may actually be useful, after all

October 9, 2009 by Larry Hardesty and Patrick Gillooly
(PhysOrg.com) -- In recent years, quantum computers have lost some of their luster. In the 1990s, it seemed that they might be able to solve a class of difficult but common problems — the so-called NP-complete problems — exponentially faster than classical computers. Now, it seems that they probably can't. In fact, until this week, the only common calculation where quantum computation promised exponential gains was the factoring of large numbers, which isn't that useful outside cryptography.

In a paper appearing today, however, MIT researchers present a new algorithm that could bring the same type of efficiency to systems of linear equations — whose solution is crucial to image processing, video processing, signal processing, robot control, weather modeling, and population analysis, to name just a few applications.

Quantum computers are computers that exploit the weird properties of matter at extremely small scales. Where a bit in a classical computer can represent either a "1" or a "0," a quantum bit, or qubit, can represent "1" and "0" simultaneously. Computations performed on a string of qubits are thus the equivalent of computations performed on every possible value of a string of ordinary bits. Since eight classical bits can represent 256 different values, a single eight-qubit calculation is the equivalent of 256 classical calculations.
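The doubling described here is just the growth of the state space: an n-qubit register spans 2^n basis states. A quick back-of-the-envelope check (simple counting in Python, not a simulation of quantum mechanics; the function name is our own):

```python
# Number of distinct values an n-bit string can take -- equivalently,
# the number of classical computations an n-qubit calculation spans.
def state_space_size(n_qubits):
    return 2 ** n_qubits

for n in (8, 9, 10, 40):
    print(n, "qubits ->", state_space_size(n), "values")
```

Eight qubits give 256 values, nine give 512, and roughly 40 qubits already cover more than a trillion.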

Quantum computers with maybe 12 or 16 qubits have been built in the lab, but quantum computation is such a young field, and the physics of it are so counterintuitive, that researchers are still developing the theoretical tools for thinking about it.

Systems of linear equations, by contrast, are familiar to almost everyone. We all had to solve them in algebra class: given three distinct equations featuring the same three variables, find values for the variables that make all three equations true.
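The algebra-class case can be made concrete with a minimal pure-Python Gaussian elimination. The particular equations below are made-up example values, and `solve_linear` is our own illustrative helper, not anything from the paper:

```python
def solve_linear(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col:
                factor = M[r][col] / M[col][col]
                M[r] = [a - factor * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Example: x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27
A = [[1, 1, 1], [0, 2, 5], [2, 5, -1]]
b = [6, -4, 27]
print(solve_linear(A, b))  # x = 5, y = 3, z = -2, up to rounding
```

For three variables this is instant; the article's point is how the cost grows when "three" becomes "millions."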

Computer models of weather systems or of complex chemical reactions, however, might have to solve millions of equations with millions of variables. Under the right circumstances, a classical computer can solve such equations relatively efficiently: the solution time is proportional to the number of variables. But under the same circumstances, the time required by the new quantum algorithm would be proportional to the logarithm of the number of variables.

That means that for a calculation involving a trillion variables, "a supercomputer's going to take trillions of steps, and this algorithm will take a few hundred," says mechanical engineering professor Seth Lloyd, who along with Avinatan Hassidim, a postdoc in the Research Lab of Electronics, and the University of Bristol's Aram Harrow '01, PhD '05, came up with the new algorithm.
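Lloyd's step counts can be sanity-checked with a logarithm. This sketch ignores constant factors (and other dependences the real algorithm carries, which is why Lloyd says "a few hundred" rather than forty):

```python
import math

n_vars = 10 ** 12  # a trillion variables

classical_steps = n_vars           # solution time proportional to N
quantum_steps = math.log2(n_vars)  # proportional to log N: about 40

print(f"classical: ~{classical_steps:.1e} steps")
print(f"quantum:   ~{quantum_steps:.0f} steps, times modest constants")
```

The ratio between the two is about ten orders of magnitude, which is the gap the article is describing.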

Because the result of the calculation would be stored on qubits, however, "you're not going to have the full functionality of an algorithm that just solves everything and writes it all out," Lloyd says. To see why, consider the eight qubits that can represent 256 values simultaneously. Nine qubits could represent 512 values, 10 qubits 1,024, and so on. This doubling rapidly yields astronomical numbers. The trillion solutions to a trillion-variable problem would be stored on only about 40 qubits. But extracting all trillion solutions from the qubits would take a trillion steps, eating up all the time that the quantum algorithm saved.

With qubits, however, "you can make any measurement you like," Lloyd says. "You can figure out, for instance, their average value. You can say, okay, what fraction of them is bigger than 433?" Such measurements take little time but may still provide useful information. They could, Lloyd says, answer questions like, "In this very complicated ecosystem with, like, 10 to the 12th different species, one of which is humans, in the steady state for this particular model, do humans exist? That's the kind of question where a classical algorithm can't even provide anything."

Greg Kuperberg, a mathematician at the University of California, Davis, who works on quantum algebra, says that the MIT algorithm "could be important," but that he's "not sure yet how important it will be or when." Kuperberg cautions that in applications that process empirical data, loading the data into quantum memory could be just as time consuming as extracting it would be. "If you have to spend a year loading in the data," he says, "it doesn't matter that you can then do this linear-algebra step in 10 seconds."

But Hassidim argues that there could be applications that allow time for data gathering but still require rapid calculation. For instance, to yield accurate results, a weather prediction model might require data from millions of sensors transmitted continuously over high-speed optical fibers for hours. Such quantities of data would have to be loaded into quantum memory, since they would overwhelm all the conventional storage in the world. Once all the data are in, however, the resulting forecast needs to be calculated immediately to be of any use.

Still, Hassidim concedes that no one has yet come up with a "killer app" for the algorithm. But he adds that "this is a tool, which hopefully other people are going to use. Other people are going to have to continue this work and understand how to use this in different problems. You do have to think some more."

Indeed, researchers at the University of London have already expanded on the MIT researchers' approach to develop a new quantum algorithm for solving differential equations. Early in their paper, they describe the MIT algorithm, then say, "This promises to allow the solution of, e.g., vast engineering problems. This result is inspirational in many ways and suggests that quantum computers may be good at solving more than linear equations."

Provided by Massachusetts Institute of Technology





Comments

Oct 09, 2009
Gives new meaning to the phrase 'spaghetti code'.
Oct 09, 2009
Where even qubits are limited in operation it seems optics can be dynamically allocated values given a hardcoded modulation system and a decoding algorithm based on divisioning signatures.
Oct 10, 2009
Sorry, but as a person with vast experience in binary programming, I still do not see the point or rationale in having a bit which can be 1, 0, or both. How can you tell what the information is to be interpreted as if it can be both? Or should this really be described as 1, 0, and 2? I just don't see the value in having a bit which is undefined.
Oct 11, 2009
I still do not see the point or rationale in having a bit which can be 1, 0, or both. How can you tell what the information is to be interpreted as if it can be both?

In our weird quantum world, everything that is allowed to happen happens until a measurement is made. The power of a quantum computer increases exponentially with each added quantum bit, because there are more possible things (more combinations of quantum bit states) able to occur simultaneously.

Think of a quantum program as a transaction. You start the transaction by setting up the quantum system. You commit the transaction by making a measurement and get a real answer (1 or 0) out of the system.

No one currently knows how to keep a quantum system with enough possible states coherent long enough to do useful work that cannot reasonably be performed on a normal computer. We don't even know if it's possible -- it's nice to stock up on algorithms just in case.
Oct 12, 2009
may actually be useful? Isn't the human mind a quantum computer in that consciousness collapses the potentiality wave to become a thought? Not to mention some new age ideas that thoughts create reality. An artificial mind sounds like a useful application.
Oct 12, 2009
Thanks for that Foolish1
I must apologise for a minor error in my previous post: a "bit" is never 1 or 0, it is really set or unset, i.e. a pulse exists in that position or not, just as a light switch can be breaking a circuit or making it. Try switching a light both on and off at the same moment.
Your first para' sounds like the "cat in a box" theory which I also believe is stupid.
Oct 12, 2009
Quantum computers may be interesting in theory, but I doubt that in practice they can do what is claimed by theory. Quantum theory has serious paradoxes, and the real explanation for this is, in my opinion, that, while quantum mechanics is very accurate (some of its predictions have been verified, I believe, up to twelve decimal accuracy), it is not a fundamentally correct theory. To use 40 qubits to represent a trillion variables would, in my opinion, require quantum mechanics to be accurate up to twenty five billion decimal places, and it probably is not that accurate. Besides, I believe all these quantum algorithms use nonrelativistic quantum mechanics, and I expect relativistic effects would come into play long before twenty five billion decimal place accuracy is reached. Not to mention that relativistic quantum mechanics may also fall short unless relativity is an absolutely correct theory.
Oct 12, 2009
Thanks for that Foolish1
Your first para' sounds like the "cat in a box" theory which I also believe is stupid.

People misunderstand the cat in the box, asking "how can the cat be dead and alive?" That's never true, because in reality the wave function of the quantum system has collapsed when the cat-killing switch is triggered.

This example more than any other I believe just causes confusion and keeps one from understanding conceptually how the quantum world works.

Check this out:

If you excuse the eyeball mysticism at the end, it's fairly good. (Observing/registering electrons requires some kind of interaction/disturbance to detect their presence.)
