Large-scale quantum chip validated

Jun 28, 2013

A team of scientists at USC has verified that quantum effects are indeed at play in the first commercial quantum optimization processor.

The team demonstrated that the D-Wave processor housed at the USC-Lockheed Martin Quantum Computing Center behaves in a manner that indicates that quantum mechanics plays a functional role in the way it works. The demonstration involved a small subset of the chip's 128 qubits.

This means that the device appears to be operating as a quantum processor – something that scientists had hoped for but have needed extensive testing to verify.

The quantum processor was purchased from Canadian manufacturer D-Wave nearly two years ago by Lockheed Martin and housed at the USC Viterbi Information Sciences Institute (ISI). Because the machine was the first of its kind, the task for scientists putting it through its paces was to determine whether it was operating as hoped.

"Using a specific test problem involving eight qubits we have verified that the D-Wave processor performs optimization calculations (that is, finds lowest energy solutions) using a procedure that is consistent with quantum annealing and is inconsistent with the predictions of classical annealing," said Daniel Lidar, scientific director of the Quantum Computing Center and one of the researchers on the team, who holds joint appointments with the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences.

Quantum annealing is a method of solving optimization problems using quantum effects – at a large enough scale, potentially much faster than a traditional processor can.
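
To make that concrete, the sketch below is a purely classical illustration of what "finding lowest-energy solutions" means: ordinary simulated annealing run on a small, made-up eight-spin Ising cost function. The couplings, fields and cooling schedule are arbitrary illustrative values – this is not D-Wave's programming interface, and not the quantum procedure the researchers tested.

```python
import math
import random

# Toy Ising problem on 8 spins: minimize
#   E(s) = sum_{i<j} J[i,j]*s[i]*s[j] + sum_i h[i]*s[i],  with s[i] = +1 or -1.
# The couplings J and fields h are arbitrary, chosen only for illustration.
random.seed(0)
N = 8
J = {(i, j): random.choice([-1, 1]) for i in range(N) for j in range(i + 1, N)}
h = [random.choice([-1, 0, 1]) for _ in range(N)]

def energy(s):
    return sum(J[i, j] * s[i] * s[j] for (i, j) in J) + sum(h[i] * s[i] for i in range(N))

# Classical simulated annealing: flip one spin at a time, accepting uphill moves
# with a probability that shrinks as the "temperature" T is lowered.
s = [random.choice([-1, 1]) for _ in range(N)]
T = 5.0
for _ in range(20000):
    i = random.randrange(N)
    before = energy(s)
    s[i] = -s[i]
    delta = energy(s) - before
    if delta > 0 and random.random() >= math.exp(-delta / T):
        s[i] = -s[i]            # reject the uphill move: undo the flip
    T = max(0.01, T * 0.9997)   # cool down gradually

print("low-energy spin configuration:", s, "energy:", energy(s))
```

A quantum annealer attacks the same kind of cost function, but in place of the thermal "temperature" it relies on quantum fluctuations that are gradually switched off; the USC test asked which of those two behaviours the chip's output statistics match.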

Research institutions throughout the world build and use quantum processors, but most have only a few quantum bits, or "qubits."

Qubits have the capability of encoding the two digits of one and zero at the same time – as opposed to traditional bits, which can encode either a one or a zero, but not both. This property, called "superposition," along with the ability of quantum states to "tunnel" through energy barriers, is hoped to help future generations of the D-Wave processor ultimately perform optimization calculations much faster than traditional processors.
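
As a minimal numerical sketch of that idea (it assumes nothing about D-Wave's hardware): a qubit's state can be written as two complex amplitudes, and a measurement returns 0 or 1 with probabilities given by the squared magnitudes of those amplitudes. The particular amplitudes below are arbitrary example values.

```python
import numpy as np

# A single-qubit state |psi> = a|0> + b|1>, stored as two complex amplitudes.
# This particular pair is just an example; any normalized pair of amplitudes works.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([a, b])

probs = np.abs(state) ** 2          # Born rule: P(0) = |a|^2, P(1) = |b|^2
probs = probs / probs.sum()         # guard against floating-point rounding
print("P(0), P(1):", probs)

# "Measuring" the qubit yields a definite 0 or 1 each time, with those probabilities.
samples = np.random.choice([0, 1], size=10, p=probs)
print("ten measurement outcomes:", samples)
```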

With 108 functional qubits, the D-Wave processor at USC inspired hopes for a significant advance in the field of quantum computing when it was installed in October 2011 – provided it worked as a quantum information processor. Quantum processors can fall victim to a phenomenon called "decoherence," which stifles their ability to behave in a quantum fashion.

The USC team's research shows that the chip, in fact, performed largely as hoped, demonstrating the potential for quantum optimization on a larger-than-ever scale.

"Our work seems to show that, from a purely physical point of view, play a functional role in information processing in the D-Wave processor," said Sergio Boixo, first author of the research paper, who conducted the research while he was a computer scientist at ISI and research assistant professor at the USC Viterbi School of Engineering.

Boixo and Lidar collaborated with Tameem Albash, postdoctoral research associate in physics at USC Dornsife; Federico M. Spedalieri, computer scientist at ISI; and Nicholas Chancellor, a recent physics graduate at USC Dornsife. Their findings will be published in Nature Communications on June 28.

The news comes just two months after the Quantum Computing Center's original D-Wave processor—known commercially as the "Rainier" chip—was upgraded to a new 512-qubit "Vesuvius" chip. The Quantum Computing Center, which includes a magnetically shielded box that is kept frigid (near absolute zero) to protect the computer against decoherence, was designed to be upgradable to keep up with the latest developments in the field.

The new Vesuvius chip at USC is currently the only one in operation outside of D-Wave. A second such chip, owned by Google and housed at NASA's Ames Research Center in Moffett Field, California, is expected to become operational later this year.

Next, the USC team will take the Vesuvius chip for a test drive, putting it through the same paces as the Rainier chip.

User comments: 17


flashgordon
1.7 / 5 (6) Jun 28, 2013
The D-Wave computer can be used to simulate quantum mechanics; hence, it can be used to bootstrap to a more general-purpose quantum computer. It'll be interesting to see how quickly they're able to do that.

I'm wondering if they could use it to optimize the solution to bootstrapping from DNA nanotech to making a more robust nano-assembler. They've already solved one hard protein-folding problem; why not DNA and more proteins?
Matthewwa25
2 / 5 (8) Jun 28, 2013
This is huge....A real quantum computer. ;)
vacuum-mechanics
1.4 / 5 (10) Jun 28, 2013
Qubits have the capability of encoding the two digits of one and zero at the same time – as opposed to traditional bits, which can encode either a one or a zero, but not both. This property, called "superposition," along with the ability of quantum states to "tunnel" through energy barriers, is hoped to help future generations of the D-Wave processor ultimately perform optimization calculations much faster than traditional processors.

It is interesting to note that probability is the very nature of quantum mechanics; how could we use it in a computing system, when the traditional computer is based on a deterministic counting system?
LarryD
5 / 5 (1) Jun 28, 2013
It is interesting to note that probability is the very nature of quantum mechanics; how could we use it in a computing system, when the traditional computer is based on a deterministic counting system?

Was wondering the same thing. However, would such processes 'wait' for the outcome and then move on? On the QM level the 'waiting' time would still be much less than for fully deterministic procedures. As an analogy, crime busters might use statistical methods to 'home in' on a serial killer, narrowing the field down before a final 'deterministic' catch.
Of course, as a layman I might be barking up the wrong tree and in the wrong park...I'm sure someone will post me right.
DonGateley
2.7 / 5 (7) Jun 29, 2013
So long all existing cryptography.
ValeriaT
1.4 / 5 (10) Jun 29, 2013
IMO the computational power (i.e. the product of processing speed and precision) cannot exceed that of classical computers, for a single reason: the Heisenberg uncertainty principle. So quantum computers can pay off only in situations where their inherently low precision isn't an obstacle to their application. It's true that the determinism of the von Neumann architecture is overly redundant for many applications where only an approximate result is expected (face recognition from Google Glass images and video streams, for example). So whenever you have fuzzy input data, the inherently low precision of quantum computers can become an advantage.
maxb500_live_nl
1.8 / 5 (5) Jun 29, 2013
Like with the first traditional computers, this represents the start of a new age of computing, one that will eventually mean a quantum leap for all mankind. And all thanks to these great people in Canada! Keeping things in the laboratories for too long is a mistake by all the other researchers and universities in the world working on this, failing to generate business opportunities and jobs that help pay for their research. Many scientists even create open free platforms, with zero return on investment, yet they take in truckloads of funding. Scientists and universities should be more concerned with turning ideas into profit and actual jobs. They are supposed to represent the forefront of innovation that will generate wealth. Great spin-offs were very popular in the past, but many universities seem to be failing, more concerned with getting the next bag filled with research money while failing to put in the hardest work of all: generating actual jobs and economic success.
TheGhostofOtto1923
1.8 / 5 (5) Jun 29, 2013
So long all existing cryptography.
At least we won't need to memorize passwords anymore. Except the password to our password-generating software, that is.

If we rely on face-recognition for security, how long before these machines can generate enough facial variations to crack that as well?
Pressure2
1.5 / 5 (8) Jun 29, 2013
Quote from article: "The team demonstrated that the D-Wave processor housed at the USC-Lockheed Martin Quantum Computing Center behaves in a manner that indicates that quantum mechanics plays a functional role in the way it works. The demonstration involved a small subset of the chip's 128 qubits.
This means that the device appears to be operating as a quantum processor – something that scientists had hoped for but have needed extensive testing to verify."

What? "It appears to be operating as a quantum processor." It doesn't sound like even they are convinced it is a true quantum computer. The article also mention that the chip has to operate near absolute zero.

PoppaJ
1.5 / 5 (8) Jun 29, 2013
This is one area where Star Trek was technologically deficient. I remember an episode where it was explained that Voyager used a basic binary system and the aliens were using a much more advanced trinary one. Who would have thought we would have jumped past it with a coexistent-states system.
IamVal
1.7 / 5 (6) Jun 30, 2013
The quantum computing idea in general represents a trinary system, and we could easily store a system of much more than 2 or even 10 places – actually, we have for more than 50 years. Hexadecimal (the language OF Turing machines) is 16 digits. 'Trinary' would just require a system that could have 3 possible states, where ours has two: superimposed, 1 and 0.

What most people don't quite grasp about the fundamental process behind quantum computing is that it is not a calculation machine; it's a guessing machine.

It takes a rough set of inputs and turns those into an 'optimized' output.
So, following in the footsteps of its ancestors, these will not be used as 'answer generators' but as guideline producers for the answer generators (then people). They will be used to consider quintillions of options simultaneously and return the one that 'fits best', by using the properties of spin and atomic force to assert a 'path of least resistance' for the quantum data to flow through.
IamVal
1.7 / 5 (6) Jun 30, 2013
But all of this first requires that we be able to encode the 'option array' into the qubit system hardware-style – that is, each set of qubits can be set up to solve one problem and one problem only, and each subset must be 'set up' with the 'answer' (which is really the question); using the uncertainty principle (which is affected by the relationship between the qubits to a measurable extent) then allows electrons to tunnel through the path of least resistance, as it were.

This example is NOT a quantum computer, but an interface which would allow us to encode traditional binary information onto a qubit system in order to 'enforce' a measurement.
These systems also use the Schrödinger principle – which in and of itself is a silly thought experiment not meant to be taken literally – and must compensate for the Schrödinger effect, a much less silly phenomenon that seems to share only a couple of common threads with the thought experiment.
Both the uncertainty principle and the Schrödinger effect (continued)
IamVal
1.7 / 5 (6) Jun 30, 2013
become 'predictable' in a certain sense at the sub-atomic scale at absolute zero, and they call this predictability 'coherence'. Whereas the ideas behind them are still quite valid, they can be compensated for at near absolute zero.

All flaws of a poly-temporal classical system – having to describe 1 before you can describe 2 or .5, per the literal example.

With a quantum computer (a true one) the qubits would be encoded – by spin and tunnelling chirality – by an interfacing device such as this one, and then a measurement would be applied to the system in the form of electrons being allowed to tunnel from one qubit to the next. The presence of the electrons themselves will create decoherence through the Schrödinger effect, and we'll be left with a number of 'guesses' related to the outcome of the 'measurement', at which point the qubits would need to be reset for the next set of 'answers'.

A well-placed analogy would be the traveling salesman – each qubit can represent up to one city.
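
As a rough illustration of the encoding idea in the comments above (the toy constraint and variable names below are invented for illustration, not D-Wave's actual programming model): small optimization problems are typically written as an energy function over binary variables whose minimum encodes the answer. The sketch encodes "pick exactly one of three options" – the same kind of one-hot constraint used when mapping a traveling-salesman tour onto qubits – and finds the minimum by brute force.

```python
from itertools import product

# Toy "energy function" over three binary variables: we want exactly one of them to be 1.
# E(x) = (x0 + x1 + x2 - 1)^2 is zero only for the valid one-hot picks.
# TSP mappings onto qubits use the same trick for "visit exactly one city at each step".
def energy(x):
    return (sum(x) - 1) ** 2

states = list(product([0, 1], repeat=3))
lowest = min(energy(x) for x in states)
print("lowest-energy assignments:", [x for x in states if energy(x) == lowest])
```

An annealer, classical or quantum, would search for those same lowest-energy assignments without enumerating every state.
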
JohnGee
1.6 / 5 (7) Jun 30, 2013
The quantum computing idea in general represents a trinary system, and we could easily store a system of much more than 2 or even 10 places – actually, we have for more than 50 years. Hexadecimal (the language OF Turing machines) is 16 digits. 'Trinary' would just require a system that could have 3 possible states, where ours has two: superimposed, 1 and 0.

Hexadecimal is base 16. That has nothing to do with whether the computer is binary or not. Binary computers tend to use hexadecimal notation because it simplifies some calculations when you have 8 bits to a byte. If the computer were "hexadecimary" or whatever the term would be, each digit would have 16 possible values.

Quantum computing doesn't represent a trinary system either. Classical mechanics is more than happy to deal with trinary computers. We just don't have an engineer capable of designing one.
CapitalismPrevails
1.4 / 5 (9) Jun 30, 2013
Will they be able to hack Bitcoin with this?
IamVal
1.7 / 5 (6) Jul 02, 2013
Bitcoins are safe until an actual quantum computer is built and it has millions of qubits.
ValeriaT
1.5 / 5 (8) Jul 02, 2013
from a purely physical point of view, quantum effects play a functional role in information processing in the D-Wave processor
This is not the actual core of the problem of classifying the D-Wave. For example, you can solve the Laplace equation (heat conduction) in real time with conductive paper connected to different potentials at different places. At the microscopic level each electron within the paper moves in agreement with quantum mechanics, but this still doesn't make a quantum computer out of this classical analog computer. The important criterion here is whether all functional components of the computer are entangled in mutual quantum correlation. So far it has only been shown that the D-Wave processor is entangled at the level of its 8-qubit modules, not at the level of the whole processor, which as a whole still behaves like a classical computer.
