Researchers develop the first-ever quantum device that detects and corrects its own errors

March 4, 2015 by Sonia Fernandez
A photograph of the nine-qubit device. The device consists of nine superconducting 'Xmon' transmon qubits in a row. Qubits interact with their nearest neighbors to detect and correct errors. Credit: Julian Kelly

When scientists develop a full quantum computer, the world of computing will undergo a revolution of sophistication, speed and energy efficiency that will make even our beefiest conventional machines seem like Stone Age clunkers by comparison.

But, before that happens, physicists like the ones in UC Santa Barbara physics professor John Martinis' lab will have to create circuitry that takes advantage of the marvelous computing prowess promised by the quantum bit ("qubit"), while compensating for its high vulnerability to environmentally induced error.

In what they are calling a major milestone, the researchers in the Martinis Lab have developed quantum circuitry that self-checks for errors and suppresses them, preserving the qubits' state(s) and imbuing the system with the highly sought-after reliability that will prove foundational for the building of large-scale superconducting quantum computers.

It turns out keeping qubits error-free, or stable enough to reproduce the same result time and time again, is one of the major hurdles scientists on the forefront of quantum computing face.

"One of the biggest challenges in quantum computing is that qubits are inherently faulty," said Julian Kelly, graduate student researcher and co-lead author of a research paper that was published in the journal Nature. "So if you store some information in them, they'll forget it."

Unlike classical computing, in which a bit occupies exactly one of two binary ("yes/no," or "true/false") positions, a qubit can exist in a superposition of both positions at once. It is this property, called "superposition," that gives quantum computers their phenomenal computational power, but it is also this characteristic that makes qubits prone to "flipping," especially in unstable environments, and thus difficult to work with.

"It's hard to process information if it disappears," said Kelly.

However, that obstacle may just have been cleared by Kelly, postdoctoral researcher Rami Barends, staff scientist Austin Fowler and others in the Martinis Group.

The error correction process involves a scheme in which several qubits work together to preserve the information, said Kelly: rather than living in a single qubit, the information is stored across several of them.

"And the idea is that we build this system of nine qubits, which can then look for errors," he said. Qubits in the grid are responsible for safeguarding the information contained in their neighbors, he explained, in a repetitive error detection and correction system that can protect the appropriate information and store it longer than any individual qubit can.
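The repetitive detection-and-correction cycle Kelly describes can be illustrated with a toy classical analogue. This is a minimal sketch, not the actual device: five "data" bits stand in for the five data qubits, and parity checks between neighbors play the role of the four measurement qubits (a real device uses quantum stabilizer measurements, not direct reads).

```python
# Toy classical analogue of the nine-qubit bit-flip scheme: five data bits
# hold one logical bit, and neighbor-parity checks locate single flips.
# Illustrative only -- real qubits require quantum stabilizer measurements.

def encode(logical_bit):
    """Store one logical bit redundantly across five data bits."""
    return [logical_bit] * 5

def syndromes(data):
    """Parity of each neighboring pair; 1 marks a boundary created by a flip."""
    return [data[i] ^ data[i + 1] for i in range(len(data) - 1)]

def correct(data):
    """Use the parity syndrome to locate and undo a single bit flip."""
    s = syndromes(data)
    if not any(s):
        return data  # no error detected this cycle
    fired = [i for i, v in enumerate(s) if v]
    # A single interior flip fires the two checks on either side of it;
    # a flip at either end fires only one check.
    if len(fired) == 2:
        flip = fired[1]            # interior bit between the two fired checks
    elif fired == [0]:
        flip = 0                   # leftmost bit flipped
    else:
        flip = len(data) - 1       # rightmost bit flipped
    data = data[:]
    data[flip] ^= 1
    return data

data = encode(1)
data[2] ^= 1          # the environment flips one bit
print(correct(data))  # -> [1, 1, 1, 1, 1]
```

Because an interior flip fires two adjacent checks while an end flip fires one, the syndrome pattern is enough to locate and undo a single error without ever interpreting the stored value.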

"This is the first time a quantum device has been built that is capable of correcting its own errors," said Fowler. For the kind of complex calculations the researchers envision for an actual quantum computer, up to a hundred million qubits would be needed; before that can happen, a robust self-checking and error prevention system is necessary.

Key to this quantum error detection and correction system is a scheme developed by Fowler called the surface code. It uses parity information (a measurement of whether the data has changed, not of the data itself) rather than the duplication of the original information that classical error correction relies on. That way, the original information being preserved in the qubits remains unobserved.

Why? Because quantum physics.

"You can't measure a quantum state, and expect it to still be quantum," explained Barends. The very act of measurement locks the qubit into a single state and it then loses its superpositioning power, he said. Therefore, in something akin to a Sudoku puzzle, the parity values of data qubits in a qubit array are taken by adjacent measurement qubits, which essentially assess the information in the data qubits by measuring around them.

"So you pull out just enough information to detect errors, but not enough to peek under the hood and destroy the quantum-ness," said Kelly.
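Kelly's point can be made concrete with a small sketch (plain Python, with classical bits as stand-ins for qubits): two opposite codewords that suffer the same flip produce the identical parity syndrome, so the checks locate the error while revealing nothing about which logical value is stored.

```python
def syndromes(bits):
    # Parity of each neighboring pair; 1 marks a boundary created by a flip.
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

zero = [0, 0, 0, 0, 0]   # logical 0, stored redundantly
one = [1, 1, 1, 1, 1]    # logical 1, stored redundantly
zero[2] ^= 1             # the same environmental flip hits both codewords
one[2] ^= 1

print(syndromes(zero))   # -> [0, 1, 1, 0]
print(syndromes(one))    # -> [0, 1, 1, 0]  (same syndrome; data value never exposed)
```

The syndrome pinpoints where the flip happened, but it is identical for both stored values, which is the classical shadow of measuring "around" the data qubits rather than measuring them directly.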

This development represents a meeting of the best of the physical and the theoretical in quantum computing: the latest in qubit stabilization combined with advances in the algorithms behind the logic of quantum error correction.

"It's a major milestone," said Barends. "Because it means that the ideas people have had for decades are actually doable in a real system."

The Martinis Group continues to refine its research to develop this important new tool. This particular scheme has been proven to protect against the "bit-flip" error; the researchers now have their eye on correcting the complementary error, called a "phase-flip," as well as on running the error correction cycles for longer periods to see what behaviors might emerge.

Martinis and the senior members of his research group have, since this research was performed, entered into a partnership with Google.


More information: "State preservation by repetitive error detection in a superconducting quantum circuit," Nature. DOI: 10.1038/nature14270



Comments

wrestling_guy159
1.3 / 5 (4) Mar 04, 2015
Quantum computers will be good at some task, but not all of them unless a better quantum algorithm is developed for a specific task that beats current implementations. We will absolutely still need normal computers in the future, regardless if quantum computers can be realized or not. Besides, no ordinary person will be able to use them until someone invents a room temperature quantum computer small enough to fit in a chassis. That won't happen for several decades after the first full scale quantum computer is created at best.
Zera
4.8 / 5 (8) Mar 04, 2015
That won't happen for several decades after the first full scale quantum computer is created at best.


How are you estimating this timeframe? By the 1940's we had devices recognisable as computers, by the 1970's we had personal computers, by the 90's we had devices that were in every household, by 2010, the device in my pocket is many magnitudes faster with larger storage capacity than anything commercially available in 1990.

We as a culture demand speed, we demand the next bigger, better device, and if quantum computing exists, i don't imagine it will be necessary for each individual to have that much computational power. With the more likely being that information is pre-processed and then delivered to each user in perfectly packaged bits.
hillmeister
not rated yet Mar 04, 2015
This is AWESOME! It's the first leap to getting us universal quantum computers! :D
Whydening Gyre
3 / 5 (2) Mar 04, 2015
Wrestling guy - It's simple economics...
Osiris1
not rated yet Mar 04, 2015
That system might also be one day used in quantum error free teleportation of matter across light years of space in an instant.....as well as quantum communication the same way. This may be the way 'E.T. the Xtratesticle communicates in fact. Which may also be the reason our EM spectrum is soooooo quiet.
Cole
3.8 / 5 (5) Mar 05, 2015
Hopefully this technology won't be in people's hands for at least say, 15 or 20 more years. The reason? I am a computer programmer and I want to retire before I have to get into all this crap. People thought the emergence of C++ and Net Centric computing were complicated lol wait until you have to learn to write a program in a world where boolean isn't boolean anymore...
Cole
5 / 5 (2) Mar 05, 2015
Zera , that equation has long slowed, if you think you are getting 10 x the computing capacity from the I3 to the I7, you are severely misled. At best a modified I7 could pull 1.5x that of an I3 and 1.2 more than an I5. It has all slowed , simply put, we are reaching a point in computing that we must move on to Quantum computing. Steadily we are finding ways to keep close but now we are falling behind that equation. By 2020, normal CPU's will have stalled in advancement to the point you could actually go 4-5 years without upgrading. Right now you can go about 2 , maybe 3 if you pimped it out. In 1999, you couldn't go 6 months, always something new, always something bigger, you got a 256 memory card? Dedicated graphics, awesome! but this new game requires a 512 and we just dropped a 1 gig, oh you just bought that card? Tough luck man.... Thankfully , we are no longer in that stage, though windows still keeps pushing bs on us.
DemoniWaari
not rated yet Mar 05, 2015
That system might also be one day used in quantum error free teleportation of matter across light years of space in an instant.....as well as quantum communication the same way. This may be the way 'E.T. the Xtratesticle communicates in fact. Which may also be the reason our EM spectrum is soooooo quiet.


You still need a classical channel in order to teleport information, so I doubt this would explain why the EM spectrum seems quiet.

Hopefully this technology won't be in the peoples hands for at least say, 15 or 20 more years...


Yeah don't worry. It will take way longer for something actually useful to come out of the lab (assuming no paradigm shifts). Granted that the dwave might have some uses already.
wrestling_guy159
1 / 5 (2) Mar 05, 2015
How are you estimating this timeframe?

I estimate by how hard the brightest quantum physicists have been working without any major breakthroughs yet. Every announcement is "one step closer," yet there still isn't an end in sight. It is estimated that we will need millions of qubits just to compete with current computers. The D-Wave "quantum computer" is about the size of 4 refrigerators.

Wrestling guy - It's simple economics...

Economics won't be the driving force for personal quantum computers. There is a pretty high possibility that only government labs or the CIA will ever use them. Current computers are already fast enough for 99.9% of all things we do.
antialias_physorg
5 / 5 (4) Mar 05, 2015
By the 1940's we had devices recognisable as computers, by the 1970's we had personal computers, by the 90's we had devices that were in every household, by 2010, the device in my pocket is many magnitudes faster with larger storage capacity than anything commerically available in 1990.

Add to that that in that timeframe all the peripherals (graphics cards, etc.) had to be invented...which do not have to be reinvented for a quantum computer. The quantum computer 'just' needs a new CPU type and a new OS type/software to work on it. Once the hardware issue is solved the software will come along quickly. Even for the software side only some core concepts need to be changed (where quantum computing actually confers an advantage). You're not going to use quantum operations to design your GUI.
