Toward mass-producible quantum computers

May 26, 2017 by Larry Hardesty
A team of researchers from MIT, Harvard University, and Sandia National Laboratories reports a new technique for creating targeted defects in diamond materials, which is simpler and more precise than its predecessors and could benefit diamond-based quantum computing devices. Credit: Massachusetts Institute of Technology

Quantum computers are experimental devices that offer large speedups on some computational problems. One promising approach to building them involves harnessing nanometer-scale atomic defects in diamond materials.

But practical, diamond-based quantum computing devices will require the ability to position those defects at precise locations in complex diamond structures, where the defects can function as qubits, the basic units of information in quantum computing. In today's issue of Nature Communications, a team of researchers from MIT, Harvard University, and Sandia National Laboratories reports a new technique for creating targeted defects, which is simpler and more precise than its predecessors.

In experiments, the defects produced by the technique were, on average, within 50 nanometers of their ideal locations.

"The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it," says Dirk Englund, an associate professor of electrical engineering and computer science who led the MIT team. "We're almost there with this. These emitters are almost perfect."

The new paper has 15 co-authors. Seven are from MIT, including Englund and first author Tim Schröder, who was a postdoc in Englund's lab when the work was done and is now an assistant professor at the University of Copenhagen's Niels Bohr Institute. Edward Bielejec led the Sandia team, and physics professor Mikhail Lukin led the Harvard team.

Appealing defects

Quantum computers, which are still largely hypothetical, exploit the phenomenon of quantum "superposition," or the counterintuitive ability of small particles to inhabit contradictory physical states at the same time. An electron, for instance, can be said to be in more than one location simultaneously, or to have both of two opposed magnetic orientations.

Where a bit in a conventional computer can represent zero or one, a "qubit," or quantum bit, can represent zero, one, or both at the same time. It's the ability of strings of qubits to, in some sense, simultaneously explore multiple solutions to a problem that promises computational speedups.
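The superposition idea above can be sketched numerically. A minimal illustration (not from the article; the helper functions here are hypothetical, written for this example): a qubit is stored as two amplitudes whose squared magnitudes give the probabilities of reading out 0 or 1.

```python
import math
import random

def make_superposition():
    """An equal superposition of 0 and 1, like the state (|0> + |1>)/sqrt(2)."""
    a = 1 / math.sqrt(2)
    return (a, a)  # (amplitude of 0, amplitude of 1)

def measure(state, rng=random.random):
    """Collapse the superposition: return 0 or 1 with the amplitudes' probabilities."""
    alpha, beta = state
    p0 = abs(alpha) ** 2  # probability of reading out 0
    return 0 if rng() < p0 else 1

state = make_superposition()
random.seed(0)
counts = [0, 0]
for _ in range(10000):
    counts[measure(state)] += 1
print(counts)  # roughly an even split between 0 and 1
```

This toy model only captures a single qubit's probabilistic readout; the computational speedups mentioned above come from interference among many entangled qubits, which a classical sketch like this cannot reproduce efficiently.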

Diamond-defect qubits result from the combination of "vacancies," which are locations in the diamond's crystal lattice where there should be a carbon atom but there isn't one, and "dopants," which are atoms of materials other than carbon that have found their way into the lattice. Together, the dopant and the vacancy create a dopant-vacancy "center," which has free electrons associated with it. The electrons' magnetic orientation, or "spin," which can be in superposition, constitutes the qubit.

A perennial problem in the design of quantum computers is how to read information out of qubits. Diamond defects present a simple solution, because they are natural light emitters. In fact, the light particles emitted by diamond defects can preserve the superposition of the qubits, so they could move quantum information between computing devices.

Silicon switch

The most-studied diamond defect is the nitrogen-vacancy center, which can maintain superposition longer than any other candidate qubit. But it emits light in a relatively broad spectrum of frequencies, which can lead to inaccuracies in the measurements on which quantum computation relies.

In their new paper, the MIT, Harvard, and Sandia researchers instead use silicon-vacancy centers, which emit light in a very narrow band of frequencies. They don't naturally maintain superposition as well, but theory suggests that cooling them down to temperatures in the millikelvin range—fractions of a degree above absolute zero—could solve that problem. (Nitrogen-vacancy-center qubits require cooling to a relatively balmy 4 kelvins.)

To be readable, however, the signals from light-emitting qubits have to be amplified, and it has to be possible to direct them and recombine them to perform computations. That's why the ability to precisely locate defects is important: It's easier to etch optical circuits into a diamond and then insert the defects in the right places than to create defects at random and then try to construct optical circuits around them.

In the process described in the new paper, the MIT and Harvard researchers first planed a synthetic diamond down until it was only 200 nanometers thick. Then they etched optical cavities into the diamond's surface. These increase the brightness of the light emitted by the defects (while shortening the emission times).

Then they sent the diamond to the Sandia team, who had customized a commercial device called the Nano-Implanter to eject streams of silicon ions. The Sandia researchers fired 20 to 30 silicon ions into each of the optical cavities in the diamond and sent it back to Cambridge.

Mobile vacancies

At this point, only about 2 percent of the cavities had associated silicon-vacancy centers. But the MIT and Harvard researchers have also developed processes for blasting the diamond with beams of electrons to produce more vacancies, and then heating the diamond to about 1,000 degrees Celsius, which causes the vacancies to move around the crystal lattice so they can bond with silicon atoms.

After the researchers had subjected the diamond to these two processes, the yield had increased tenfold, to 20 percent. In principle, repetitions of the processes should increase the yield of silicon-vacancy centers still further.

When the researchers analyzed the locations of the silicon-vacancy centers, they found that they were within about 50 nanometers of their optimal positions at the edge of the cavity. That translated to emitted light that was about 85 to 90 percent as bright as it could be, which is still very good.

Comments

rrrander
3 / 5 (2) May 27, 2017
10 years ago, a Canadian company claimed to have a working quantum computer. They even sold some to credulous (institutional, government) buyers. They are no closer to quantum computing now than 2 decades ago.
vacuumforce
not rated yet May 27, 2017
I would dare reckon there is no such thing as Quantum computing.

Any particle with a spin adheres to counterintuitive logic.

Our brains simply aren't wired for it as we live in an inter-quantum world.
somefingguy
5 / 5 (2) May 27, 2017
10 years ago, a Canadian company claimed to have a working quantum computer. They even sold some to credulous (institutional, government) buyers. They are no closer to quantum computing now than 2 decades ago.


Actually it is a real quantum computer; Google and NASA both bought a D-Wave system. The only drawback is that the current system only supports 1,000 qubits, anything above that and the desired tunnel effect becomes too unreliable.

http://www.popula...omputer/
derphys
not rated yet May 28, 2017
Any large molecule or any microscopic system is a quantum computer, with uncontrolled qubits, which is impossible to simulate with classical computers of any size.
antialias_physorg
5 / 5 (3) May 28, 2017
Any large molecule or any microscopic system is a quantum computer

No, any small unit is a quantum system. To be a quantum computer you have to be able to do quantum computations (which include input and output).
Dingbone
not rated yet May 28, 2017
They are no closer to quantum computing now than 2 decades ago.
What are the physical limits of processing speed of classical computers? Of course, quantum uncertainty... And what are the limits of speed of quantum computers? Well, quantum un... oops - what did we do wrong?
Da Schneib
not rated yet May 28, 2017
They gotta get over that temperature problem to make it mass producible, or at least mass usable. Anything for mass market use has to be robust.
Parsec
not rated yet May 29, 2017
Dingbone - the physical limit of current computers is not quantum uncertainty. It is the fixed representation of the numbers used. Quantum computers use, for lack of a better description, the equivalent of floating point numbers, so that each qubit represents an entire range of numbers. This allows SOME algorithms to change the big O of their execution time, making for example some algorithms that run in O(n!) time able to run in O(p(n)) time. That isn't just orders of magnitude speedup. That takes some algorithms that would take a regular computer the size of the universe (assuming each computing element was the size of a single atom), running at the speed of light for longer than the entire time the universe has existed, and converts them into an afternoon's work.

But a quantum computer is NOT a device which can be programmed to do anything like a regular computer. Only certain algorithms will run on it.
Parsec
not rated yet May 29, 2017
Da Schneib - it all depends on the number of qubits we are talking about and the computer's size. After all, very low temperatures just mean a refrigeration unit must be included. If the part holding the qubits is the size of a thimble, then it would probably be commercially viable with a helium cooler (4 kelvins). That's still higher than what is presented here, but not outlandishly unreasonable.
Da Schneib
not rated yet May 29, 2017
@Parsec, you can't let the power go out. This is not robust.
Dingbone
not rated yet May 29, 2017
Dingbone - the physical limits of current computers is not quantum uncertainty. It is the fixed representation of the numbers used. Quantum computers use, for the lack of a better description, the equivalent of floating point numbers, so that each qbit represents an entire range of numbers
The speed of computers is defined as the volume of information processed in a given moment. The volume of information expressed in numbers is given by their precision. Classical computers can process very long, exact numbers, but at relatively low speed. Quantum computers can be very fast, but their results are rough and indeterministic. To achieve the same precision as classical computers, the same calculation must be repeated many times, and the speed advantage of quantum computing is lost. After all, this theorem has already been proven.
Dingbone
not rated yet May 29, 2017
That means that quantum computers are sort of a technological hype: massive cooling and overclocking of classical computers would provide similarly indeterministic results. There are parallel applications where such indeterminism is actually preferred (optimization algorithms become less sensitive to local optima, for example).
Dingbone
not rated yet May 29, 2017
From a more general perspective, it can be proven that both the purely classical and the purely quantum approach have their limit in the uncertainty principle, and as such they're actually slightly suboptimal. The uncertainty-principle limit can be beaten with a proper combination of quantum and classical approaches (1, 2, 3). Not quite surprisingly, Nature (which utilizes evolution for its thorough optimization) utilizes this trick too. The brain works like a mixture of classical and quantum computers: it utilizes one-dimensional quantum-like solitons, but these solitons are macroscopic and as such are not so sensitive to environmental noise and temperature.
Whydening Gyre
not rated yet May 29, 2017
Dingbone - the physical limits of current computers is not quantum uncertainty. It is the fixed representation of the numbers used. Quantum computers use, for the lack of a better description, the equivalent of floating point numbers, so that each qubit represents an entire range of numbers. This allows SOME algorithms to change the big O of their execution time ... But a quantum computer is NOT a device which can be programmed to do anything like a regular computer. Only certain algorithms will run on it.

Wow.. You just described the Universe...
