For years, quantum computers have been the holy grail of quantum technology. A normal computer that has to solve a number of problems can only execute them one after the other. A quantum computer, in contrast, can occupy several different states at the same time, and could in principle try out different possible solutions of a problem at once, finding the correct answer much faster than a normal computer ever could. Diamonds could now bring physicists one important step closer to the quantum computer. At the Vienna University of Technology, microwaves have now been coupled to the quantum states of a diamond. The results of this research project have been published in the journal *Physical Review Letters*.

**Different Quantum Technologies in One Chip**

For a long time, scientists have been looking for suitable building blocks with which to construct a quantum computer, but without much success. Several ideas for systems that can store quantum mechanical information have been put forward, but quantum information is usually very fragile and easily destroyed. A component of a computer has to meet several criteria: it should be able to switch its state very rapidly, and it has to preserve its quantum state long enough for calculations to be carried out. “There is no single quantum system which meets all the requirements”, Johannes Majer says. He and his team therefore coupled two completely different kinds of quantum systems, in order to combine the advantages of both: microwaves and diamonds.

**Photons and Diamonds**

Our usual computers have a processor and a memory. The processor carries out fast calculations; the memory is supposed to retain the results for a long time. The relation between the two different quantum systems united on one quantum chip at TU Vienna is quite similar: fast manipulations are possible thanks to a so-called microwave resonator, whose quantum state is defined by photons in the microwave regime. This microwave resonator is coupled to a thin layer of diamond, in which quantum states can be stored.

**Desirable Flaws**

For jewellery, diamonds are supposed to be pure and flawless, but for quantum experiments, the opposite is required. Here, flaws in the diamond are desirable. When nitrogen atoms slip into the regular carbon structure of the diamond, the diamond becomes almost black, but it gains the ability to store quantum states. “We could show that in our quantum chip, quantum states can actually be transferred between the microwaves and the nitrogen centers in the diamond”, Robert Amsüss (TU Vienna) explains. The more nitrogen atoms take part in this transfer of quantum information, the more stable the diamond's “memory” becomes.

Surprisingly, it turned out that the angular momentum of the atomic nuclei can also store quantum information. “This could be the first step towards a nuclear memory device”, Johannes Majer suggests. But first, the diamond quantum chip in its present form has to be optimized. All the necessary parts are now there, creating the opportunity for reliable operations.

**Explore further:**
Yale scientists bring quantum optics to a microchip

**More information:**
R. Amsüss et al., *Phys. Rev. Lett.* 107, 060502 (2011). prl.aps.org/abstract/PRL/v107/i6/e060502

## Techno1

This never seems to get explained properly in articles, and is actually a fallacy.

There is a detailed presentation from a university lecture available, I think on YouTube, which explains this relationship, and it is a precise mathematical relationship. The computer cannot "try all possibilities simultaneously".

http://www.youtub...e=relmfu

and

http://www.youtub...e=relmfu

See the abstract.

It depends on the algorithm and application, but there is a finite, real limit to how many calculations the computer can do, and how many calculations are required to solve a problem.

## SincerelyTwo

Yep, it's already been proven that the set of complexity classes for which quantum computers will outperform digital computers is not very large. We'll see great improvements in solving problems in critical cases, but the benefits won't truly be as widespread as most journalists casually like to claim.

It's so incredibly easy to misunderstand details of quantum mechanics, and I'm nothing close to a physicist, but I do study mathematics regularly and am fully aware of what's involved in interpreting the meaning of equations so that you can make intuitive breakthroughs or describe an abstract system.

Interpretation requires such extreme care, correctness, clarity ... I have no doubt there are many crucial details of quantum mechanics that we are still not interpreting correctly to this day.

My usual criticisms are of physicists taking statistics and probability too literally. Look at quantum mechanics documentaries from just five years ago, they're absurd.

## Techno1

N vs sqrt(N) is a big difference, but it's not infinite, and it's not "solving all possibilities simultaneously".

That's:

N vs sqrt(N)

100 vs 10

1E6 vs 1E3

1E12 vs 1E6

1E18 vs 1E9

etc.

While these are incredible returns, they are far from what is commonly reported.
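The table above can be reproduced with a short sketch (a hypothetical illustration assuming the Grover-search model, where a classical exhaustive search costs on the order of N queries and the quantum version on the order of sqrt(N)):

```python
import math

# Grover-style quadratic speedup: classical exhaustive search over N
# items needs on the order of N queries, while Grover's algorithm
# needs on the order of sqrt(N) queries.
for n in [100, 10**6, 10**12, 10**18]:
    quantum = math.isqrt(n)  # exact integer square root
    print(f"{n:.0e} vs {quantum:.0e}")
    # 1e+02 vs 1e+01, 1e+06 vs 1e+03, 1e+12 vs 1e+06, 1e+18 vs 1e+09
```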

Some other classes of problems will not see as big a return, for example determining the winner of a game:

N^0.753 vs N^0.5

The right-hand side is, again, the square root of N.

This will probably make computers unbeatable in chess, but then again, on max difficulty they are nearly unbeatable already, and they already do the calculations in less than a second.
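The gap between the two exponents quoted for the game case can be compared numerically (the figures N^0.753 and N^0.5 are taken from the comment above; this is just arithmetic, not a derivation):

```python
# Compare the quoted scaling exponents for evaluating a game tree:
# roughly N**0.753 for the best classical randomized method vs N**0.5
# for the quantum one. The gap grows with N, but far more slowly than
# the full N-vs-sqrt(N) gap of unstructured search.
for n in [10**6, 10**12]:
    print(f"N={n:.0e}: {n**0.753:.1e} vs {n**0.5:.1e}")
```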

For searches and sorts, which are what most computing does, it will only be useful for certain classes of search and sort.

## Techno1

I think it's:

M*N vs sqrt(M*N)

Which it turns out doesn't even matter, since sqrt(M*N) = sqrt(M)*sqrt(N): the square root simply distributes over the product, so the same quadratic speedup applies.

Also, real-world algorithms probably won't be as optimized as theoretical "pure math" algorithms, which means you may not actually get a result as low as "sqrt(N)".

First of all, as would be obvious, "Sqrt(N)" needs to be rounded up to the nearest whole number of clock cycles.

Second, the real algorithm may not be ideal. The best possible real-world algorithm may require "C*sqrt(N)", "sqrt(C*N)", "sqrt(N*C)", or "sqrt(N)*C", where "C" is a constant, probably 1 or some other number.

In all of those cases, the number of cycles will still be smaller than on a classical computer, just probably not as much smaller...
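The effect of such a constant can be sketched directly (C = 1000 here is an arbitrary illustrative overhead, not a measured figure): C*sqrt(N) can even lose to N for small N, but wins decisively once N exceeds roughly C squared.

```python
import math

# A constant overhead C shifts the crossover point (around N = C**2)
# but not the asymptotic winner: C*sqrt(N) grows far more slowly than N.
C = 1000
for n in [10**4, 10**8, 10**12]:
    cost = C * math.isqrt(n)
    print(n, cost, "quantum wins" if cost < n else "classical wins")
```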

## eachus

Let me show this using Shor's algorithm, a quantum algorithm for finding the factors of an integer N. The time required is proportional to the cube of log N. If it takes you 1 day to factor a thousand-digit number, it takes 8 days to factor a two-thousand-digit number. (More likely it takes 4 times as long, on a QC with twice as many qubits. But I digress.)

How long does it take to factor a (hard) thousand-digit number using a classical computer? It hasn't been done, and it probably never will be. The best known classical algorithm (the general number field sieve) takes time that grows faster than any polynomial in log N, the size of the problem statement.
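The cube-of-log-N scaling above can be checked with a toy calculation (`shor_time` is a hypothetical helper in arbitrary units, not a real simulation):

```python
# Shor's algorithm runs in time roughly proportional to (log N)**3,
# i.e. the cube of the digit count d: doubling d multiplies the
# running time by 2**3 = 8.
def shor_time(digits):
    """Relative running time for factoring a d-digit number (arbitrary units)."""
    return digits ** 3

print(shor_time(2000) / shor_time(1000))  # prints 8.0
```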

## Techno1

I am aware of that as well, but what you are talking about is only going to be used in cryptography and a few other fields.

Moreover, having the ability to crack such problems is more likely to favor HACKERS than legitimate purposes.

Can you imagine what the world would be like when hackers, terrorists, and rogue states could literally crack ANY computer code, communication protocol, or encryption scheme? Absolutely no communication would be secure, and they would steal from your bank account at will too...