Quantum computer emulated by a classical system

May 27, 2015 by Lisa Zyga feature
Drs. Granville Ott (left) and Brian La Cour (center) with student Michael Starkey (right) beside their prototype quantum emulation device. Credit: Applied Research Laboratories, The University of Texas at Austin

(Phys.org)—Quantum computers are inherently different from their classical counterparts because they involve quantum phenomena, such as superposition and entanglement, which do not exist in classical digital computers. But in a new paper, physicists have shown that a classical analog computer can be used to emulate a quantum computer, along with quantum superposition and entanglement, with the result that the fully classical system behaves like a true quantum computer.

Physicist Brian La Cour and electrical engineer Granville Ott at Applied Research Laboratories, The University of Texas at Austin (ARL:UT), have published a paper on the classical emulation of a quantum computer in a recent issue of New Journal of Physics. Besides having fundamental interest, using classical systems to emulate quantum computers could have practical advantages, since such quantum emulation devices would be easier to build and more robust to decoherence compared with true quantum computers.

"We hope that this work removes some of the mystery and 'weirdness' associated with quantum computing by providing a concrete, classical analog," La Cour told Phys.org. "The insights gained should help develop exciting new technology in both classical analog computing and true quantum computing."

As La Cour and Ott explain, quantum computers have been simulated in the past using software on a classical computer, but these simulations are merely numerical representations of the quantum computer's operations. In contrast, emulating a quantum computer involves physically representing the qubit structure and displaying actual quantum behavior. One key quantum behavior that can be emulated, but not simulated, is parallelism. Parallelism allows multiple operations on the data to be performed simultaneously, a trait that arises from superposition and entanglement and enables quantum computers to perform certain computations far faster than classical machines.
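
As a toy illustration of that parallelism (generic state-vector arithmetic in Python, not the authors' signal construction), note that a single gate application updates the amplitudes of all 2^n basis states at once:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
n = 3
gate = H
for _ in range(n - 1):
    gate = np.kron(gate, H)   # Hadamard on every qubit: H (x) H (x) H

state = np.zeros(2 ** n)
state[0] = 1.0                # start in |000>
state = gate @ state          # one application updates all 8 amplitudes
print(state)                  # uniform superposition: every entry 1/sqrt(8)
```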

To emulate a quantum computer, the physicists' approach uses electronic signals to represent qubits, in which a qubit's state is encoded in the amplitudes and frequencies of the signals in a complex mathematical way. Although the scientists use electronic signals, they explain that any kind of signal, such as acoustic and electromagnetic waves, would also work.
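
The following sketch illustrates the general idea under assumed conventions (one tone per basis state, with the quantum amplitude setting that tone's magnitude and phase); it is not the authors' exact encoding, which is more intricate:

```python
import numpy as np

def encode_state(amplitudes, base_freq=1.0, duration=10.0, rate=1000):
    """Sum one tone per basis state; each complex amplitude sets the
    magnitude and phase of its tone."""
    t = np.arange(0.0, duration, 1.0 / rate)
    signal = np.zeros_like(t)
    for k, a in enumerate(amplitudes):
        f = base_freq * (k + 1)   # a distinct frequency for each basis state
        signal += np.abs(a) * np.cos(2 * np.pi * f * t + np.angle(a))
    return t, signal

# A single qubit in the superposition (|0> + |1>)/sqrt(2):
amps = np.array([1.0, 1.0]) / np.sqrt(2)
t, s = encode_state(amps)
```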

Even though this classical system emulates quantum phenomena and behaves like a quantum computer, the scientists emphasize that it is still considered to be classical and not quantum.

Video: Emulation of quantum superpositions using classical signals. Credit: Applied Research Laboratories, The University of Texas at Austin

"This is an important point," La Cour explained. "Superposition is a property of waves adding coherently, a phenomenon that is exhibited by many classical systems, including ours.

"Entanglement is a more subtle issue," he continued, describing entanglement as a "purely mathematical property of waves."

"Since our classical signals are described by the same mathematics as a true quantum system, they can exhibit these same properties."

He added that this kind of entanglement does not violate Bell's inequality, which is a widely used way to test for entanglement.

"Entanglement as a statistical phenomenon, as exhibited by such things as violations of Bell's inequality, is rather a different beast," La Cour explained. "We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of entanglement as well, as described in another recent publication."

In the current paper, La Cour and Ott describe how their system can be constructed using basic analog electronic components, and note that the biggest challenge is fitting a large number of these components onto a single integrated circuit in order to represent as many qubits as possible. Considering that today's best semiconductor technology can fit more than a billion transistors on an integrated circuit, the scientists estimate that this transistor density corresponds to about 30 qubits. An increase in transistor density by a factor of 1000, which according to Moore's law may be achieved in the next 20 to 30 years, would correspond to 40 qubits.
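
The arithmetic behind these estimates is easy to check if one assumes, as the quoted figures suggest, that each additional qubit doubles the number of analog components required, so n qubits need roughly 2^n components:

```python
import math

components_today = 1e9                      # ~a billion transistors per chip
print(math.log2(components_today))          # ~29.9 -> about 30 qubits
print(math.log2(components_today * 1000))   # ~39.9 -> about 40 qubits
```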

This 40-qubit limit is also enforced by a second, more fundamental restriction, which arises from the bandwidth of the signal. The scientists estimate that a signal duration of a reasonable 10 seconds can accommodate 40 qubits; increasing the duration to 10 hours would only increase this to 50 qubits, and a one-year duration would only accommodate 60 qubits. Due to this scaling behavior, the physicists even calculated that a signal duration of the approximate age of the universe (13.77 billion years) could accommodate about 95 qubits, while a signal resolved down to the Planck time scale (10⁻⁴³ seconds) would correspond to 176 qubits.
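
One way to reproduce these figures (an assumed form of the limit, not a formula quoted from the paper) is to require the signal's time-bandwidth product B·T to cover all 2^n frequency components, giving n ≈ log₂(B·T); an assumed bandwidth of about 100 GHz then matches the numbers above:

```python
import math

B = 1e11  # assumed bandwidth in Hz (~100 GHz), chosen to match 10 s -> 40 qubits
durations = {
    "10 seconds": 10.0,
    "10 hours": 10 * 3600.0,
    "one year": 3.156e7,
    "age of the universe": 13.77e9 * 3.156e7,
}
for label, T in durations.items():
    print(f"{label}: ~{math.log2(B * T):.0f} qubits")
# prints ~40, ~52, ~61, and ~95 qubits, in line with the figures above
```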

Considering that thousands of qubits are needed for some complex tasks, such as certain encryption techniques, this scheme clearly faces some insurmountable limits. Nevertheless, the scientists note that 40 qubits is still sufficient for some low-qubit applications, such as quantum simulations. Because the quantum emulation device offers practical advantages over quantum computers and performance advantages over most classical computers, it could one day prove very useful. For now, the next step will be building the device.

"Efforts are currently underway to build a two-qubit prototype device capable of demonstrating ," La Cour said. "The enclosed photo [see above] shows the current quantum emulation device as a lovely assortment of breadboarded electronics put together by one of my students, Mr. Michael Starkey. We are hoping to get future funding to support the development of an actual chip. Leveraging quantum parallelism, we believe that a coprocessor with as few as 10 could rival the performance of a modern Intel Core at certain computational tasks. Fault tolerance is another important issue that we studying. Due to the similarities in mathematical structure, we believe the same quantum error correction algorithms used to make quantum computers fault tolerant could be used for our quantum emulation device as well."

More information: Brian R. La Cour and Granville E. Ott, "Signal-based classical emulation of a universal quantum computer," New Journal of Physics 17, 053017 (2015). DOI: 10.1088/1367-2630/17/5/053017


17 comments

Eikka
4.4 / 5 (7) May 27, 2015
An increase in transistor density of a factor of 1000, which according to Moore's law may be achieved in the next 20 to 30 years, would correspond to 40 qubits.


Won't happen. The current technology node is at 14 nm, and Moore's law cannot undercut the size of individual atoms. A single-electron transistor is about 1.5 nm across, so linear scaling has at most a factor of about 10 left, and chip transistor density can therefore increase by at most a factor of about 100.

In practice, you won't see a 100-fold increase in transistor density, because more transistors also require more interconnects, and the number of interconnects grows faster than the number of transistors. That's the connectivity problem: if each transistor connects to two others, the number of wires grows twice as fast as the number of transistors, and the wires take vastly more space than the transistors.
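
A quick check of the commenter's arithmetic, using the figures as stated in the comment:

```python
node_now, node_min = 14.0, 1.5   # nm: current node vs. single-electron transistor
linear = node_now / node_min     # ~9.3x linear shrink remaining
print(linear, linear ** 2)       # ~9.3, ~87 -> roughly a 100x density ceiling
```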

DarkLordKelvin
4.2 / 5 (5) May 27, 2015
Ummm ... I am pretty sure that if you're not violating Bell's inequality (and the scientists themselves explicitly say that they haven't), then you haven't really simulated quantum entanglement.
SciTechdude
not rated yet May 27, 2015
Any particular reason they couldn't just run 100 or 1000 machines in parallel as a Cerberus Cluster?
javjav
not rated yet May 27, 2015
Increasing the density 100 to 1000 times would mean building a quantum computer in itself, as any bit of that subatomic size would necessarily be a quantum bit. So that proposal would become a quantum computer emulating a classical computer that emulates a quantum computer.
El_Nose
5 / 5 (1) May 27, 2015
@eikka

If we were to progress with transistors in their current state, then yes, you are correct. I can only offer that cutting-edge tech is looking toward 3 goals: 1) stacking layers of components vertically, so we would now have, say, towers of CPUs; 2) molecular transistors; 3) spintronics.

All offer greater density; it is yet to be determined whether we would see speed increases.

@javjav

Not really. If we had 1 nm gates for transistors, it would still be a transistor, either on or off... Spintronics and QCs can be both ON and OFF at the same time. This is only possible in a spintronic or QC, and QCs provide a method of using it for calculations. Very different, sir.
Urgelt
5 / 5 (1) May 27, 2015
I'll expand a bit on what El_Nose said.

Progress is being made in optical computing, which also offers potential for sustaining Moore's Law by reducing power consumption and raising speeds of components. On top of that, mechanical hard discs are slowly being phased out in favor of steadily-improving solid state discs with accompanying speed and power gains. And we haven't yet done very much with memristors. We probably will. Toss in quantum computing gains - if they happen - and 3D chip designs, spintronics, high-temperature superconductor research, and materials research, all of which stand to produce gains, and it's pretty obvious to anyone who pays attention that Moore's Law still has a long, long way to go before it poops out.

Naysayers have been predicting the imminent demise of Moore's Law for over forty years. They were wrong. They're still wrong. Ignore them.
qquax
1 / 5 (1) May 28, 2015
@DarkLordKelvin, bingo

This is just embarrassing. Bell's inequality loopholes have been discussed ever since Bell found the relation, which he himself didn't expect to hold; he was the first to look for loopholes, and the discussion, the search, and the subsequent experimental closing of loopholes have occupied physics ever since.

Yet they think they can make the issue go away with a bit of hand waving and 'noise'. They are not only breathtakingly ignorant of the nature of entanglement but also of the entire research history around testing Bell's inequality. I guess everything is truly bigger in Texas, and that includes academic ignorance and ego, while degrees are apparently easy to come by.

Tried to find a CV of the one physicist cited in the article, but only found this:
http://texas-empl...-La-Cour

Sorry, buddy, but you're not worth that salary; you should have known better.
http://www.scienc...pr98.pdf
del2
5 / 5 (1) May 28, 2015
QCs can be both ON and OFF at the same time

No, they are in a superposition of ON and OFF states. That's not the same thing.
There is what Feynman calls an amplitude (related to probability) for ON and an amplitude for OFF. It is neither ON nor OFF. If you measure it, it will then be either ON or OFF (the probability for one of the states will become 1). In Copenhagen interpretation terminology, the wavefunction has collapsed.
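
A toy illustration of this point (standard textbook quantum mechanics, not anything specific to the paper): the state holds two amplitudes, and measurement yields a single definite outcome with probability given by the squared magnitude:

```python
import numpy as np

state = np.array([1.0, 1.0j]) / np.sqrt(2)   # amplitudes for OFF and ON
probs = np.abs(state) ** 2                   # Born rule: [0.5, 0.5]
outcome = np.random.choice([0, 1], p=probs)  # measurement picks one outcome
print(probs, outcome)  # before measuring: neither ON nor OFF, just amplitudes
```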
Eikka
not rated yet May 28, 2015
If we were to progress with transistors in their current state, then yes, you are correct. I can only offer that cutting-edge tech is looking toward 3 goals: 1) stacking layers of components vertically, so we would now have, say, towers of CPUs; 2) molecular transistors; 3) spintronics.


1) Stacked chips have heat dissipation issues, and we're already building vertical transistors to make them smaller in area.
2) The smallest possible transistor is still about 1.5 nm, and it works with a single electron.
3) Spintronics has the same size limitation too.

And we're talking about analog circuits built on transistors here, so spintronics doesn't really apply, since it's a take on digital information processing that uses the spins of electrons to represent bits.

In analog circuits you also need resistances, capacitances and inductances, which take up tremendous amounts of chip surface and dissipate heat.
Urgelt
not rated yet May 28, 2015
Eikka, the study was transistor-based, but the subject is more general: emulated quantum computing. And you further broadened the subject by predicting the imminent demise of Moore's Law. Now you seem to assert that Moore's Law is only about lithographically-etched 2D silicon.

That stipulation is your only route to defending your prediction of Moore's Law's imminent demise. Fortunately for us, engineers and researchers do not feel a need to constrain themselves to 2D silicon. And so Moore's Law is nowhere near dead.
gralp
not rated yet May 28, 2015
@DarkLordKelvin, sorry but no: entanglement is not something that can't be simulated or even emulated by classical means. It is pretty straightforward to design a probabilistic framework of 2^N p-bits which will cover the state space of N q-bits. The main problem is that the relative volume of the subspace corresponding to the quantum system in such an immersion diminishes faster than exponentially with N. Thus controlling more than a couple of simulated qubits is as difficult as controlling real ones. This is the spatial approach.

Alternatively, one can simulate quantum states as temporal signals, and this is what the authors of this paper have undertaken. Waves can be superposed, hence one can design "entangled" states as well - no problem. It has been emphasized many times in the past that Bell's inequality is a mathematical, not a physical, statement; therefore if one is able to violate it, one simply violates its assumptions. The authors are surely ready to point out which assumption is not satisfied.
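
The exponential cost behind the "spatial approach" is easy to make concrete (a generic state-vector illustration, not the scheme from the paper or the comment): an N-qubit state, entangled or not, is a vector of 2^N complex amplitudes, so classical storage doubles with each added qubit:

```python
import numpy as np

def bell_pair():
    """(|00> + |11>)/sqrt(2): a maximally entangled 2-qubit state."""
    state = np.zeros(4, dtype=complex)
    state[0b00] = state[0b11] = 1.0 / np.sqrt(2)
    return state

print(bell_pair())
for n in (2, 10, 30):
    print(n, "qubits ->", 2 ** n, "complex amplitudes")
# 30 qubits already needs ~10**9 amplitudes; the cost doubles per qubit
```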
qquax
5 / 5 (1) May 28, 2015
After taking a closer look at the original papers, I have to retract my earlier vitriol. The authors refer to Spreeuw '98 in their paper, and Brian R. La Cour clearly shows that he knows the literature in
http://link.sprin...ext.html

So he's definitely worth the salary he draws, and the outlandish-sounding comment about 'just adding noise' to engineer a violation of Bell's inequalities is not just hand waving.

Of course if you make such a bold statement, you better be prepared to show that it can be demonstrated to hold experimentally.

Overall I think the authors would have been well advised to follow the more careful wording as employed by Spreeuw.
DarkLordKelvin
not rated yet May 28, 2015
Entanglement is not something that can't be simulated or even emulated by classical means. It is pretty straightforward to design a probabilistic framework of 2^N p-bits which will cover the state space of N q-bits. The main problem is that the relative volume of the subspace corresponding to the quantum system in such an immersion diminishes faster than exponentially with N. Thus controlling more than a couple of simulated qubits is as difficult as controlling real ones. This is the spatial approach.


Well, since the "quantum" part of entanglement has to do with the statistics of measurements made on entangled states, I am having a hard time seeing how these classical models are capable of reproducing that phenomenon. I know little about quantum computing, so maybe these simulations have some utility in that field, but as models of quantum entanglement, they seem rather flawed. Superposition is one thing; that's just a general property of waves. Quantum entanglement is a deeper physical phenomenon.
qquax
not rated yet May 29, 2015
@DarkLordKelvin, pretty much any physicist "will have a hard time seeing how these classical models are capable of reproducing that phenomenon".

In the second paper linked, La Cour proposes his own hidden variable theory, a new Bell theorem loophole. Unless he can back this up with some actual experimental data, hardly anybody will take another look.

http://link.sprin...ext.html
Eikka
not rated yet May 29, 2015
Eikka, the study was transistor-based, but the subject is more general: emulated quantum computing. And you further broadened the subject by predicting the imminent demise of Moore's Law.


Then suggest another way to emulate it. Their claims were based on transistors - other means to create the functional analogs have to be evaluated on other terms.

Now you seem to assert that Moore's Law is only about lithographically-etched 2D silicon.


It is. Re-defining what "Moore's Law" means every few years makes it not a law at all but simply a buzzword.

Fortunately for us, engineers and researchers do not feel a need to constrain themselves to 2D silicon. And so Moore's Law is nowhere near dead.


A transistor is nevertheless limited by the size of individual atoms and molecules. 3D circuits with transistors suffer from heat dissipation problems that require reduced density, so they can't reach anywhere near the 1000-fold improvement.
Eikka
not rated yet May 29, 2015
The number of transistors on a chip has historically followed the trend of how many watts of heat can reasonably be removed from the chip.

The original formulation of Moore's law actually held up until about 1990, when the cost of adding more transistors dropped below the cost of their power dissipation, and CPUs went up from a couple of watts to first a couple dozen and then to 100 watts. After that, the number of transistors on a chip has been limited by our inability to cool them, and Moore's law was reformulated by Intel to mean something else.

Hence, when building 3D circuits in semiconductors, the number of transistors you can afford to include depends not on how small you can make them but on how far away from the surface they are. The 2D chip is the best form in terms of heat dissipation.

If you made a CPU the size of a sugar cube with a thousand layers of 4 billion transistors each, it would draw kilowatts of power and melt instantly.
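
The arithmetic behind that claim, using the ~100 W per chip figure from the earlier comment:

```python
layers = 1000            # hypothetical stacked layers in the "sugar cube"
watts_per_layer = 100.0  # each layer dissipating like a modern ~100 W CPU
print(layers * watts_per_layer / 1000.0, "kW")   # 100.0 kW through one cube
```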
Urgelt
not rated yet May 29, 2015
Eikka wrote, "...the number of transistors you can afford to include depends not on how small you can make them but on how far away from the surface they are. The 2D chip is the best form in terms of heat dissipation."

You're thinking inside the box. Cooling is a challenge, but there will be ways to achieve it. Take optical transistors as an example - waste heat is orders of magnitude less than with electrons traveling through silicon semiconductors.

"Re-defining what "Moore's Law" means every few years makes it not a law at all but simply a buzzword."

Moore never intended for his 'Law' to be regarded as anything but a rule of thumb for planning purposes - a prediction about the rate at which computers would double in speed and capability. His prediction has held fairly true for far longer than even he expected - and it's still working, roughly.

No one claims that Moore's Law is a law of nature. Proving that it isn't does not win you points.
