(Phys.org)—Quantum computers are inherently different from their classical counterparts because they involve quantum phenomena, such as superposition and entanglement, which do not exist in classical digital computers. But in a new paper, physicists have shown that a classical analog computer can be used to emulate a quantum computer, along with quantum superposition and entanglement, with the result that the fully classical system behaves like a true quantum computer.
Physicist Brian La Cour and electrical engineer Granville Ott at Applied Research Laboratories, The University of Texas at Austin (ARL:UT), have published a paper on the classical emulation of a quantum computer in a recent issue of the New Journal of Physics. Besides being of fundamental interest, using classical systems to emulate quantum computers could have practical advantages, since such quantum emulation devices would be easier to build and more robust against decoherence than true quantum computers.
"We hope that this work removes some of the mystery and 'weirdness' associated with quantum computing by providing a concrete, classical analog," La Cour told Phys.org. "The insights gained should help develop exciting new technology in both classical analog computing and true quantum computing."
As La Cour and Ott explain, quantum computers have been simulated in the past using software on a classical computer, but these simulations are merely numerical representations of the quantum computer's operations. In contrast, emulating a quantum computer involves physically representing the qubit structure and displaying actual quantum behavior. One key quantum behavior that can be emulated, but not simulated, is parallelism. Parallelism allows for multiple operations on the data to be performed simultaneously—a trait that arises from quantum superposition and entanglement, and is what enables quantum computers to perform certain computations so quickly.
To emulate a quantum computer, the physicists' approach uses electronic signals to represent qubits, in which a qubit's state is encoded in the amplitudes and frequencies of the signals in a complex mathematical way. Although the scientists use electronic signals, they explain that any kind of signal, such as acoustic and electromagnetic waves, would also work.
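To make the idea concrete, here is a minimal sketch in Python of one such signal encoding. The carrier frequencies and the demodulation step are illustrative assumptions, not the authors' exact scheme:

import numpy as np

# Illustrative sketch: encode a qubit state alpha|0> + beta|1> as two
# superposed tones, one carrier frequency per basis state. The complex
# amplitudes ride on each tone's magnitude and phase. (The frequencies
# f0 and f1 are arbitrary choices, not values from the paper.)
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # the state (|0> + i|1>)/sqrt(2)
f0, f1 = 1e3, 2e3                               # hypothetical carriers (Hz)
t = np.linspace(0, 0.01, 10_000, endpoint=False)

signal = np.real(alpha * np.exp(2j * np.pi * f0 * t)
                 + beta * np.exp(2j * np.pi * f1 * t))

# The amplitudes can be recovered by demodulating at each carrier:
alpha_est = 2 * np.mean(signal * np.exp(-2j * np.pi * f0 * t))
beta_est = 2 * np.mean(signal * np.exp(-2j * np.pi * f1 * t))

Superposition here is literal: the two tones simply add, which is why any wave-like medium would serve equally well.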
Even though this classical system emulates quantum phenomena and behaves like a quantum computer, the scientists emphasize that it is still considered to be classical and not quantum.
"This is an important point," La Cour explained. "Superposition is a property of waves adding coherently, a phenomenon that is exhibited by many classical systems, including ours.
"Entanglement is a more subtle issue," he continued, describing entanglement as a "purely mathematical property of waves."
"Since our classical signals are described by the same mathematics as a true quantum system, they can exhibit these same properties."
He added that this kind of entanglement does not violate Bell's inequality, which is a widely used way to test for entanglement.
"Entanglement as a statistical phenomenon, as exhibited by such things as violations of Bell's inequality, is rather a different beast," La Cour explained. "We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of entanglement as well, as described in another recent publication."
In the current paper, La Cour and Ott describe how their system can be constructed using basic analog electronic components, noting that the biggest challenge is to fit a large number of these components on a single integrated circuit in order to represent as many qubits as possible. Considering that today's best semiconductor technology can fit more than a billion transistors on an integrated circuit, the scientists estimate that this transistor density corresponds to about 30 qubits. An increase in transistor density of a factor of 1000, which according to Moore's law may be achieved in the next 20 to 30 years, would correspond to 40 qubits.
This 40-qubit limit is also enforced by a second, more fundamental restriction, which arises from the bandwidth of the signal. The scientists estimate that a signal duration of a reasonable 10 seconds can accommodate 40 qubits; increasing the duration to 10 hours would only increase this to 50 qubits, and a one-year duration would only accommodate 60 qubits. Due to this scaling behavior, the physicists even calculated that a signal duration of the approximate age of the universe (13.77 billion years) could accommodate about 95 qubits, while that of the Planck time scale (10⁻⁴³ seconds) would correspond to 176 qubits.
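The scaling behind these estimates can be checked with a few lines of arithmetic. In a signal encoding, n qubits require 2^n distinguishable tones, so a signal of duration T and bandwidth B supports roughly n ≈ log₂(B·T) qubits. The sketch below back-solves B from the 10-second, 40-qubit figure and roughly reproduces the other numbers; this simple model is an illustrative reconstruction, not the paper's own calculation:

import math

# Back out the implied bandwidth from the 10 s -> 40 qubit figure,
# then apply n = log2(B * T) to the other durations.
B = 2**40 / 10.0                         # ~1.1e11 Hz (assumed, inferred)

def qubits(duration_s):
    return math.log2(B * duration_s)

print(qubits(10))                        # 40.0
print(qubits(10 * 3600))                 # ~51.8 (article: ~50 for 10 hours)
print(qubits(365.25 * 86400))            # ~61.6 (article: ~60 for one year)
print(qubits(13.77e9 * 365.25 * 86400))  # ~95.3 (article: ~95, age of universe)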
Considering that thousands of qubits are needed for some complex quantum computing tasks, such as certain encryption techniques, this scheme clearly faces some insurmountable limits. Nevertheless, the scientists note that 40 qubits is still sufficient for some low-qubit applications, such as quantum simulations. Because the quantum emulation device offers practical advantages over quantum computers and performance advantages over most classical computers, it could one day prove very useful. For now, the next step will be building the device.
"Efforts are currently underway to build a two-qubit prototype device capable of demonstrating entanglement," La Cour said. "The enclosed photo [see above] shows the current quantum emulation device as a lovely assortment of breadboarded electronics put together by one of my students, Mr. Michael Starkey. We are hoping to get future funding to support the development of an actual chip. Leveraging quantum parallelism, we believe that a coprocessor with as few as 10 qubits could rival the performance of a modern Intel Core at certain computational tasks. Fault tolerance is another important issue that we studying. Due to the similarities in mathematical structure, we believe the same quantum error correction algorithms used to make quantum computers fault tolerant could be used for our quantum emulation device as well."
More information:
Brian R. La Cour and Granville E. Ott. "Signal-based classical emulation of a universal quantum computer." New Journal of Physics. DOI: 10.1088/1367-2630/17/5/053017

Eikka
4.4 / 5 (7) May 27, 2015
Won't happen. The current technology node is at 14 nm, and Moore's law cannot undercut the size of individual atoms. A single-electron transistor is about 1.5 nm in size, so linear scaling can go only about 10 times further, and chip transistor density can therefore increase by a maximum of about 100 times.
In practice, you won't see a 100 times increase in transistor density because more transistors also require more interconnects, and the number of interconnects increases faster than the number of transistors because of the connectivity problem: if one transistor connects to two others, then the number of wires grows twice as fast as the number of transistors, and the wires take vastly more space than the transistors.
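[A quick check of the arithmetic in this comment, using Eikka's own rough numbers; the density gain is the square of the linear shrink:]

current_node_nm = 14.0       # current technology node
min_transistor_nm = 1.5      # claimed single-electron transistor size
linear = current_node_nm / min_transistor_nm   # ~9.3x linear shrink
density = linear ** 2                          # ~87x: area scales as length squared
print(f"linear: {linear:.1f}x, density: {density:.0f}x")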
El_Nose
5 / 5 (1) May 27, 2015
If we were to progress with transistors in their current state, then yes, you are correct. I can only offer that cutting-edge tech is looking toward three goals: 1) stacking layers of components vertically, so we would now have, say, towers of CPUs; 2) molecular transistors; 3) spintronics.
All offer greater density; it is yet to be determined whether we would see speed increases.
@javjav
Not really. If we had 1 nm gates for transistors, it would still be a transistor and be either on or off... spintronics and QCs can be both ON and OFF at the same time. This is only possible in spintronics or a QC -- and QCs provide a method of using it for calculations. Very different, sir.
Urgelt
5 / 5 (1) May 27, 2015
Progress is being made in optical computing, which also offers potential for sustaining Moore's Law by reducing power consumption and raising speeds of components. On top of that, mechanical hard discs are slowly being phased out in favor of steadily-improving solid state discs with accompanying speed and power gains. And we haven't yet done very much with memristors. We probably will. Toss in quantum computing gains - if they happen - and 3D chip designs, spintronics, high-temperature superconductor research, and materials research, all of which stand to produce gains, and it's pretty obvious to anyone who pays attention that Moore's Law still has a long, long way to go before it poops out.
Naysayers have been predicting the imminent demise of Moore's Law for over forty years. They were wrong. They're still wrong. Ignore them.
qquax
1 / 5 (1) May 28, 2015
This is just embarrassing. Bell's inequality loopholes have been discussed ever since Bell found this relation, which he didn't think would hold; he was the first to try to find loopholes, and the discussion, search, and subsequent experimental falsification have occupied physics ever since.
Yet, they think they can make the issue go away with a bit of hand waving and 'noise'. They are not only breathtakingly ignorant of the nature of entanglement but also of the entire research history around testing Bell's inequality. I guess everything is truly bigger in Texas, and that includes academic ignorance and ego. While on the other hand, degrees are apparently easy to come by.
Tried to find a CV of the one physicist cited in the article, but only found this:
http://texas-empl...-La-Cour
Sorry, buddy but you're not worth that salary, you should have known better.
http://www.scienc...pr98.pdf
del2
5 / 5 (1) May 28, 2015
No, they are in a superposition of ON and OFF states. That's not the same thing.
There is what Feynman calls an amplitude (related to probability) for ON and an amplitude for OFF. It is neither ON nor OFF. If you measure it, it will then be either ON or OFF (the probability for one of the states will become 1). In the Copenhagen Interpretation terminology, the wavefunction has collapsed.
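[In code, the standard textbook picture described in this comment looks like the following; this is a generic quantum-mechanics sketch, not anything specific to the emulator:]

import numpy as np

# A qubit in the state alpha|ON> + beta|OFF> yields ON with probability
# |alpha|^2; after the measurement the state is the observed outcome
# with probability 1 (the "collapse").
rng = np.random.default_rng()
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # equal superposition
outcome = "ON" if rng.random() < abs(alpha)**2 else "OFF"
state_after = (1, 0) if outcome == "ON" else (0, 1)   # collapsed amplitudes
print(outcome, state_after)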
Eikka
not rated yet May 28, 2015
1) Stacked chips have heat dissipation issues, and we're already building vertical transistors to make them smaller in area.
2) The smallest possible transistor is still about 1.5 nm and it works with a single electron
3) spintronics has the same size limitation too
And we're talking about analog circuits built on transistors here, so spintronics doesn't really apply since it's a take on digital information processing by using the spins of electrons to represent bits.
In analog circuits you also need resistances, capacitances and inductances, which take up tremendous amounts of chip area and dissipate heat.
Urgelt
not rated yet May 28, 2015
That stipulation is your only route to defending your prediction of Moore's Law's imminent demise. Fortunately for us, engineers and researchers do not feel a need to constrain themselves to 2D silicon. And so Moore's Law is nowhere near dead.
gralp
not rated yet May 28, 2015
Alternatively, one can simulate quantum states as temporal signals, and this was undertaken by the authors of this paper. Waves can be superposed, hence one can design "entangled" states as well - no problem. It has been emphasized many times in the past that Bell's inequality is a mathematical, not physical, statement; therefore, if one is able to violate it, then one simply violates its assumptions. The authors are certainly ready to point out which one is not satisfied.
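[As an illustration of this point: a two-qubit register needs four basis tones, and a Bell-like state (|00> + |11>)/√2 is then just two superposed sinusoids, even though it cannot be factored into independent per-qubit signals. The frequency assignments below are hypothetical, not from the paper:]

import numpy as np

# Four basis tones for the two-qubit basis states |00>, |01>, |10>, |11>.
f = {'00': 1e3, '01': 2e3, '10': 3e3, '11': 4e3}   # hypothetical carriers (Hz)
t = np.linspace(0, 0.01, 10_000, endpoint=False)

# The "entangled" state (|00> + |11>)/sqrt(2): only two tones present,
# and no pair of single-qubit signals multiplies out to give it.
bell_signal = (np.cos(2 * np.pi * f['00'] * t)
               + np.cos(2 * np.pi * f['11'] * t)) / np.sqrt(2)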
qquax
5 / 5 (1) May 28, 2015
http://link.sprin...ext.html
So he's definitely worth the salary he draws, and the outlandish sounding comment about 'just adding noise' to engineer a violation of Bell's inequalities is not just hand waving.
Of course if you make such a bold statement, you better be prepared to show that it can be demonstrated to hold experimentally.
Overall I think the authors would have been well advised to follow the more careful wording as employed by Spreeuw.
DarkLordKelvin
not rated yet May 28, 2015
Well, since the "quantum" part of entanglement has to do with the statistics of measurements made on entangled states, I am having a hard time seeing how these classical models are capable of reproducing that phenomenon. I know little about quantum computing, so maybe these simulations have some utility in that field, but as models of quantum entanglement, they seem rather flawed. Superposition is one thing; that's just a general property of waves. Quantum entanglement is a deeper physical phenomenon.
qquax
not rated yet May 29, 2015
In the second paper linked, La Cour proposes his own hidden variable theory, a new Bell theorem loophole. Unless he can back this up with some actual experimental data, hardly anybody will take another look.
http://link.sprin...ext.html
Eikka
not rated yet May 29, 2015
Then suggest another way to emulate it. Their claims were based on transistors - other means of creating the functional analogs have to be evaluated on their own terms.
It is. Re-defining what "Moore's Law" means every few years makes it not a law at all but simply a buzzword.
A transistor is nevertheless limited to the size of individual atoms and molecules. 3D circuits with transistors suffer from heat dissipation problems that require reduced density, so they can't reach anywhere near the 1000-fold improvement.
Eikka
not rated yet May 29, 2015
The original formulation of Moore's law actually held up until about 1990, when the cost of adding more transistors became less than the cost of their power dissipation, and CPUs went up from a couple of watts to first a couple dozen and then to 100 watts. After that, the number of transistors on a chip has been limited by our inability to cool them, and Moore's law was reformulated by Intel to mean something else.
Hence, when building 3D circuits in semiconductors, the number of transistors you can afford to include depends not on how small you can make them but on how far away from the surface they are. The 2D chip is the best form in terms of heat dissipation.
If you made a CPU the size of a sugar cube with a thousand layers of 4 billion transistors each, it would draw kilowatts of power and melt instantly.
Urgelt
not rated yet May 29, 2015
You're thinking inside the box. Cooling is a challenge, but there will be ways to achieve it. Take optical transistors as an example - waste heat is orders of magnitude less than with electrons traveling through silicon semiconductors.
"Re-defining what "Moore's Law" means every few years makes it not a law at all but simply a buzzword."
Moore never intended for his 'Law' to be regarded as anything but a rule of thumb for planning purposes - a prediction about the rate at which computers would double in speed and capability. His prediction has held fairly true for far longer than even he expected - and it's still working, roughly.
No-one claims that Moore's Law is a law of nature. Proving that it isn't does not win you points.