Quantum computers could greatly accelerate machine learning

March 30, 2015, by Lisa Zyga
Exploiting entanglement to classify high-dimensional vectors for quantum accelerated machine learning. Credit: Chao-Yang Lu and Xin-Dong Cai

(Phys.org)—For the first time, physicists have performed machine learning on a photonic quantum computer, demonstrating that quantum computers may be able to exponentially speed up certain machine learning tasks—in some cases, reducing the time from hundreds of thousands of years to mere seconds. The new method takes advantage of quantum entanglement, in which two or more objects are so strongly correlated that seemingly paradoxical effects arise, because a measurement on one object instantaneously affects the other. Here, quantum entanglement provides a very fast way to classify vectors into one of two categories, a task that is at the core of machine learning.

The physicists, Chao-Yang Lu, Nai-Le Liu, Li Li and colleagues at the University of Science and Technology of China in Hefei, have published a paper on the entanglement-based machine learning method in a recent issue of Physical Review Letters.

As the researchers explain, machine learning has many different uses in everyday life. One example is a spam filter that sorts email into spam and nonspam messages by comparing the incoming email with old email labeled by the user. This is an example of supervised machine learning, as the system is provided with a set of examples. In unsupervised machine learning, the system does not receive prior information. An example of unsupervised machine learning is photo editing software that attempts to classify pixels into two groups: the object and the background.

For both supervised and unsupervised machine learning, the new items to be classified (e.g., emails, pixels, etc.) are represented by vectors. The system assigns each vector to one of two categories by computing the distance between the new vector and a reference vector in each category, and then assigning the new vector to the category whose reference vector is closer.
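Concretely, the classification step amounts to a nearest-reference-vector rule. Here is a minimal classical sketch in Python (the vectors and category names are illustrative, not taken from the paper):

```python
import numpy as np

def classify(new_vec, ref_a, ref_b):
    """Assign new_vec to whichever category's reference vector is closer."""
    dist_a = np.linalg.norm(new_vec - ref_a)  # distance to category A's reference
    dist_b = np.linalg.norm(new_vec - ref_b)  # distance to category B's reference
    return "A" if dist_a <= dist_b else "B"

# A 2-D example: the new vector lies closer to reference A.
print(classify(np.array([0.9, 0.2]),
               np.array([1.0, 0.0]),    # reference vector for category A
               np.array([0.0, 1.0])))   # reference vector for category B -> "A"
```

On a classical machine, each distance computation costs time proportional to the vector's dimension; the quantum approach described below targets exactly this bottleneck.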

Classifying a small number of vectors in this way can be done very quickly. However, as the amount of data in the world rapidly increases, so does the time required for machines to process it. Researchers expect that this "big data" problem could one day pose a challenge even for the fastest supercomputers.

Experimental setup for quantum machine learning with photonic qubits. Credit: X.-D. Cai, et al. ©2015 American Physical Society

In the new paper, the physicists addressed this challenge by representing vectors with quantum states, and then entangling the states before estimating the distance between them. As they explain, quantum computers are naturally good at manipulating vectors, so they used a small-scale quantum computer whose qubits are encoded in single photons.

In the optical setup, one photon acts as an ancillary qubit, and is used to entangle another photon that encodes both the reference vector and the new vector. The resulting two-photon entangled state is used to classify two-dimensional vectors, whereas three- and four-photon entangled states can classify four- and eight-dimensional vectors, respectively. In general, different vector dimensions are needed to describe the properties of different real-world objects.
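To see how a distance can come out of an entangled state, consider the ancilla scheme proposed by Lloyd and co-workers that this kind of experiment builds on: for unit vectors, the probability of finding the ancilla in the state |−⟩ = (|0⟩ − |1⟩)/√2 encodes the squared distance between the two encoded vectors. The numpy simulation below is a simplified sketch, not the photonic setup itself; it assumes unit-norm real vectors (a real quantum register would also need the dimension padded to a power of two), and the function name is our own:

```python
import numpy as np

def distance_via_ancilla(u, v):
    """Estimate |u - v| from the ancilla measurement statistics.

    Simulates the state (|0>|u> + |1>|v>)/sqrt(2): projecting the
    ancilla onto |-> leaves amplitude (u - v)/2 on the data register,
    so the outcome probability p satisfies |u - v|^2 = 4p.
    """
    u = np.asarray(u) / np.linalg.norm(u)  # amplitude encoding needs unit norm
    v = np.asarray(v) / np.linalg.norm(v)
    psi = np.concatenate([u, v]) / np.sqrt(2)  # ancilla |0> block, then |1> block
    minus = (psi[: len(u)] - psi[len(u):]) / np.sqrt(2)  # <-|psi> on the data register
    p = np.dot(minus, minus)                   # probability of the |-> outcome
    return np.sqrt(4 * p)

u, v = [1.0, 0.0], [0.6, 0.8]
print(distance_via_ancilla(u, v))  # 0.894..., matches np.linalg.norm of (u - v)
```

The appeal of the physical version is that, in principle, this probability can be read out to fixed precision with a number of measurements that does not grow with the vector dimension, whereas a classical computer must touch every component.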

As the scientists explained, the quantum method can potentially accelerate machine learning far beyond what today's computers are capable of. Lu gave an example to illustrate just how powerful the exponential speed-up is:

"To calculate the distance between two large vectors with a dimension of 1021 (or, in the language of Big Data, we can call it 1 Zettabyte (ZB)), a GHz clock-rate classical computer will take about hundreds of thousands of years," Lu told Phys.org. "A GHz clock-rate quantum computer, if we can build it in the future, with the exponential speed-up, will take only about a second to estimate the distance between these two vectors after they are entangled with the ancillary qubit."

This quantum-based vector classification method can be used for both supervised and unsupervised machine learning, and so could have a wide variety of applications. Machine learning is used in fields such as computer science, financial analysis, robotics, and bioinformatics. It's also ubiquitous in our daily lives, such as in face recognition, email filtering, and recommendation systems for online shopping.

"Machine learning has been all around, and will likely play a more important role in the age of Big Data with the explosion of electronic data," Lu said. "It is estimated that every year it grows exponentially by 40%. On the other hand, we have bad news about Moore's law: If it is to continue, then in about 2020, the chip size will shrink down to the atomic level where quantum mechanics rules. Thus, the speed-up of classical computation power faces a major challenge. Today, we may still be good running machine learning and other computational tasks with our good old classical computers, but we might need to think of other ways in the long run."

In the future, the researchers hope to scale the method up to larger numbers of qubits. They explain that higher-dimensional quantum states can be encoded using a photon's orbital angular momentum degree of freedom, or by using other properties.

"We are working on controlling an increasingly large number of quantum bits for more powerful quantum ," Lu said. "By controlling multiple degrees of freedom of a single photon, we aim to generate 6-photon, 18-qubit entanglement in the near future. Using semiconductor quantum dots, we are trying to build a solid-state platform for approximately 20-photon entanglement in about five years. With the enhanced ability in quantum control, we will perform more complicated quantum artificial intelligence tasks."

More information: X.-D. Cai et al., "Entanglement-Based Machine Learning on a Quantum Computer," Physical Review Letters 114, 110504 (2015). DOI: 10.1103/PhysRevLett.114.110504. Also at arXiv:1409.7770 [quant-ph]

9 comments

qquax
not rated yet Mar 30, 2015
While this is a nice illustration of the computational power of entanglement, the real prize is a universal quantum computer that can perform transparent (i.e., no black box) Bayesian learning.
You can already invest in a start-up that holds the patents on this https://angel.co/...e-qb-net
vic1248
5 / 5 (2) Mar 30, 2015
Well, Moore's Law was due to expire in 2015, but it is forecast to continue through Intel's 7nm microchip technology up to the year 2020, at which point microchip sizes cannot be reduced and densities cannot be increased any further under the current 1945 von Neumann computer architecture model. Quantum computers cannot be a reality by the year 2020, so a new computer architecture model will be needed to replace the then-obsolete 1945 von Neumann model.

Intel is extending Moore's Law through its 10nm, 7nm, ..., microchip lithography by using what's called compound III-V semiconductors, where "indium arsenide" (for n-channels) and "indium gallium antimonide" (for p-channels) are going to be grown as the top active layer over a silicon substrate instead of silicon over silicon, since silicon's electrical capabilities have been tapped for the most part. That III-V single crystal layer has superior electrical properties and power efficiency over silicon at smaller sizes.
Dethe
1 / 5 (2) Mar 30, 2015
The computational power of classical computers is limited by the same uncertainty principle as the power of quantum ones. Quantum computers are fast but fuzzy: to achieve the 32/64-bit precision of classical computers, you would have to repeat their calculations a billion times, which would wipe out their advantage in speed. This has already been proven for quantum communication throughput, and quantum processing is just a subset of quantum communication. Quantum computers are a similar hype pushed by theorists, like string theory. Their common problem isn't that they cannot work at all, but that they cannot work better than existing approaches.
El_Nose
not rated yet Mar 30, 2015
@qquax

There is no such thing as "transparent (i.e., no black box) Bayesian learning."

The reasoning is this: if it is 'learning' and what you want is decision making, one can always go back and look at the coefficients of each node in the decision tree... and this tells us absolutely nothing. Whether it's a neural network or an AI heuristic, all you are doing is looking at a model that seems to work and asking what the coefficients are for each node in the model. This gives you no insight into the model itself... it is still black.

What you are really describing is an algorithm that solves a problem... and most languages don't allow algos to rewrite themselves... that is where learning comes in... if we separate the evaluation process and the update loop from the instructions (a true black box, different from above where you knew the coefficients), then you create something useful.

El_Nose
5 / 5 (1) Mar 30, 2015
so a tenth year polysentience AI can be a priceless jewel or a psychotic wreck

@vick

While I totally agree that we will move beyond silicon, I still believe optical computers will be the true next phase of computing... and this will lead into quantum computing.

Optical computers will be available to the general consumer. By optical I mean not only optical interconnects between fabricated chips or parts of the board, but optical switches and total reliance on light as the transmission medium, replacing the electron and conventional wires with waveguides.

This will hopefully reduce heat issues.
vic1248
5 / 5 (1) Mar 30, 2015
The problem with optics is that you cannot control light in the ways needed for computing. You cannot make light turn corners, for example, without losing its energy, and hence losing data. That research, while in full swing, is far from coming to fruition by 2020.

And yes, heat is a major issue in current silicon-transistor integrated circuit (IC) microchips, and it is contributing to the obsolescence of Moore's Law and current chip technologies.
qquax
not rated yet Mar 30, 2015
El_Nose, fair enough, but I'd maintain that a B-Net is still more readable than a NN. E.g., if you train a B-Net to learn stock and econometric index correlations to use for forecasting, then this is to me much more transparent than training a NN with the same data.
Relatrix
not rated yet Mar 31, 2015
Just another example of an analog quantum computing prototype that will certainly not scale to the level required to perform machine learning.
Ubaidullah Khan
not rated yet Apr 08, 2015
I already knew about quantum computers, but I am amazed by the idea of using them to change the world of machine learning. I love the article because it clearly helps me in doing my research in machine learning. Thank you.
