Quantum computers could help search engines keep up with the Internet's growth

Jun 12, 2012

Most people don't think twice about how Internet search engines work. You type in a word or phrase, hit enter, and poof – a list of web pages pops up, organized by relevance.

Behind the scenes, a lot of math goes into figuring out exactly what qualifies as the most relevant page for your search. Google, for example, uses a page ranking algorithm that is rumored to be the largest numerical calculation carried out anywhere in the world. With the web constantly expanding, researchers at USC have proposed – and demonstrated the feasibility of – using quantum computers to speed up that process.

"This work is about trying to speed up the way we search on the web," said Daniel Lidar, corresponding author of a paper on the research that appeared in the journal Physical Review Letters on June 4.

As the Internet continues to grow, the time and resources needed to run the calculation – which is done daily – grow with it, Lidar said.

Lidar, who holds appointments at the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences, worked with colleagues Paolo Zanardi of USC Dornsife and first author Silvano Garnerone, formerly a postdoctoral researcher at USC and now of the University of Waterloo, to see whether quantum computing could be used to run the Google algorithm faster.

Unlike traditional computer bits, which encode either a one or a zero, quantum computers use quantum bits, or "qubits," which can encode a one and a zero at the same time. This property, called superposition, may some day allow quantum computers to perform certain calculations much faster than traditional computers.
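Superposition can be illustrated numerically: a qubit's state is a pair of amplitudes, and measurement probabilities are their squared magnitudes. The following is a minimal Python sketch of that idea; the variable names are ours, not the article's.

```python
import numpy as np

# A qubit's state: two amplitudes, one each for the |0> and |1> outcomes.
zero = np.array([1.0, 0.0])

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
plus = H @ zero

# On measurement, each outcome's probability is its squared amplitude.
probs = np.abs(plus) ** 2
print(probs)   # [0.5 0.5]: "a one and a zero at the same time"
```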

Currently, there isn't a quantum computer in the world anywhere near large enough to run Google's page ranking algorithm for the entire web. To simulate how a quantum computer might perform, the researchers generated models of the web that simulated a few thousand web pages.

The simulation showed that a quantum computer could, in principle, return the ranking of the most important pages in the web faster than traditional computers, and that this quantum speedup would improve the more pages needed to be ranked. Further, the researchers showed that to simply determine whether the web's page rankings should be updated, a quantum computer would be able to spit out a yes-or-no answer exponentially faster than a traditional computer.
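To make the classical baseline concrete, here is a minimal sketch of the power-iteration method that underlies PageRank-style ranking. The four-page link graph, damping factor, and function name are illustrative assumptions, not details from the USC paper.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Rank pages by power iteration; adj[i][j] = 1 if page i links to page j.

    Illustrative sketch only, not the algorithm from the paper.
    """
    A = np.asarray(adj, dtype=float)
    n = len(A)
    A[A.sum(axis=1) == 0] = 1.0                 # dangling pages link everywhere
    M = (A / A.sum(axis=1, keepdims=True)).T    # column-stochastic transitions
    rank = np.full(n, 1.0 / n)
    while True:
        new = (1 - damping) / n + damping * M @ rank
        if np.abs(new - rank).sum() < tol:
            return new
        rank = new

# Toy "web": page 0 links to 1 and 2; 1 links to 2; 2 links to 0; 3 links to 2.
toy = [[0, 1, 1, 0],
       [0, 0, 1, 0],
       [1, 0, 0, 0],
       [0, 0, 1, 0]]
ranks = pagerank(toy)
print(ranks)   # page 2, the most linked-to, ranks highest
```

Each iteration costs roughly one matrix-vector multiply over the whole web, which is why the daily calculation grows with the Internet.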

User comments : 8

5 / 5 (1) Jun 12, 2012
The bottleneck isn't in the processor making the calculation, but in the memory system accessing all the data. A quantum computer faces the same bottleneck, as it has to fetch the information from conventional memory every time it re-loads the qubits for a new search query, because the superposition state is destroyed every time the result is read out.

It would take some exotic quantum memory before it becomes practical to employ a quantum processor instead of many parallel conventional processors, or else a question so difficult that it justifies the time needed to upload all the data to the machine.
1 / 5 (2) Jun 12, 2012
The results of quantum computers are approximate due to quantum noise. For example, Grover's algorithm has been reproduced with 96% reliability. A von Neumann computer couldn't work with that level of noise at all.

For example, for a quantum computer working at 96% reliability to match a 64-bit classical computer, you would need to repeat the computation about 10^9 times to achieve a result of the same precision. Such precision could be achieved by parallelizing the results of 10^9 atoms in the role of qubits. But that level of parallelization is already achieved in classical transistors, which are constructed from about 10^9 atoms each.

For me, quantum computers are just hype, whose main purpose is to provide employment for the researchers involved. Can they really beat classical computers in computational power? IMO they cannot, because the same physical laws limit the power of classical and quantum computers alike.
5 / 5 (1) Jun 12, 2012
For example, for quantum computer working with reliability 96% like the 64-bit classical computer

What do you mean by "reliability", and what does it have to do with the word length of the instruction set?
not rated yet Jun 12, 2012
I meant the precision, or relative error, of the result. 64-bit computers are very exact (1/2^64 is the error of their calculations). You would need to average the result of a quantum computer many times to achieve the same error, and after that the quantum computer would lose much of its blazing speed. Therefore, quantum computers may be suitable only for applications where a large error doesn't pose a problem and doesn't accumulate over iterations.
not rated yet Jun 12, 2012
I meant the precision or relative error of the result. 64-bit computers are very exact (1/2^64 is the error of their calculations).

Relative to what? A 64-bit computer can calculate things to arbitrary precision. A 1-bit computer can calculate things to arbitrary precision, although the algorithm gets a bit difficult to implement.

What is this error you're talking about?

The point of the quantum processor is that it doesn't need to iterate things, because it takes all the data, mashes it into one big quantum superposition, and then tries all alternatives at the same time.
not rated yet Jun 12, 2012

Are you aware of the quantum threshold theorem? http://en.wikiped..._theorem


It is not known how you can use a superposition of N states to get an N-times speedup like you would by running N cores in parallel. For example, Grover's algorithm completes its search in sqrt(N) time, much slower than you'd expect from a machine "trying all alternatives in parallel".
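The sqrt(N) scaling mentioned above can be illustrated: the standard estimate for the optimal number of Grover iterations is roughly (pi/4)*sqrt(N), which grows far more slowly than the N checks a naive "try everything in parallel" picture would suggest. A small Python sketch (the helper name is ours):

```python
import math

def grover_iterations(n):
    """Approximate optimal Grover iteration count for n items: (pi/4) * sqrt(n)."""
    return math.floor(math.pi / 4 * math.sqrt(n))

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} items -> {grover_iterations(n)} iterations")
# 100x more items costs only ~10x more iterations: a quadratic, not linear, speedup.
```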
not rated yet Jun 12, 2012
For example, for quantum computer working with reliability 96% like the 64-bit classical computer

This is not relevant. Does it matter if a web search list lacks 4% accuracy? This is a smaller error than using standard algorithms.

The memory issue seems more complex. But quantum bits could be atom-sized or even subatomic, and need not dissipate any heat (nearly by definition; otherwise the information is lost). Once you solve how to create them at commercial cost, you could increase density millions of times compared with classical chips.
not rated yet Jun 17, 2012
Once you solve how to create them at commercial costs, you can increase density by millions of times in comparison with classical chips.

Don't forget you have to contain your subatomic-sized qubits somehow. The whole scheme falls apart if you have to isolate each particle in a vacuum flask the size of a fridge.