Most people don't think twice about how Internet search engines work. You type in a word or phrase, hit enter, and poof – a list of web pages pops up, organized by relevance.

Behind the scenes, a lot of math goes into figuring out exactly which web pages are most relevant to your search. Google, for example, uses a page-ranking algorithm that is rumored to be the largest numerical calculation carried out anywhere in the world. With the web constantly expanding, researchers at USC have proposed, and demonstrated the feasibility of, using quantum computers to speed up that process.
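The page-ranking calculation the article refers to is, at its core, a power iteration over the web's link structure. The sketch below is a minimal, illustrative version of classical PageRank; the toy four-page link matrix and the function itself are assumptions for demonstration, not Google's actual implementation.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Rank pages by power iteration.
    adj[i, j] = 1 if page j links to page i."""
    n = adj.shape[0]
    # Each page splits its "vote" evenly among its out-links,
    # so normalize every column by its out-degree.
    out = adj.sum(axis=0)
    out[out == 0] = 1  # guard against dangling pages
    m = adj / out
    rank = np.full(n, 1 / n)
    while True:
        # With probability `damping`, follow a link;
        # otherwise jump to a random page.
        new = (1 - damping) / n + damping * m @ rank
        if np.abs(new - rank).sum() < tol:
            return new
        rank = new

# Toy 4-page web in which page 3 is linked to by every other page.
adj = np.array([[0, 0, 0, 1],
                [1, 0, 0, 0],
                [1, 1, 0, 0],
                [1, 1, 1, 0]], dtype=float)
ranks = pagerank(adj)
```

The most heavily linked page (page 3) ends up with the highest rank, and the ranks sum to one, forming a probability distribution over pages. Doing this for billions of pages daily is what makes the computation so expensive.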

"This work is about trying to speed up the way we search on the web," said Daniel Lidar, corresponding author of a paper on the research that appeared in the journal *Physical Review Letters* on June 4.

As the Internet continues to grow, the time and resources needed to run the calculation – which is done daily – grow with it, Lidar said.

Lidar, who holds appointments at the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences, worked with colleagues Paolo Zanardi of USC Dornsife and first author Silvano Garnerone, formerly a postdoctoral researcher at USC and now of the University of Waterloo, to see whether quantum computing could be used to run the Google algorithm faster.

As opposed to traditional computer bits, which encode either a one or a zero, quantum computers use quantum bits, or "qubits," which can encode a one and a zero at the same time. This property, called superposition, will one day allow quantum computers to perform certain calculations much faster than traditional computers.
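Superposition can be pictured numerically: a qubit's state is a unit vector of two amplitudes, and a Hadamard gate turns a definite zero into an equal mix of zero and one. A minimal sketch (the variable names are illustrative):

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector of two amplitudes.
zero = np.array([1.0, 0.0])          # the state |0>
hadamard = np.array([[1, 1],
                     [1, -1]]) / np.sqrt(2)

qubit = hadamard @ zero              # equal superposition of |0> and |1>
probs = qubit ** 2                   # measurement probabilities
```

Measuring this qubit yields 0 or 1 with equal probability; with n qubits, a state can hold amplitudes for all 2^n bit strings at once, which is what the researchers hope to exploit.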

Currently, there isn't a quantum computer anywhere in the world large enough to run Google's page-ranking algorithm for the entire web. To simulate how a quantum computer might perform, the researchers generated models of the web consisting of a few thousand pages.

The simulation showed that a quantum computer could, in principle, return the ranking of the most important pages in the web faster than traditional computers, and that this quantum speedup would improve as the number of pages to be ranked grew. Further, the researchers showed that to simply determine whether the web's page rankings should be updated, a quantum computer could spit out a yes-or-no answer exponentially faster than a traditional computer.

**Explore further:**
Breakthrough in quantum computing: Resisting 'quantum bug'

## Eikka

It takes some exotic quantum memory before it becomes practically sensible to employ a quantum processor instead of just many parallel conventional processors, or else a very difficult question, one that justifies the amount of time you need to upload all the data to the machine.

## Terriva

For example, for a quantum computer working with 96% reliability, to match a 64-bit classical computer you would need to repeat the computation about 10^9 times to achieve a result of the same precision. Such precision could be achieved by parallelizing the results of 10^9 atoms serving as qubits. But that level of parallelization is already achieved in classical transistors, which are constructed from about 10^9 atoms each.

For me, quantum computers are just hype, the main purpose of which is to provide employment for the researchers involved. But can they really beat classical computers in computational power? IMO they cannot, because this power is limited by the same laws for both classical and quantum computers.

## Eikka

What do you mean by "reliability," and what does it have to do with the word length of the instruction set?

## Terriva

## Eikka

Relative to what? A 64-bit computer can calculate things to arbitrary precision. A 1-bit computer can calculate things to arbitrary precision, although the algorithm gets a bit difficult to implement.

What is this error you're talking about?

The point of the quantum processor is that it doesn't need to iterate things, because it takes all the data, mashes it into one big quantum superposition, and then tries all alternatives at the same time.

## sigfpe

Are you aware of the quantum threshold theorem? http://en.wikiped..._theorem

Eikka,

It is not known how you can use a superposition of N states to get an N-times speedup like you would by running N cores in parallel. For example, Grover's algorithm completes its search in O(sqrt(N)) time, much slower than you'd expect from a machine "trying all alternatives in parallel."
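The sqrt(N) scaling sigfpe mentions can be checked in a small classical simulation of Grover's algorithm: the oracle flips the sign of the marked item's amplitude, and the diffusion step reflects all amplitudes about their mean. This is a toy statevector sketch, not real quantum hardware.

```python
import numpy as np

def grover_search(n_items, marked):
    """Simulate Grover's algorithm on n_items entries and return
    the probability of measuring the marked index."""
    # Start in the uniform superposition over all items.
    state = np.full(n_items, 1 / np.sqrt(n_items))
    # The optimal iteration count is about (pi/4) * sqrt(N),
    # not the single step a "try everything at once" machine would need.
    k = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
    for _ in range(k):
        state[marked] *= -1                  # oracle: flip marked amplitude
        state = 2 * state.mean() - state     # diffusion: reflect about mean
    return state[marked] ** 2

# For N = 16 items, only ~3 iterations are needed (vs ~8 classical probes
# on average), and the marked item is found with high probability.
prob = grover_search(16, marked=5)
```

After roughly pi/4 * sqrt(16) = 3 iterations the marked item dominates the measurement statistics, which illustrates a real but merely quadratic speedup.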

## javjav

This is not relevant. Does it matter if a web search list is 4% less accurate? That is a smaller error than you get with standard algorithms.

The memory issue seems more complex. But quantum bits could be atom-sized or even subatomic, and they need not dissipate any heat (nearly by definition; otherwise the information is lost). Once you solve how to create them at commercial cost, you can increase density millions of times over classical chips.

## Eikka

Don't forget that you have to contain your subatomic-sized qubits somehow. The whole thing falls on its head if you have to isolate each particle in a vacuum flask the size of a fridge.