New largest number factored on a quantum device is 56,153

November 28, 2014 by Lisa Zyga
This table shows all of the progress until now in factoring numbers using quantum computers. The last three numbers are from the current paper. Credit: Dattani and Bryans

(Phys.org)—Researchers have set a new record for the quantum factorization of the largest number to date, 56,153, smashing the previous record of 143 that was set in 2012. They have shown that the exact same room-temperature nuclear magnetic resonance (NMR) experiment used to factor 143 can actually factor an entire class of numbers, although this was not known until now. Because this computation, which is based on a minimization algorithm involving 4 qubits, does not require prior knowledge of the answer, it outperforms all implementations of Shor's algorithm to date, which do require prior knowledge of the answer. Expanding on this method, the researchers also theoretically show how the same minimization algorithm can be used to factor even larger numbers, such as 291,311, with only 6 qubits.

On top of this, in the same paper the researchers demonstrated the first quantum factorization of a "triprime," which is the product of three prime numbers. Here, the researchers used a 3-qubit factorization method to factor the triprime 175, which has the factors 5, 5, and 7.

Nike Dattani at Kyoto University and Oxford University, along with Nathaniel Bryans at Microsoft (now at the University of Calgary), have described their results in a recent paper posted at arXiv.org.

As the researchers explain, the minimization algorithm used to achieve these results has been steadily improving since its introduction in 2001, especially in comparison with Shor's algorithm, which was introduced in 1994. The largest number factored by Shor's algorithm to date is only 21, and even this factorization relied on prior knowledge of the answer to the problem being solved in the first place.

The minimization algorithm differs from Shor's algorithm in that it turns the factorization problem into an optimization problem, then uses a quantum device to solve for the minimum values, which encode the factors. (The minimum values are not themselves the factors. For example, if one factor has the binary form 1, p2, p1, 1 and the minimum values found are p1 = 0 and p2 = 1, then that factor is 1101 in binary, which is 13.)
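As a minimal sketch of how those minimum values map back to a factor (the bit layout follows the 143 = 11 x 13 setup described in the text; the helper's name is my own):

```python
# For 143 = 11 x 13, both factors are odd 4-bit numbers, so each has the
# binary form (1 p2 p1 1): only the two middle bits are unknown, and those
# are what the minimization solves for.

def factor_from_bits(p1: int, p2: int) -> int:
    """Assemble the factor whose bits are 1, p2, p1, 1 (most significant first)."""
    return int(f"1{p2}{p1}1", 2)

# The minimum values from the text, p1 = 0 and p2 = 1, give 1101 in binary:
print(factor_from_bits(p1=0, p2=1))  # 13
```

Plugging in the complementary assignment, p1 = 1 and p2 = 0, gives 1011 in binary, which is 11, the other factor.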

This is the method by which Xu et al. factored the number 143 (11 x 13) in 2012. It's also how Dattani and Bryans factored several other numbers in the new paper, the largest of which is 56,153 (241 x 233).

"We're still a far way from outperforming classical computers," Dattani told Phys.org. "The highest RSA number factored on a classical computer was RSA-768, which has 768 bits, and took two years to compute (from 2007 to 2009)."

RSA numbers are a set of large "semiprimes"—numbers with exactly two prime factors. RSA numbers are particularly special due to the difficulty in factoring them. For this reason, they are used by governments, militaries, and banks to keep financial information secure.

"56,153 only has 16 bits," Dattani explained. "However, that's twice the number of bits in the largest number factored using Shor's algorithm to date; and while factoring 56,153 via minimization only required 4 qubits, factoring 21 with Shor's algorithm requires 10."

Although the minimization algorithm is a true quantum method, the equations can also be quickly and easily solved by a classical computer because they contain only four variables, and therefore solving them involves only 2^4 = 16 queries.
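To make that 16-query count concrete, here is a sketch of the classical brute force for 143, assuming the same variable layout as above (both factors odd 4-bit numbers, so only the middle two bits of each are unknown):

```python
from itertools import product

# Brute-force the 4 unknown bits (p1, p2, q1, q2) for N = 143.
# With p = (1 p2 p1 1) and q = (1 q2 q1 1) in binary, there are only
# 2**4 = 16 combinations to try.
N = 143
solutions = []
for p1, p2, q1, q2 in product((0, 1), repeat=4):  # all 16 queries
    p = int(f"1{p2}{p1}1", 2)
    q = int(f"1{q2}{q1}1", 2)
    if p * q == N:
        solutions.append((p, q))

print(solutions)  # the factor pair 11 x 13, found both ways round
```

Sixteen queries is trivial classically; the point of the quantum approach is what happens when the number of unknown bits grows, as the next paragraph describes.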

In order for quantum computers to provide real, practical advantages over classical computers, the equations must have many more than four variables. For example, a case in which the equations have 512 variables, which is the number of qubits in the D-Wave Two quantum computer, would require 2^512 queries. Using a brute force "guess and check" strategy, a classical computer that makes a trillion queries per second would take 10^123 times the age of the universe to factor such a number.

This issue raises the question of how to tell which large numbers will reduce to a set of equations with a large number of variables, as these are the numbers that would benefit from quantum treatment. Here, Dattani and Bryans have noticed certain patterns that may help identify these numbers when they are semiprime.

Specifically, they found that the semiprimes that would gain the most benefit from the power of a quantum computer have factors that differ at a large number of bit positions (for example, 110101 and 101010 differ at five of six bit positions). The factors of 291,311 (557 x 523) differ at a large number of bit positions, and the researchers demonstrated how to factor this number using the minimization technique with 6 qubits. (Unlike the factorization of 56,153, the factorization of 291,311 has not yet been experimentally implemented, so it doesn't count as a new record yet.)
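The bit-position comparison in the example above is just the Hamming distance between the two numbers; a minimal sketch (the function name is my own):

```python
def differing_bits(a: int, b: int) -> int:
    """Count the bit positions at which a and b differ (Hamming distance)."""
    # XOR leaves a 1 exactly where the two numbers disagree.
    return bin(a ^ b).count("1")

# The example from the text: 110101 and 101010 differ at five of six positions.
print(differing_bits(0b110101, 0b101010))  # 5
```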

Until now, semiprimes have been the only numbers for which quantum factorization has been successfully demonstrated. But Dattani and Bryans address this issue as well, and demonstrate the quantum factorization of the triprime 175 with three qubits (which also has not been experimentally implemented).

"Shor's algorithm can in theory factor large numbers with fewer operations than the number of classical operations required for factoring on a classical computer," Dattani said. "However, performing quantum circuit operations is mighty hard, so it's not clear which type of computer would win the race to factor, for example, RSA-896. The alternative to Shor's algorithm, that we discussed, is still a quantum computation, but it does not rely on quantum circuit operations. We want to work out whether this circuitless quantum computation can factor something that a classical computer cannot do."


More information: Nikesh S. Dattani and Nathaniel Bryans. "Quantum factorization of 56153 with only 4 qubits." arXiv:1411.6758 [quant-ph]



8 comments


justindadswell
1 / 5 (5) Nov 28, 2014
How long until they make quad-trillion core cpu's?

Seriously though, what on Earth are we going to use all this upcoming cpu power for?

If the theory of a universe from nothing pans out, then theoretically the information for an entire universe could be simplified into something a qubit computer can run. How odd would that be, a computer so complex that it contains a copy of everything(locations of everything minute wave structure in the universe), plus infinite copies of that.
But then that leads to the, we are a computer simulation argument vs infinite universes vs something in between argument and I am getting way off subject.
RM07
5 / 5 (4) Nov 28, 2014

Seriously though, what on Earth are we going to use all this upcoming cpu power for?

For most people, anything more than 5 years out is pure science fiction, let alone things that might be 25, 50, or 100 years away.

Processing power will always be needed, not only to do things that are currently impossible, but also to do currently possible things faster.
Mike_Massen
2.6 / 5 (5) Nov 29, 2014
Especially for justindadswell & other like minded esoteric philosophers ;-)

http://www.physic...5/LQ.pdf

Note:
From microbiology studies & being cognisant of permutation space (10^60)! [where !=factorial].

One has to wonder why there are such an astronomically immense number of solar systems across billions of galaxies of which our own is also vast, ie multiple environments (planets) subject to quantum uncertainty with power sources (suns etc) separated by vacuum with hard radiation all capped off with a built-in speed limit so colonies cannot easily cross-contaminate etc...

If the universe was a creation of some deity (see pdf) it would seem to lead to the conclusion it doesnt know what is going to happen [by any means] & thus if the deity is a scientist of sorts then we are all being studied & likely rather closely too - there are about 3 other definitive conclusions but, these are the subject of my current work, under pen name of course !
Phil DePayne
5 / 5 (1) Nov 29, 2014
Just think about this. If you had enough computing power to break the RSA scheme you would most likely become a very wealthy person.
Whydening Gyre
5 / 5 (3) Nov 29, 2014
Especially for justindadswell & other like minded esoteric philosophers ;-)
One has to wonder why there are such an astronomically immense number of solar systems across billions of galaxies of which our own is also vast, ie multiple environments (planets) subject to quantum uncertainty with power sources (suns etc) separated by vacuum with hard radiation all capped off with a built-in speed limit so colonies cannot easily cross-contaminate etc...

If the universe was a creation of some deity (see pdf) it would seem to lead to the conclusion it doesnt know what is going to happen [by any means] & thus if the deity is a scientist of sorts then we are all being studied & likely rather closely too - there are about 3 other definitive conclusions but, these are the subject of my current work, under pen name of course !

Mike - you're gonna make my head explode! :-)
dan42day
1 / 5 (1) Nov 29, 2014
Seriously though, what on Earth are we going to use all this upcoming cpu power for?


So that Microsoft can continue to make their products clunkier and more bloated.
MRBlizzard
5 / 5 (7) Nov 29, 2014
1. Protein folding.
2. In Silico, material/alloy behavior calculations from first principles. Saves decades of messy work.
3. Weather control.
4. String theory manifold calculations
vpoko
5 / 5 (5) Nov 30, 2014
How long until they make quad-trillion core cpu's?

Seriously though, what on Earth are we going to use all this upcoming cpu power for?

It's really not about quad-trillion core CPU's, which would take an impossible number of transistors and still not be able to solve problems that grow exponentially with the problem size. It's about a new way of solving a limited number of problems (factoring, discrete logs, simulating quantum mechanics, and a few others) that were classically intractable and would remain that way no matter how fast your classical computer got or how many parallel cores you had.
