(Phys.org)—László Babai, of the Department of Computer Science and the Department of Mathematics at the University of Chicago, has caused a lot of excitement in the computer science community by recently announcing during a lecture that he has developed a new algorithm for the graph isomorphism problem. Though the problem is not as well known, the result is comparable to someone solving the famous traveling salesman problem.

The new algorithm, if it passes muster with other researchers, is not likely to cause any changes that people outside computer science will notice, but it is nevertheless an impressive achievement; many in the field had thought it was impossible.

Computer science at its root is very simple: get a computer to perform a calculation we want done. Developing an algorithm that sorted records in a database, for example, was a big deal many years ago because it meant humans no longer had to do it by hand. Unfortunately, it has proven exceptionally difficult to get a computer to perform some other calculations efficiently, such as telling a salesman the best route through a set of cities. For a computer to do it means grinding through all the options, which is fine for simple routes with few nodes; add hundreds, thousands or millions, though, and the computer will be at it longer than the person waiting for the answer will live. It would be better if someone could figure out a way to tell the computer how to do it in a smarter, more efficient way, and that is what lies at the heart of computer science.
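To make the "grinding through all the options" concrete, here is a minimal brute-force sketch of the traveling salesman search in Python. The distance matrix is made up for illustration; the point is that the search examines (n−1)! tours, which is why it becomes hopeless as cities are added.

```python
from itertools import permutations

def brute_force_tsp(dist):
    """Find the shortest round trip by trying every tour.
    dist[i][j] is the distance between cities i and j."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    # Fix city 0 as the start; that leaves (n-1)! orderings to try.
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# 4 cities means only 3! = 6 tours; 20 cities would already mean
# 19! (about 1.2e17) tours, far beyond practical reach.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
best_len, best_tour = brute_force_tsp(dist)  # best_len == 18
```

The factorial growth in the loop is exactly the wall the article describes; smarter algorithms succeed by avoiding this exhaustive enumeration.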

Over time, computer scientists have come up with a way to separate the easy or simple types of algorithmic problems from those that are hard, or even thought to be intractable. Those that are relatively easy are denoted P, because they can be solved in polynomial time. Harder problems fall into the class NP, for nondeterministic polynomial time. The hardest of these are referred to as NP-complete.

The graph isomorphism problem has been labeled as NP, though some have suggested it should be NP-complete. It asks for an algorithm that can look at two networks (with nodes and links between them) and report whether they are structurally the same. Again, this is pretty simple if the networks are small, but adding more nodes makes it increasingly difficult to churn through the possibilities. Solving the problem means developing an algorithm that finds an answer in a smart way, rather than just having a computer grind through all the options, and that is the type of algorithm Babai believes he has created. There is no automated way to verify such a claim, so it will have to be vetted by other experts in the field before it can be labeled a success.
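For illustration only, here is what the naive, non-smart approach to graph isomorphism looks like: try every possible relabeling of one graph's nodes onto the other's. This is emphatically not Babai's algorithm, just a sketch of the n! brute force his result improves upon; the adjacency-set representation is an assumption made for brevity.

```python
from itertools import permutations

def are_isomorphic(g, h):
    """Brute-force graph isomorphism check.
    g and h map each node to the set of its neighbors,
    e.g. {0: {1}, 1: {0, 2}, 2: {1}} is the path 0-1-2."""
    if len(g) != len(h):
        return False
    nodes_g = sorted(g)
    # Try every one-to-one relabeling of g's nodes onto h's: n! candidates.
    for perm in permutations(sorted(h)):
        mapping = dict(zip(nodes_g, perm))
        if all(
            (v in g[u]) == (mapping[v] in h[mapping[u]])
            for u in g for v in g
        ):
            return True
    return False

path = {0: {1}, 1: {0, 2}, 2: {1}}          # path: 0-1-2
star = {0: {1, 2}, 1: {0}, 2: {0}}          # path relabeled: 1-0-2
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
are_isomorphic(path, star)      # True: same shape, different labels
are_isomorphic(path, triangle)  # False: triangle has a cycle
```

A smart algorithm must decide the same question without enumerating all n! relabelings, which is what makes the quasipolynomial bound remarkable.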

**Explore further:**
Simon's algorithm run on quantum computer for the first time—faster than on standard computer

**More information:**
people.cs.uchicago.edu/~laci/quasipoly.html

## thefurlong

The hard problems actually belong to NP-hard. These are problems that every problem in NP can be reduced to in polynomial time. Conceptually, every problem in NP is at least as hard as every problem in NP-hard.

A very special class of NP-hard problems is the NP-complete problems. These also have the property that every instance can be converted to some problem in NP in polynomial time, which means that if you know an efficient algorithm for solving an NP-complete problem, you have an efficient algorithm for solving every NP problem.

## antialias_physorg

http://math.stack...c-graphs

This - if it turns out to be true - is potentially huge, as you can use this to see e.g. if a complex theory maps onto a simpler one (i.e. whether two theories are equivalent or not). A more concrete application would be to see whether a complex encryption scheme is vulnerable to a simple 'shortcut'.

Since machine learning is a big thing now, it could be used to find simpler neural network topologies from trained systems...potentially speeding up processing by quite a lot.

## Torbjorn_Larsson_OM

Babai's result, which was rumored before the press release, is an improvement in speed over existing bounds.

Yes, you can always claim the problem is "impossible" if you increase its size. But as the illustration shows, it is solvable, and indeed solved, for a great many cases.

[tbctd]

## Torbjorn_Larsson_OM

"Earlier today, I was tipped off to what might be the theoretical computer science result of the decade. ...

... the legendary Laszlo Babai will be giving a talk about a new algorithm that solves the graph isomorphism problem in quasipolynomial time. The previous fastest algorithm to decide whether two n-vertex graphs G and H are isomorphic—by Babai and Luks, back in 1983—ran in exp(√(n log n)) time. If we credit the announcement, Babai has now gotten that down to exp(polylog(n)), putting one of the central problems of computer science "just barely above P." (For years, I've answered questions on this blog about the status of graph isomorphism—would I bet that it's in BQP? in coNP? etc.—by saying that, as far as I and many others are concerned, it might as well just be in P. Of course I'm happy to reaffirm that conjecture tonight.)"

[ http://www.scotta.../?p=2521 ]

[tbctd]

## Torbjorn_Larsson_OM

"Babai: this is a purely theoretical enterprise. Brendan McKay has a very fast algorithm. This will not challenge theirs ...

Babai just gave the sketch proof of the algorithm-- which will probably not be faster than any of the existing algorithms in practice ...

After the talk, I spoke w Babai. Why is this result impractical? Because existing algorithms are already very fast in practical cases. ...

One of the challenges is that there aren't even any known examples where the existing heuristics perform slowly. ..."

[ https://storify.c...ism-talk ]

## ogg_ogg

Did you really mean to write:"Conceptually, every problem in NP is at least as hard as every problem in NP-hard." ???

Just say it ain't so, Joe.

## thefurlong

Haha. Thanks for catching my typo. I meant that every problem in NP is AT MOST as hard as every problem in NP-hard.