UA engineer designs better error-correction code

Oct 25, 2010 By Pete Brown

(PhysOrg.com) -- One company has already licensed the technology from the UA, and patents are pending to meet growing computer-industry demand for the error-correction algorithm developed by Bane Vasić.

The inner workings of computer chips and hard drives are a mystery to most people – which is ironic, considering they work much like human brains.

Error-correcting codes have played a vital role during the last 50 years by ensuring that digital data keeps its integrity within computer communication and storage systems.

The codes programmed into these chips act just like our brains when we try to make sense of something unfamiliar. Like human brains, the chips search for true meanings by constantly looking for errors and correcting them.

"Error correction is present everywhere, in natural as well as in man-made systems," said professor Bane Vasić of the University of Arizona's department of electrical and computer engineering. He uses his Serbian accent, and the way some people occasionally misunderstand what he is saying, as an example.

"As I am talking, your brain corrects the errors caused by my accent," he said. "Our brains correct all kind of errors in speech by analyzing the context and meaning of sentences."

"The error correction-codes that we as engineers build in communications or memory chips are a kind of grammar that computers use to understand data and keep it meaningful."

Now, more than 50 years after the discovery of error-correction codes, Vasić has discovered a way to design error-correction decoders with superior performance.

At the heart of modern coding theory is an artificial-intelligence technique used to interpret certain error-correction codes. This key is called the "belief propagation algorithm."

The algorithm itself is relatively simple. Data bits are continuously deconstructed and reconstructed, and the algorithm does this at high speed without making many mistakes. This means, for example, that data is saved or transmitted quickly and that virtually no errors are introduced during deconstruction and reconstruction.
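
The article does not give the mathematics, but the flavor of belief propagation can be sketched in a few lines. The toy Python example below is an illustration only: the parity-check matrix H, the channel values, and the min_sum_decode helper are assumptions made for this sketch (using the common "min-sum" form of message passing on a small (7,4) code), not the codes or decoders discussed in the story.

```python
# Minimal min-sum belief-propagation sketch on a toy (7,4) parity-check code.
import numpy as np

# Parity-check matrix: each row is one parity check over a subset of bits.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def min_sum_decode(llr, H, max_iters=20):
    """Iteratively exchange messages between bit nodes and check nodes."""
    m, n = H.shape
    # Messages from bits to checks, initialised with the channel LLRs.
    bit_to_chk = H * llr
    for _ in range(max_iters):
        # Check-node update: sign product and minimum magnitude of the
        # *other* incoming messages on each check.
        chk_to_bit = np.zeros_like(bit_to_chk, dtype=float)
        for i in range(m):
            idx = np.nonzero(H[i])[0]
            for j in idx:
                others = [bit_to_chk[i, k] for k in idx if k != j]
                sign = np.prod(np.sign(others))
                chk_to_bit[i, j] = sign * min(abs(v) for v in others)
        # Bit-node update and tentative hard decision.
        total = llr + chk_to_bit.sum(axis=0)
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):      # every parity check satisfied
            return hard
        # Outgoing bit messages exclude each check's own contribution.
        bit_to_chk = H * (total - chk_to_bit)
    return hard                            # best guess if no convergence

# Example: all-zero codeword sent; one bit arrives looking like an error.
llr = np.array([2.5, 1.8, 3.0, -0.7, 2.2, 1.5, 2.8])  # negative = looks like 1
print(min_sum_decode(llr, H))              # recovers the all-zero codeword
```

Real decoders run this kind of message exchange over codes with thousands of bits, typically in parallel hardware, which is where the speed described above comes from.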

But the belief propagation algorithm does have its limitations. When the algorithm is acting on shorter correction codes, performance can abruptly drop through the floor. In fact, this loss of performance is known as the "error floor phenomenon." Vasić describes it as "arguably one of the most important problems in coding theory."

Vasić has discovered how to do error correction that is both simpler and better than belief propagation. "It is like solving Sudoku," Vasić said of his new algorithm.

Imagine that the cells in a Sudoku puzzle represent the transmitted bits that need to be reconstructed. The contents of some bits, or cells, are known, but some are blank. Worse, some are wrong.

"A neat property of this new is that there is no need for a brain or some central intelligence to solve the puzzle, because the cells solve the puzzle collectively," Vasić said. "No individual cell has a global knowledge about the solution, but collectively the cells find the solution by passing messages among one another."

This message passing is like small-town gossip, said Vasić. "Wrong cells are not good neighbors, and we spread virtual gossip to the neighbors of bad neighbors, so that cells learn who is good and bad in the neighborhood."

"Eventually, the bad neighbors see the error of their ways and become good. Everything is in harmony, and it has been achieved simply, without central command."

Vasić said the original belief was that the algorithms used to reconstruct data, or to solve the Sudoku, would have to come up with a solution that was very close to the finished puzzle in order to work. This would have made them too complex, said Vasić.

"We have demonstrated that this original belief is not the case," Vasić said. "In some cases our simple decoder outperforms belief propagation to the extent that errors are reduced by 90 percent."

Vasić described the development of these new algorithms as a hard problem that required years of research using new theoretical tools and knowledge from disciplines such as combinatorics, artificial intelligence, machine learning and statistical mechanics.

"I am lucky to have extremely talented students and collaborators," Vasić said. "This breakthrough is a result of joint work with my students, Shashi Kiran Chilappagari and Shiva Planjery, and my collaborator from France, professor David Declercq."

Vasić said his discovery "opens up a plethora of beautiful theoretical problems." The National Science Foundation agrees, and is funding his research to the tune of $675,000.

Vasić's lab is in the process of patenting these new decoders, which he plans to implement in silicon chips that will be offered to flash memory, optical communications and hard drive companies. "We already have at least three companies interested in this technology," Vasić said. "One has already licensed the technology from the University of Arizona."

"The application in fault-tolerant systems is also very intriguing," Vasić said. "We believe we can prove that these decoders will work well in high-radiation environments such as space exploration and satellite systems, as well as in systems built of unreliable components, such as nano-scale systems."


User comments: 7


Sanescience
5 / 5 (1) Oct 25, 2010
I have never seen ECC likened to AI before. I'm guessing this has nothing to do with CRC? Anyone know what this is about?
jamey
5 / 5 (1) Oct 25, 2010
Yeah - I'm finding this kinda odd, myself - I thought that the theoretical best error correcting codes had been identified, and proven mathematically - and any improvement on those would result in more than a few mathematicians' brains exploding. This sounds more like signal processing to extract signal from noise, than error correction.
DamienS
not rated yet Oct 25, 2010
Nice work, but I really do hate tortured analogies meant to explain things, but which instead create confusion!
Grizzled
5 / 5 (1) Oct 26, 2010
It's enough to see them talking about deconstructing and reconstructing a bit. Excuse me? Bits are indivisible. I think what they are talking about is data/signal, noise reduction, even pattern recognition (that's where AI comes into play at least marginally). But it's definitely not ECC in the traditional sense and definition.
ab3a
5 / 5 (1) Oct 26, 2010
ARGH! Would someone please stop making dumb talk and at least name the mathematics used to build this ECC? There are BCH codes, LDPC codes, and Turbo Codes, to name just a few of the more popular ones.

Where did this one get derived from? What kind is it?

I don't need a tutorial on ECC. I would like to know what this professor has discovered. Is that too much to ask here?
Quantum_Conundrum
5 / 5 (1) Oct 26, 2010
"I don't need a tutorial on ECC. I would like to know what this professor has discovered. Is that too much to ask here?"


Articles on this site often say as little as possible about the topic. They spend more time kissing ten people's behinds, and making bad analogies, rather than explaining the subject matter.
bugmenot23
not rated yet Oct 28, 2010
"Articles on this site often say as little as possible about the topic. They spend more time kissing ten people's behinds, and making bad analogies, rather than explaining the subject matter."

I found this site because of interesting headlines, but your analysis is spot on. Articles appear to be more about news aggregation than anything else.
