Researcher finds optimal fix-free codes

Apr 03, 2009
Dr. Serap Savari

(PhysOrg.com) -- More than 50 years after David Huffman developed Huffman coding, an entropy encoding algorithm used for lossless data compression in computer science and information theory, an electrical and computer engineering faculty member at Texas A&M University has discovered a way to construct the most efficient fix-free codes.

Huffman coding uses a variable-length code table to choose the representation for each symbol, resulting in a prefix code (that is, the bit string representing some particular symbol is never a prefix of the bit string representing any other symbol) that expresses the most common characters using shorter strings of bits than are used for less common source symbols. Huffman's design is the most efficient compression method of this type, since no other mapping of individual source symbols to unique strings of bits will produce a smaller average output size when the actual symbol frequencies agree with those used to create the code.
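
To make the merging idea concrete, here is a minimal sketch of Huffman's construction in Python, using made-up symbol frequencies for illustration; it is not connected to Savari's method.

```python
# Minimal Huffman coding sketch: repeatedly merge the two least frequent
# groups, prepending a bit to every codeword in each merged group.
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix code from a symbol -> frequency mapping."""
    # Heap entries: (total frequency, tie-breaker, {symbol: partial codeword}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

freqs = Counter("this is an example of huffman coding")
code = huffman_code(dict(freqs))
# Common symbols (such as the space) receive shorter codewords than rare ones.
for sym, word in sorted(code.items(), key=lambda kv: len(kv[1])):
    print(repr(sym), word)
```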

Dr. Serap Savari, an associate professor in the Department of Electrical and Computer Engineering at Texas A&M, has developed the first approach to finding optimal fix-free codes, variable-length codes in which no codeword is a prefix or a suffix of another codeword.
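
As a small illustration of the definition, the fix-free property can be checked by verifying prefix-freeness of the codewords and of their reversals. The codebooks below are hypothetical examples, not the optimal codes the article describes.

```python
# Sketch of the fix-free property: no codeword is a prefix or a suffix
# of any other codeword.
def is_prefix_free(codewords):
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

def is_fix_free(codewords):
    """Fix-free: prefix-free, and suffix-free (prefix-free after reversal)."""
    return (is_prefix_free(codewords) and
            is_prefix_free([w[::-1] for w in codewords]))

print(is_fix_free(["00", "11", "010", "101"]))  # True: decodable from either end
print(is_fix_free(["0", "01", "11"]))           # False: "0" is a prefix of "01"
```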

“My method of finding optimal fix-free codes is computationally demanding, but no one has solved the problem before even though it was first posed in 1990,” Savari said. “Earlier algorithms produced good fix-free codes in a reasonably (time) efficient way, but without the guarantee of optimality.”

While there are numerous applications for fix-free codes, the most important have been in communications. Fix-free codes have been investigated for joint source-channel coding and have been applied within the video standards H.263+ and MPEG-4, because the ability to decode them efficiently in both the forward and backward directions aids error resilience. They are also of interest for problems in information retrieval, such as searching for patterns directly in compressed text. Savari is uncertain how her discovery will affect these and other applications of fix-free codes, but hopes that her work will be used by researchers and by people implementing practical systems.
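
The bidirectional-decoding property mentioned above can be sketched as follows; the codebook is hypothetical and chosen only to show that a fix-free bit stream parses greedily from either end.

```python
# Greedy decoding of a fix-free code from both ends of the bit stream.
CODE = {"a": "00", "b": "11", "c": "010", "d": "101"}
REV = {w: s for s, w in CODE.items()}

def decode_forward(bits):
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in REV:            # prefix-freeness makes this match unambiguous
            out.append(REV[buf]); buf = ""
    return "".join(out)

def decode_backward(bits):
    out, buf = [], ""
    for b in reversed(bits):
        buf = b + buf
        if buf in REV:            # suffix-freeness makes this match unambiguous
            out.append(REV[buf]); buf = ""
    return "".join(reversed(out))

bits = "".join(CODE[s] for s in "cabd")
print(decode_forward(bits), decode_backward(bits))  # both print: cabd cabd
```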

“My work is like Huffman’s in that it is basic research that is motivated by practically important problems and which contributes to the theory of data compression,” she said.

Savari has already been invited to discuss her findings at numerous seminars throughout the United States, including Stanford University, the University of Illinois at Urbana-Champaign, the University of California, Berkeley, the University of California, San Diego, Caltech, the University of Southern California and possibly MIT in the fall.

Provided by Texas A&M University
