Explained: The Shannon limit

Jan 19, 2010 by Larry Hardesty

It's the early 1980s, and you’re an equipment manufacturer for the fledgling personal-computer market. For years, modems that send data over the telephone lines have been stuck at a maximum rate of 9.6 kilobits per second: if you try to increase the rate, an intolerable number of errors creeps into the data.

Then a group of engineers demonstrates that newly devised error-correcting codes can boost a modem’s transmission rate by 25 percent. You scent a business opportunity. Are there codes that can drive the data rate even higher? If so, how much higher? And what are those codes?

In fact, by the early 1980s, the answers to the first two questions were more than 30 years old. They’d been supplied in 1948 by Claude Shannon SM ’37, PhD ’40 in a groundbreaking paper that essentially created the discipline of information theory. “People who know Shannon’s work throughout science think it’s just one of the most brilliant things they’ve ever seen,” says David Forney, an adjunct professor in MIT’s Laboratory for Information and Decision Systems.

Shannon, who taught at MIT from 1956 until his retirement in 1978, showed that any communications channel — a telephone line, a radio band, a fiber-optic cable — could be characterized by two factors: bandwidth and noise. Bandwidth is the range of electronic, optical or electromagnetic frequencies that can be used to transmit a signal; noise is anything that can disturb that signal.

Given a channel with particular bandwidth and noise characteristics, Shannon showed how to calculate the maximum rate at which data can be sent over it with an arbitrarily small chance of error. He called that rate the channel capacity, but today, it’s just as often called the Shannon limit.
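
For a channel of bandwidth B (in hertz) and signal-to-noise power ratio S/N, that capacity works out to C = B log2(1 + S/N) bits per second. Here is a minimal Python sketch of the calculation, assuming illustrative phone-line figures of about 3 kilohertz of bandwidth and a 30-decibel signal-to-noise ratio (these numbers are assumptions for the example, not values from Shannon's paper):

    import math

    def shannon_capacity(bandwidth_hz, snr_db):
        # Shannon-Hartley limit: C = B * log2(1 + S/N)
        snr = 10 ** (snr_db / 10)  # convert decibels to a linear power ratio
        return bandwidth_hz * math.log2(1 + snr)

    # Assumed, illustrative figures for an analog telephone line:
    print(round(shannon_capacity(3000, 30)))  # -> 29902 bits per second

A limit of roughly 30 kilobits per second squares with the experience of the modem makers above: rates could be pushed well past 9.6 kilobits per second, but not indefinitely.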

In a noisy channel, the only way to push the error rate toward zero is to add some redundancy to a transmission. For instance, if you were trying to transmit a message with only three bits, like 001, you could send it three times: 001001001. If an error crept in, and the receiver received 001011001 instead, she could be reasonably sure that the correct string was 001.
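
That three-copy scheme is easy to sketch in code (a minimal illustration, not any specific real-world code): repeat the whole message, then take a majority vote at each bit position on the receiving end.

    def encode(message, copies=3):
        # Send the whole message several times: '001' -> '001001001'
        return message * copies

    def decode(received, copies=3):
        # Split the received string back into its copies and take a
        # majority vote at each bit position.
        n = len(received) // copies
        chunks = [received[i * n:(i + 1) * n] for i in range(copies)]
        return ''.join(
            max('01', key=lambda bit: sum(c[i] == bit for c in chunks))
            for i in range(n)
        )

    print(decode('001011001'))  # the corrupted middle copy is outvoted: '001'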

Any such method of adding extra information to a message so that errors can be corrected is referred to as an error-correcting code. The noisier the channel, the more information you need to add to compensate for errors. As codes get longer, however, the transmission rate goes down: you need more bits to send the same fundamental message. So the ideal code would minimize the number of extra bits while maximizing the chance of correcting errors.

By that standard, sending a message three times is actually a terrible code. It cuts the data rate by two-thirds, since it requires three times as many bits per message, but it’s still very vulnerable to error: two errors in the right places (the same bit position corrupted in two of the three copies) would make the original message unrecoverable.

But Shannon knew that better error-correcting codes were possible. In fact, he was able to prove that for any communications channel, there must be an error-correcting code that enables transmissions to approach the Shannon limit.

His proof, however, didn’t explain how to construct such a code. Instead, it relied on probabilities. Say you want to send a single four-bit message over a noisy channel. There are 16 possible four-bit messages. Shannon’s proof would assign each of them its own randomly selected code — basically, its own serial number.

Consider the case in which the channel is noisy enough that a four-bit message requires an eight-bit code. The receiver, like the sender, would have a codebook that correlates the 16 possible four-bit messages with 16 eight-bit codes. Since there are 256 possible sequences of eight bits, there are 240 that don’t appear in the codebook. If the receiver receives one of those 240 sequences, she knows that an error has crept into the data. But of the 16 permitted codes, there’s likely to be only one that best fits the received sequence — one that differs, say, in only a single bit.
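
That codebook scheme can be sketched in a few lines of Python (an illustration of Shannon's random-coding argument, not a practical decoder; the fixed seed and the single flipped bit are arbitrary choices for the example):

    import random

    random.seed(0)  # fix the randomness so the codebook is reproducible

    messages = [format(m, '04b') for m in range(16)]
    eight_bit = [format(c, '08b') for c in range(256)]
    # Assign each four-bit message its own randomly selected eight-bit code.
    codebook = dict(zip(messages, random.sample(eight_bit, 16)))

    def hamming(a, b):
        # Number of positions at which two bit strings differ.
        return sum(x != y for x, y in zip(a, b))

    def decode(received):
        # Of the 16 permitted codes, pick the one that best fits
        # the received sequence.
        return min(messages, key=lambda m: hamming(codebook[m], received))

    sent = codebook['0110']
    corrupted = ('1' if sent[0] == '0' else '0') + sent[1:]  # flip one bit
    # Usually recovers '0110'; with codes this short an occasional miss is
    # possible, which is exactly why Shannon's argument needs long codes.
    print(decode(corrupted))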

Shannon showed that, statistically, if you consider all possible assignments of random codes to messages, there must be at least one that approaches the Shannon limit. The longer the code, the closer you can get: eight-bit codes for four-bit messages wouldn’t actually get you very close, but two-thousand-bit codes for thousand-bit messages could.

Of course, the coding scheme Shannon described is totally impractical: a codebook with a separate, randomly assigned code for every possible thousand-bit message wouldn’t begin to fit on all the hard drives of all the computers in the world. But Shannon’s proof held out the tantalizing possibility that, since capacity-approaching codes must exist, there might be a more efficient way to find them.

The quest for such a code lasted until the 1990s. But that’s only because the best-performing code that we now know of, which was invented at MIT, was ignored for more than 30 years. That, however, is a story for the next installment of Explained.

Second part: Explained: Gallager codes.


User comments (5)


Jarek
not rated yet Jan 19, 2010
There is a new error-correction method very near the Shannon limit: it is finally linear (not exponential, as usual), works at any noise level (not only low noise, as usual), allows the amount of added redundancy to be adjusted continuously (not only in simple fractional ratios, as usual), and additionally allows the data to be compressed and encrypted at the same time.
Here is a simulator of its processing and a link to the paper:
http://demonstrat...onTrees/
antialias_physorg
not rated yet Jan 19, 2010
The best-performing code is a function of your noise characteristics.

Some channels have probabilistic noise (e.g. the occasional bit flips, with each flip randomly distributed).

Some channels have noise that occurs in blocks (e.g. when lightning strikes nearby, a whole block of data is corrupted).

It's also a function of how probable the individual bit sequences in your message are (i.e. it may be advantageous to encode frequently used signals in shorter bit sequences, at the cost of making very seldom-occurring signals very long).

Shannon's papers are extremely interesting. I would recommend anyone who is into information encoding/transmission (or encrypting/decrypting) to read them.

Jarek
not rated yet Jan 20, 2010
The theoretical limit is one thing; the real question is how to get near it in practice, without the exponential correction time it is supposed to require.
OK, I understand the '30 years' now - I think they meant from 1990, when Gallager's LDPC codes were reinvented by MacKay. That near-Shannon-limit approach usually requires quadratic time for placing the parity bits and exponential time for the correction process, and so can be used only at very low noise levels.

The strength of the approach in the link is that verification during the correction process is done locally - we no longer need to work on the whole message simultaneously, as in sudoku-like LDPC correction, but expand a 'correction tree' bit by bit with immediate probabilistic verification, so the tree has finite expected width and the correction process takes linear time at any noise level.

About Shannon's ideas for cryptography in the age of quantum computers:
http://www.ecrypt...php?1,10
ThomasS
not rated yet Jan 20, 2010
The amazing thing that Shannon proved (his noisy-channel coding theorem) is that you can get an *arbitrarily* small error probability on a noisy channel without the data rate dropping to zero. A highly counterintuitive result.
Jarek
not rated yet Jan 20, 2010
Why counterintuitive?
When you send such an 'uncertain' bit - say a '1' that with probability 1-p really denotes '1' and with probability p was intended to be '0' - the information about which of these cases holds amounts to
h(p) = -p lg(p) - (1-p) lg(1-p) bits,
so such an 'uncertain' bit carries 1 - h(p) real bits of information - exactly the Shannon limit.
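
A quick numerical check of that formula (a minimal sketch; the sample values of p are arbitrary):

    import math

    def binary_entropy(p):
        # h(p) = -p*lg(p) - (1-p)*lg(1-p)
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Each 'uncertain' bit carries 1 - h(p) real bits of information.
    for p in (0.0, 0.01, 0.11, 0.5):
        print(p, round(1 - binary_entropy(p), 3))
    # p = 0.5 yields 0.0: a coin-flip bit carries no information at all.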
