Rethinking networking

February 12, 2010, Massachusetts Institute of Technology

Today, data traveling over the Internet are much like crates of oranges traveling the interstates in the back of a truck. The data are loaded in at one end, unloaded at the other, and nothing much happens to them in between.

About 10 years ago, electrical engineers suggested that bundles of data could be transmitted over a network more efficiently if, instead of passing unaltered from one end to the other, they were scrambled together along the way and unscrambled at the end. In 2003, MIT electrical engineering professor Muriel Médard and her colleagues proved the counterintuitive result that, in many cases, the best way to scramble data together was to do it randomly.

This is the second article in a two-part series on MIT contributions to the fledgling field of network coding (part one is available here).

Last year, the paper in which the researchers presented their findings received an award from the Institute of Electrical and Electronics Engineers (IEEE), which recognized its seminal contributions to the field that’s come to be known as “network coding.” But the same year, the IEEE honored another network-coding paper, in which MIT researchers played the leading role: the paper that, in 2006, presented the first practical implementation of network coding.

The first article in this two-part series presented a simple example of how network coding increases efficiency. Suppose that two laptops are trying to send each other messages simultaneously over the Wi-Fi network in a coffee shop. Ordinarily, the first laptop would send its message to the coffee shop’s Wi-Fi router, which would forward it to the second laptop. The second laptop would send its message to the router, which would forward it to the first laptop. That’s four total transmissions.

But with network coding, the router would instead combine the two laptops’ messages into a hybrid message, which it would broadcast to both laptops, reducing the total number of transmissions to just three. Each laptop would then subtract the message it had sent from the hybrid to recover the message intended for it.
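The mixing step in this example can be sketched with a bitwise XOR, a minimal illustration under the assumption that the two messages are equal-length byte strings (the message contents below are invented for the demo):

```python
# Sketch of the coffee-shop example: the router XORs the two messages
# together and broadcasts one hybrid packet instead of forwarding two.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

msg_from_laptop1 = b"hello from laptop 1"
msg_from_laptop2 = b"hi there, laptop 1!"  # same length as the other message

# Transmissions 1 and 2: each laptop sends its message to the router.
# Transmission 3: the router broadcasts a single hybrid of the two.
hybrid = xor_bytes(msg_from_laptop1, msg_from_laptop2)

# Each laptop XORs out the message it sent to recover the one meant for it.
recovered_at_laptop1 = xor_bytes(hybrid, msg_from_laptop1)
recovered_at_laptop2 = xor_bytes(hybrid, msg_from_laptop2)

assert recovered_at_laptop1 == msg_from_laptop2
assert recovered_at_laptop2 == msg_from_laptop1
```

Because XOR is its own inverse, the hybrid packet is no larger than either original message, yet it carries useful information for both laptops at once.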

Stingy senders

Dina Katabi, an associate professor of electrical engineering and computer science; Médard; MIT graduate students Sachin Katti and Hariharan Rahul (Katti now teaches at Stanford; Rahul still works with Katabi at MIT); and Cambridge University’s Jon Crowcroft and Wenjun Hu generalized this approach to a network with many different transmitters and receivers.


The researchers blanketed two floors of the MIT Computer Science and Artificial Intelligence Lab with wireless routers programmed to execute their network coding scheme. They then used wireless devices to exchange test data over the network. Like all information sent over modern networks, the test data were broken up into smaller “packets” of information for transmission. And as is typical with most wireless networks, electromagnetic interference, physical obstacles, and sheer distance from the wireless devices meant that no one router received all the packets. Unlike conventional routers, however, the ones in the MIT network mixed the packets’ content before forwarding them; they even mixed bytes in packets that belonged to different communication streams.

In the MIT network, routers told their neighbor routers which packets they had heard over the wireless medium. A router would use that information when preparing mixed (or coded) packets to send to its neighbors. Consider a router that has four packets in its memory, A, B, C and D. It knows that one of its neighbors also has A and C; another neighbor has A and D; and a third neighbor has C and D. So it creates a single mixed packet that combines packets A, C and D. The first neighbor uses that packet to recover D; the second neighbor uses it to recover C; and the third neighbor uses it to recover A. A single packet thus conveys three times as much information as it would have in an ordinary wireless network. Interestingly, such mixed packets, which combine information about multiple original packets, have the same size as the original packets, and hence can be transmitted over the wireless medium using the same resources as if they were normal unmixed packets.
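The four-packet scenario above can be sketched directly, again using XOR as the mixing operation (the packet contents are invented placeholders, not data from the experiment):

```python
# Sketch of the neighbor-aware mixing step: the router XORs packets
# A, C and D into one coded packet. Each neighbor already holds two of
# the three, so each can XOR them back out to recover its missing packet.

def xor_bytes(*packets: bytes) -> bytes:
    """Bitwise XOR of any number of equal-length byte strings."""
    out = bytearray(len(packets[0]))
    for p in packets:
        for i, byte in enumerate(p):
            out[i] ^= byte
    return bytes(out)

# The router's four buffered packets (B sits out of this coded packet).
A, B, C, D = b"AAAA", b"BBBB", b"CCCC", b"DDDD"

coded = xor_bytes(A, C, D)  # one transmission, same size as any original

assert xor_bytes(coded, A, C) == D  # neighbor 1 holds A and C, recovers D
assert xor_bytes(coded, A, D) == C  # neighbor 2 holds A and D, recovers C
assert xor_bytes(coded, C, D) == A  # neighbor 3 holds C and D, recovers A
```

One broadcast thus delivers a different useful packet to each of three neighbors, which is where the threefold gain in the example comes from.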

In experiments, the coding scheme significantly increased the amount of data that the network could carry without requiring any more bandwidth. Although the precise figure varied according to the number of wireless devices accessing the network, which of them were sending information to which others, and the amount of interference they encountered, “in general,” Katabi says, “you can see, let’s say, threefold improvements.”

Chris Ramming, who as a program manager for the U.S. Defense Advanced Research Projects Agency (DARPA) funded several projects that investigated network coding, says that the two papers honored by the IEEE “are two very seminal papers, for completely different reasons.” According to Ramming, who’s now director of the Corporate External Research Office for chip manufacturer Intel, Katabi and Médard’s paper “was a practical result. It took a completely different kind of network coding and showed in a real wireless network on the MIT campus that it can make an important difference.”

Short gestation

Both Katabi and Médard have continued to work on network coding, sometimes apart, but frequently as collaborators. “We exactly complement each other,” says Katabi. “Muriel has the theoretical foundation, while I come to it from, ‘Let’s make it work, let’s make it practical. How can we actually make it work over an actual wireless channel?’ ”

Indeed, Katabi has found a way to make networks even more efficient by exploiting the physical characteristics of actual wireless channels. In the network described in the award-winning paper, wireless routers received electromagnetic signals, translated them into bits, mixed the bits together, and then translated the bits back into electromagnetic signals. But signals sent from different transmitters naturally blend together physically. That blending is usually a nuisance: the signals have to be separated before data can be extracted from them. But Katabi built another experimental wireless network in which routers don’t disentangle blended signals; they just amplify and forward them. It turns out that the network coding techniques that work at the level of the bit can be extended to reconstruct the original signals, without lots of intermediate translations.
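A toy numerical example can illustrate the idea behind this amplify-and-forward approach: colliding signals simply add in the air, and a receiver that knows one of them can subtract it out. This sketch assumes an ideal, noiseless channel; real systems must also contend with channel gains, noise, and timing offsets.

```python
# Toy illustration of signals blending physically: the router forwards
# the raw superposition, and each node subtracts the signal it sent.
# Sample values are invented; an ideal, noiseless channel is assumed.

signal_from_node1 = [0.3, -1.0, 0.7, 0.1]
signal_from_node2 = [-0.5, 0.4, 0.2, -0.9]

# The router hears the physical sum and forwards it without decoding.
blended = [s1 + s2 for s1, s2 in zip(signal_from_node1, signal_from_node2)]

# Node 1 knows what it transmitted, so it can recover node 2's signal.
recovered = [b - s1 for b, s1 in zip(blended, signal_from_node1)]

assert all(abs(r - s2) < 1e-9
           for r, s2 in zip(recovered, signal_from_node2))
```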

Médard, too, has been concerned with the practical implementation of network coding, if at a higher level of abstraction. The Internet is designed so that a computer or phone receiving information will generally acknowledge the arrival of data packets. If an expected acknowledgment never arrives, the sender knows that it has to resend the missing packet. But if, as with network coding, all the packets are mixed together, which packet does the receiver acknowledge? Médard and her colleagues have found a practical way to answer that question, a system wherein each hybrid packet identifies the source packets it combines. The receiver then keeps the sender apprised of the source packets represented in the hybrids it’s received.
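The bookkeeping described here can be sketched as follows. This is a hypothetical simplification, not Médard's actual protocol: each hybrid packet carries the IDs of the source packets mixed into it, and the receiver reports back which source packets it has seen so far.

```python
# Hypothetical sketch of acknowledgments for coded traffic: each hybrid
# packet lists the source packets it combines, and the receiver
# acknowledges source packets rather than individual transmissions.
# (Field names are illustrative, not taken from the actual system.)

from dataclasses import dataclass

@dataclass
class CodedPacket:
    source_ids: frozenset  # IDs of the original packets mixed in
    payload: bytes         # the mixed data itself (contents elided)

seen = set()

def receive(packet: CodedPacket) -> set:
    """Record which source packets this hybrid represents and return
    the acknowledgment set to report back to the sender."""
    seen.update(packet.source_ids)
    return set(seen)

ack = receive(CodedPacket(frozenset({1, 2}), b"..."))
ack = receive(CodedPacket(frozenset({2, 3}), b"..."))
assert ack == {1, 2, 3}  # sender learns sources 1-3 are represented so far
```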

Médard and her colleagues are also working on applications of network coding in streaming video, where different means of combining data can be tailored to the different image resolutions of receiving devices. And they’re considering cases where network coding can be applied to several different networks at once. If, for instance, your phone can connect to both Wi-Fi networks and the cellular network, then for large files, you might want to download data over both networks at once. But the cellular network could be significantly more expensive to use than a local Wi-Fi network, so you would want to minimize the amount of data transmitted over it. Network coding can help coordinate transmissions over the two networks, to maximize efficiency while minimizing cost.

Although a young field, network coding has already sparked corporate interest. Defense contractor BAE built the equipment for the DARPA projects that Ramming funded, and Microsoft used network coding in a file-sharing program called Avalanche. Médard says that she is currently working with France Telecom, the French technology company Technicolor (formerly Thomson SA) and the Chinese telecommunications company Huawei on network coding applications. Katabi adds that Cisco, a maker of networking equipment, and South Korean conglomerate Samsung have also expressed interest in the MIT researchers’ work. “Network coding is a very cool idea,” says Katabi. “There are so many things you can do with it.”






