(PhysOrg.com) -- Working separately on two different types of technology, two groups have developed fiber optic links capable of delivering over 100 terabits of data per second, far beyond anything currently in commercial use. The first group figured out a way to pack more data into a single signal, while the second group created a fiber cable with multiple cores rather than the standard single core.
Both teams reported their results at the Optical Fiber Communications Conference, held last month in Los Angeles.
The first team, led by Dayou Qian of NEC, described how they were able to combine the data sent by 370 lasers into a single stream of pulses, which was then sent across 165 kilometers (about 102 miles) of fiber cable. This was made possible by giving the data from each laser its own slice of the infrared spectrum, with each channel employing differing polarizations, amplitudes and phases to encode the data to be sent. With this method, they were able to move data at 101.7 Tbit/s.
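As a rough check on those figures (this arithmetic is inferred from the totals reported above, not something NEC states), dividing the aggregate rate across the laser channels shows how much each laser has to carry:

```python
# Back-of-the-envelope check on the NEC figures quoted above.
# The per-laser rate is an inference from the totals, not a number from the team.
total_rate_tbps = 101.7   # aggregate throughput reported, in Tbit/s
num_lasers = 370          # laser channels multiplexed onto the single fiber

per_laser_gbps = total_rate_tbps * 1000 / num_lasers
print(f"Each laser carries roughly {per_laser_gbps:.0f} Gbit/s")  # about 275 Gbit/s
```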
Taking a far different, and perhaps simpler, tack, Jun Sakaguchi and his team from Japan's NICT described how they devised a means of creating a fiber cable composed of seven cores, each capable of carrying 15.6 Tbit/s, for a total throughput of 109 Tbit/s.
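The multicore arithmetic is simpler still; a quick sketch using only the per-core figure quoted above confirms the reported total:

```python
# Rough check on the NICT figures quoted above.
cores = 7                 # independent cores in the single fiber cable
per_core_tbps = 15.6      # throughput carried by each core, in Tbit/s

total_tbps = cores * per_core_tbps
print(f"Total throughput: {total_tbps:.1f} Tbit/s")  # 109.2 Tbit/s, reported as ~109 Tbit/s
```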
100 terabits per second is actually faster than anybody really needs right now; if you had such a link between you and your online video store, you could download three months of continuous viewing in about one second. It's even more capacity than is needed to carry all the data that traverses the backbone of the Internet; the busiest line, for example, between New York and Washington, D.C., only moves something like a few terabits per second. There is also the problem of turning both new technologies into cost-effective products, as each is complex in its own way. In the short term, it's likely that either or both will be deployed only over short, high-traffic distances, such as inside huge data centers like those run by Facebook, Amazon and Google.
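To see where the "three months of viewing in one second" figure comes from, here is a back-of-the-envelope estimate; the assumed streaming bitrate is ours, not the article's:

```python
# Back-of-the-envelope estimate behind the "three months of viewing" claim.
# The 5 Mbit/s HD streaming bitrate is an assumption, not a figure from the article.
link_rate_tbps = 100            # the new fiber links, in Tbit/s
stream_bitrate_mbps = 5         # assumed bitrate of an HD video stream
viewing_days = 90               # "three months of continuous viewing"

total_bits = viewing_days * 24 * 3600 * stream_bitrate_mbps * 1e6
seconds_to_download = total_bits / (link_rate_tbps * 1e12)
print(f"{total_bits / 1e12:.0f} Tbit of video, downloaded in {seconds_to_download:.2f} s")
```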
The real benefit of such new technology is likely to come in the future, in a world where users want hi-def, 3-D, live feeds from everywhere, movies, and perhaps even some as-yet-unknown type of entertainment, and where our hunger for more drives data transfer rates across the Internet to grow by 50 percent every year. It's not hard to contemplate a day when 100 terabits becomes normal, as scientists go back to the lab to try to figure out how to push that number even higher. Perhaps by combining these two new technologies?
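For a sense of how quickly "100 terabits becomes normal" at that growth rate, a hedged sketch can compound the article's own numbers; the few-terabit starting point is taken from the backbone figure mentioned earlier, and the exact value is an assumption:

```python
# Compound-growth sketch: how long until backbone traffic reaches 100 Tbit/s?
# The 3 Tbit/s starting point stands in for "a few terabits per second" and is an assumption.
import math

current_tbps = 3.0        # assumed traffic on the busiest backbone line today
target_tbps = 100.0       # capacity of the new fiber technologies
annual_growth = 0.50      # 50 percent growth per year, as cited above

years = math.log(target_tbps / current_tbps) / math.log(1 + annual_growth)
print(f"About {years:.0f} years at 50% annual growth")  # roughly 9 years
```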