Data Travels Six Times Faster in the Clouds

Feb 26, 2009
Sector is a cloud computing system designed for data-intensive computing. It is built to run on racks of commodity computers, which may be located within a single data center or spread across several geographically distributed data centers. Credit: Michal Sabala, NCDM, University of Illinois at Chicago

(PhysOrg.com) -- The National Center for Data Mining (NCDM) at the University of Illinois at Chicago has established a cloud computing system that can quickly process data drawn from geographically distributed data centers over high-performance networks. NCDM used the Open Cloud Testbed, managed by the Open Cloud Consortium, to demonstrate the "Sector System" at the annual meeting of the American Association for the Advancement of Science earlier this month in Chicago.

"We demonstrated that our system is six times faster than competing technology," said Robert Grossman, NCDM director and Open Data Group managing partner. "Without the requirement of costly and combersome data transfer from various locations to one central location, this opens the way to exciting collaborative scientific discovery."

Diagram of Phase 1 of the Open Cloud Testbed, which consists of four racks located at the University of Illinois at Chicago, the StarLight facility in Chicago, Johns Hopkins University in Baltimore, Maryland, and Calit2 at the University of California, San Diego. The racks are connected by a 10 Gb/s network provided by the Cisco C-Wave and by regional high-performance networks at each location. The Open Cloud Testbed is managed by the Open Cloud Consortium. Credit: Open Cloud Consortium

Grossman and his team demonstrated the system using a common benchmark called Terasort. They found a performance penalty of less than 5 percent when Terasort ran across four data centers distributed around the country, compared with running the entire computation within a single data center. Before Sector, such computations were rarely attempted, as performance penalties could run as high as 30 percent.
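For concreteness, the short C++ sketch below shows how such a penalty figure is computed from elapsed run times. The absolute times are hypothetical, invented only to illustrate the reported percentages.

    #include <iostream>

    int main() {
        // Hypothetical elapsed times for Terasort, in minutes; the article
        // reports only the penalty percentages, not absolute run times.
        double one_center    = 100.0;  // baseline: entire sort in one data center
        double four_sector   = 104.0;  // across four centers with Sector (<5%)
        double four_previous = 130.0;  // across four centers previously (~30%)

        auto penalty = [](double distributed, double baseline) {
            return (distributed - baseline) / baseline * 100.0;
        };

        std::cout << "Sector penalty:   " << penalty(four_sector, one_center)   << "%\n";
        std::cout << "Previous penalty: " << penalty(four_previous, one_center) << "%\n";
        return 0;
    }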

"With the Sector System, data intensive computing can scale not only to a data center, but for the first time, across data centers," said Grossman." This enables locating data centers in areas in which power and cooling is cost-effective."

The Open Cloud Testbed consists of racks of computers located at the University of Illinois at Chicago, the StarLight facility in Chicago, Johns Hopkins University in Baltimore, Maryland, and the University of California, San Diego, all connected by a wide-area 10 Gb/s network and all running a variety of cloud services, including both storage and compute services. The technology that makes this possible is an open architecture design, specifically the open-source Sector system developed by the NCDM (sector.sf.net).

Although cloud computing is becoming common, data processing in clouds today is almost always done within a single data center; data-intensive computing across geographically distributed data centers is generally avoided because of the difficulty and cost of moving large amounts of data over long distances. Sector instead employs an alternative network protocol called UDT (UDP-based Data Transfer), designed to move large volumes of data swiftly and smoothly over wide-area networks.
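To give a sense of how an application uses UDT, here is a minimal sketch of a sender written against the BSD-socket-style C++ API of the open-source UDT library that Sector builds on. The endpoint address, port, and payload are hypothetical placeholders, not details from the demonstration described above.

    #include <udt.h>          // open-source UDT library header
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <cstring>
    #include <iostream>

    int main() {
        UDT::startup();  // initialize the UDT library

        // Create a UDT socket: UDT runs over UDP but presents a reliable,
        // stream-oriented interface much like TCP.
        UDTSOCKET client = UDT::socket(AF_INET, SOCK_STREAM, 0);

        // Hypothetical endpoint of a receiving node in another data center.
        sockaddr_in serv_addr;
        std::memset(&serv_addr, 0, sizeof(serv_addr));
        serv_addr.sin_family = AF_INET;
        serv_addr.sin_port = htons(9000);
        inet_pton(AF_INET, "192.0.2.10", &serv_addr.sin_addr);

        if (UDT::ERROR == UDT::connect(client, reinterpret_cast<sockaddr*>(&serv_addr),
                                       sizeof(serv_addr))) {
            std::cerr << "connect: " << UDT::getlasterror().getErrorMessage() << "\n";
            return 1;
        }

        // Send one block of data; a real bulk transfer would loop over a
        // large buffer until everything has been sent.
        const char* data = "example payload";
        if (UDT::ERROR == UDT::send(client, data, std::strlen(data), 0)) {
            std::cerr << "send: " << UDT::getlasterror().getErrorMessage() << "\n";
        }

        UDT::close(client);
        UDT::cleanup();  // release library resources
        return 0;
    }

Because UDT implements its own congestion control in user space on top of UDP, it can sustain high throughput on long-distance 10 Gb/s paths where standard TCP often struggles.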

According to Joe Mambretti, director of the International Center of Advanced Internet Research at Northwestern University and co-director of the Open Cloud Testbed, "These innovative technologies provide unique capabilities that will enable new generations of applications that can make discoveries involving large volumes of highly distributed data."

Provided by National Science Foundation
