A Washington State University research team has designed a tiny, wireless data center that someday could be as small as a hand-held device and dramatically reduce the energy needed to run such centers.
Their idea is a paradigm shift in the management of big data, said Partha Pratim Pande, a computer engineering professor in the School of Electrical Engineering and Computer Science.
Pande, who is collaborating with WSU professor Deuk Heo and a team from Carnegie Mellon University, presented the preliminary design for a data-center-on-a-chip this week at the Embedded Systems Week conference in Pittsburgh. The researchers recently received a $1.2 million National Science Foundation grant to further develop their transformative idea.
Data centers and high-performance computing clusters are energy hogs, requiring enormous amounts of power and space. Data centers, which often need air conditioning to cool their many processors, consumed about 91 billion kilowatt-hours of electricity in the U.S. in 2013, equivalent to the output of 34 large, coal-fired power plants, according to the Natural Resources Defense Council.
Large data farms run by companies like Facebook have made significant energy efficiency improvements, but many data servers at small businesses around the country still consume significant resources. Sustainable computing has become of increasing interest to researchers, industry leaders and the public.
"We have reached our power limit already," said Pande. "To address our energy efficiency challenges, this architecture and technology need to be adopted by the community."
3D chip three times more efficient
Unlike portable devices that have gone wireless, data farms that provide instant availability to text messages, video downloads and more still use conventional metal wires on computer chips, which are wasteful for relatively long-range data exchange.
Most data center chips contain many processing cores, and one of their major performance limitations stems from the multi-hop nature of data exchange: data must hop from core to core over wires, slowing down the processor and wasting energy.
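The hop-count savings can be illustrated with a toy model. The sketch below (an illustration, not the WSU design; the mesh size and hub placements are hypothetical) compares the average number of hops between core pairs in a wired 8x8 mesh against the same mesh augmented with a few wireless "hub" cores that can reach each other in a single hop:

```python
# Illustrative sketch: average hop count between random core pairs in an
# 8x8 wired mesh network-on-chip, versus the same mesh with a one-hop
# wireless shortcut between a few hypothetical hub cores.
from itertools import product

SIZE = 8  # 8x8 = 64 cores
HUBS = [(1, 1), (1, 6), (6, 1), (6, 6)]  # hypothetical wireless hubs

def mesh_hops(a, b):
    # Wired mesh with XY routing: hop count is the Manhattan distance.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def hybrid_hops(a, b):
    # Wireless option: wire to a hub, one wireless hop, wire to the
    # destination. Take whichever route (wired or hybrid) is shorter.
    via_air = min(mesh_hops(a, h1) + 1 + mesh_hops(h2, b)
                  for h1 in HUBS for h2 in HUBS)
    return min(mesh_hops(a, b), via_air)

cores = list(product(range(SIZE), repeat=2))
pairs = [(a, b) for a in cores for b in cores if a != b]
avg_wired = sum(mesh_hops(a, b) for a, b in pairs) / len(pairs)
avg_hybrid = sum(hybrid_hops(a, b) for a, b in pairs) / len(pairs)
print(f"average hops, wired mesh:    {avg_wired:.2f}")
print(f"average hops, with wireless: {avg_hybrid:.2f}")
```

In this toy model the wireless shortcut mainly helps distant core pairs, which is exactly the "relatively long-range data exchange" where wires are most wasteful.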
Pande's group in recent years designed a wireless network on a computer chip. Similar to the way a cell phone works, the system includes a tiny, low-power transceiver, on-chip antennas and communication protocols that enable wireless shortcuts.
The new work expands these capabilities for a wireless data-center-on-a-chip. In particular, the researchers are moving from two-dimensional chips to a highly integrated, three-dimensional, wireless chip at the nano- and microscales that can move data more quickly and efficiently.
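One reason stacking helps can be seen with simple arithmetic: for the same number of cores, a 3D mesh has a smaller worst-case hop count than a 2D one. The sketch below (an illustration with hypothetical mesh sizes, not the project's actual layout) compares 64 cores arranged as an 8x8 plane versus a 4x4x4 stack:

```python
# Illustrative sketch: worst-case hop count (network diameter) for the
# same 64 cores laid out as a 2D mesh versus a stacked 3D mesh.
def mesh_diameter(dims):
    # In a wired mesh, the diameter is the sum of (side - 1) over each
    # dimension: the corner-to-corner route.
    return sum(d - 1 for d in dims)

print(mesh_diameter((8, 8)))     # 64 cores in 2D: 14 hops worst case
print(mesh_diameter((4, 4, 4)))  # same 64 cores in 3D: 9 hops worst case
```

Fewer worst-case hops means less wire to traverse per message, which is part of why the 3D design can move data more quickly and efficiently.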
For instance, the researchers estimate that their wireless system will run big data applications three times more efficiently than the best existing data center servers.
Personal cloud computing possibilities
As part of their grant, the researchers will evaluate the wireless data center's ability to increase energy efficiency while maintaining fast on-chip communication. The tiny chips, consisting of thousands of cores, could run data-intensive applications orders of magnitude more efficiently than existing platforms. The design has the potential to match the performance of a conventional data center in much less space and with much less power.
It could someday enable personal cloud computing possibilities, said Pande, adding that the effort would require massive integration and significant innovation at multiple levels.
"This is a new direction in networked system design," he said. "This project is redefining the foundation of on-chip communication."