A big leap toward lowering the power consumption of microprocessors

Jan 20, 2012
The CPU on the motherboard of an Elektronika BK-0010-01 home computer. Credit: Viacheslav Slavinsky, Creative Commons.

The first systematic power profiles of microprocessors could help lower the energy consumption of both small cell phones and giant data centers, report computer science professors from The University of Texas at Austin and the Australian National University.

Their results may point the way to how companies like Google, Apple, Intel and Microsoft can make software and hardware that will lower the power consumption of very small and very large devices.

"The less power cell phones draw, the longer the battery will last," says Kathryn McKinley, professor of computer science at The University of Texas at Austin. "For companies like Google and Microsoft, which run these enormous data centers, there is a big incentive to find ways to be more power efficient. More and more of the money they're spending isn't going toward buying the hardware, but toward the power the datacenters draw."

McKinley says that without detailed power profiles of how microprocessors function with different software and different chip architectures, companies are limited in terms of how well they can optimize for energy efficiency.

The study she conducted with Stephen M. Blackburn of The Australian National University and their graduate students is the first to systematically measure and analyze application power, performance, and energy on a wide variety of hardware.

This work was recently invited to appear as a Research Highlight in the Communications of the Association for Computing Machinery (CACM). It has also been selected as one of this year's "most significant research papers in computer architecture based on novelty and long-term impact" by the journal IEEE Micro.

"We did some measurements that no one else had done before," says McKinley. "We showed that different software, and different classes of software, have really different power usage."

McKinley says that such an analysis has become necessary as both the culture and the technologies of computing have shifted over the past decade.

Energy efficiency has become a greater priority for consumers, manufacturers and governments because the shrinking of processor technology has stopped yielding exponential gains in power and performance. The result of these shifts is that hardware and software designers have to take into account tradeoffs between performance and power in a way they did not ten years ago.

"Say you want to get an application on your phone that's GPS-based," says McKinley, "In terms of energy, the GPS is one of the most expensive functions on your phone. A bad algorithm might ping your GPS far more than is necessary for the application to function well. If the application writer could analyze the power profile, they would be motivated to write an algorithm that pings it half as often to save energy without compromising functionality."

McKinley believes that the future of software and hardware design is one in which power profiles become a consideration at every stage of the process.

Intel, for instance, has just released a chip with an exposed power meter, so that software developers can access some information about the power profiles of their products when run on that chip. McKinley expects that future generations of chips will expose even more fine-grained information about power usage.
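An exposed power meter of this kind typically reports a cumulative energy counter rather than instantaneous watts (Intel's RAPL interface, for example, reports microjoules on Linux via files such as `/sys/class/powercap/intel-rapl:0/energy_uj`). A minimal sketch of how a developer might turn two such counter readings into average power, assuming a microjoule counter that may wrap around at a known maximum:

```python
def average_power_watts(energy_uj_start, energy_uj_end, seconds, max_energy_uj=None):
    """Average power in watts between two readings of a cumulative
    energy counter reported in microjoules (the unit RAPL's energy_uj
    file uses). If max_energy_uj is given, a single counter wraparound
    between the two readings is accounted for."""
    delta = energy_uj_end - energy_uj_start
    if delta < 0 and max_energy_uj is not None:
        delta += max_energy_uj          # counter wrapped once between readings
    return (delta / 1_000_000) / seconds  # microjoules -> joules, then J/s = W
```

In practice the developer would read the counter file, sleep for a measurement window, read it again, and feed both readings plus the elapsed time into a function like this.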

Software developers like Microsoft (where McKinley is spending the next year, while taking a leave from the university) are already using what information they have to inform their designs. And device manufacturers are testing out different architectures for their phones or tablets that optimize for power usage.

McKinley says that even consumers may get information about how much power a given app on their smart phone is going to draw before deciding whether to install it or not.

"In the past, we optimized only for performance," she says. "If you were picking between two software algorithms, or chips, or devices, you picked the faster one. You didn't worry about how much power it was drawing from the wall socket. There are still many situations today—for example, if you are making software for stock market traders—where speed is going to be the only consideration. But there are a lot of other areas where you really want to consider the usage."



User comments: 4


ArtflDgr
1 / 5 (1) Jan 20, 2012
This is a Duh

Why?

Well, if information follows energy laws like thermodynamics and conservation of energy..
Then the power that a piece of software would use would vary based on that, just as doing different work with matter has different costs..

This could have been anticipated as a derivative of the same math and such that originated with Turing and Shannon in information theory

Basically energy IS information, and information is energy
The different changes that energy makes to matter is memory
Etc
ArtflDgr
3 / 5 (2) Jan 20, 2012
The result of these shifts is that hardware and software designers have to take into account tradeoffs between performance and power in a way they did not ten years ago. [no, but they had to do it constantly 20 years ago ie the advances in programming and requirements have caught up to the lead that hardware gained. In the early days of computers in business and such, the same problems were daily things to think of. ]
ArtflDgr
1 / 5 (1) Jan 20, 2012
engineers have been benchmarking since before i was born
the makers of these devices know exactly how much power they use, as that's part of the EE calculations to make the thing

i give up when such old stuff, is celebrated as incredible and new...
that_guy
not rated yet Feb 02, 2012
I would like to point out that benchmarking for power used by processors and benchmarking the software are generally done independently - in fact, software is generally only designed to work within the parameters of current hardware. Streamlining is only done for the purpose of adding feature bloat. (Here's looking at you microsoft)

In addition, add the complication that even chips with identical instruction sets may process things more or less efficiently than one another due to design differences.

This may have some utility, especially for data centers. However, for phones, given the huge range of architectures, software, and processors, I don't see it as a substantial benefit over the current process.

Generally, the fastest software is going to be biased toward more efficient software that uses resources efficiently, and therefore can produce the fastest results with the hardware at hand.

The situations where the opposite is true are the minority.