Smartphone app illuminates power consumption

Nov 20, 2009

(PhysOrg.com) -- A new application for the Android smartphone shows users and software developers how much power their applications are consuming. PowerTutor was developed by doctoral students and professors at the University of Michigan.

Today's battery-powered cell phones double as hand-held computers: we run power-hungry applications on them while still depending on them to be available in emergencies.

"Today, we expect our phones to realize more and more functions, and we also expect their batteries to last," said Lide Zhang, a doctoral student in the Department of Electrical Engineering and Computer Science and one of the application's developers. "PowerTutor will help make that possible."

PowerTutor will enable software developers to build more efficient products, said Birjodh Tiwana, a doctoral student in the Department of Electrical Engineering and Computer Science and another of the program's developers. Tiwana said PowerTutor will allow users to compare the power consumption of different applications and select the leanest version that performs the desired task. Users can also watch how their actions affect the phone's battery life.

PowerTutor shows in real time how four different phone components use power: the screen, the network interface, the processor, and the global positioning system receiver.

To create the application, the researchers disassembled their phones and installed electrical current meters. Then they determined the relationship between the phone's internal state (how bright the screen is, for example) and the actual power consumption. That allowed them to produce a software model capable of estimating the power use of any program the phone is running with less than 5 percent error.
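The description above suggests a simple component-wise model: estimated power is a baseline plus a contribution from each tracked component's state. A minimal sketch in Python of that idea, where the function name and every coefficient are illustrative assumptions, not PowerTutor's actual fitted values:

```python
def estimate_power_mw(screen_brightness, cpu_util, net_active, gps_on):
    """Estimate total power draw (mW) from component states.

    screen_brightness and cpu_util are fractions in [0, 1];
    net_active and gps_on are booleans. All coefficients below are
    illustrative stand-ins for values that would be fitted against
    current-meter measurements, as the researchers describe.
    """
    BASELINE_MW = 100.0    # idle draw with everything off
    SCREEN_MAX_MW = 600.0  # screen at full brightness
    CPU_MAX_MW = 400.0     # processor fully loaded
    NET_MW = 300.0         # network interface active
    GPS_MW = 250.0         # GPS receiver on

    power = BASELINE_MW
    power += SCREEN_MAX_MW * screen_brightness
    power += CPU_MAX_MW * cpu_util
    if net_active:
        power += NET_MW
    if gps_on:
        power += GPS_MW
    return power
```

In practice, a model like this is calibrated per phone model by regressing the measured current draw against the observed component states, which is how the researchers reached the reported sub-5-percent error.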

PowerTutor can also log a history of power consumption over time. It is available free in the Android Market at http://www.android.com/market/.

PowerTutor was developed under the direction of associate professor Robert Dick and assistant professor Morley Mao, both in the Department of Electrical Engineering and Computer Science, and Lei Yang, a software engineer at Google. The work is supported by Google and the National Science Foundation, and was done in collaboration with the joint University of Michigan and Northwestern University Empathic Systems Project.

Provided by University of Michigan
