New 'Koomey’s Law' of power efficiency parallels Moore's Law

Sep 15, 2011 by Bob Yirka report
Transistor counts for integrated circuits plotted against their dates of introduction. The curve shows Moore's law - the doubling of transistor counts every two years. Image: Wikipedia.

(PhysOrg.com) -- For most of the computer age, the central theme in computer hardware architecture has been to create more computational power in the same amount of chip space. Intel co-founder Gordon Moore even came up with a “law,” based on what he’d seen up to that point, to predict how things would go in the future: that computing power would double every year and a half. Now Jonathan Koomey, a consulting professor at Stanford, has led a study showing that the electrical energy efficiency of computers has been following roughly the same path. He and his colleagues from Microsoft and Intel have published the results of their study in IEEE Annals of the History of Computing, showing that the energy efficiency of computers has doubled nearly every eighteen months (now called, appropriately enough, Koomey’s Law) going all the way back to the very first electronic computers.

This is not the first time Koomey’s name has been in the news; just last month he was the lead author of a paper showing that electricity consumed by data centers in the U.S. and around the world grew at a slower pace (from 2005 to 2010) than had been predicted by a 2007 U.S. EPA report. This time around, Koomey, in collaboration with colleagues from Intel and Microsoft, has been studying how much electricity computers use relative to their processing power in a historical context. Back in 1946, for example, ENIAC, one of the first true computers, used approximately 150 kilowatts of electricity to perform just a few hundred calculations per second.
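To put the ENIAC figure in perspective, a quick back-of-the-envelope calculation based on the numbers above (500 calculations per second is an assumed stand-in for "a few hundred"):

```python
# Rough arithmetic on ENIAC's efficiency, using the article's figures:
# ~150 kW of power for "a few hundred" calculations per second.
power_kw = 150.0
calcs_per_second = 500  # assumed illustrative value

# Calculations performed per kilowatt-hour of electricity consumed.
calcs_per_kwh = calcs_per_second * 3600 / power_kw
print(f"ENIAC: ~{calcs_per_kwh:,.0f} calculations per kWh")
```

By comparison, any modern processor delivers many orders of magnitude more computation from the same kilowatt-hour, which is exactly the trend Koomey's study quantifies.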

Using historical data, the team created a graph comparing the computing power of the average machine (from supercomputers to laptops) with the amount of electricity it needed, and found that improvements from the 1950’s until now have moved in virtual lockstep with increases in processing power: energy efficiency, they found, effectively doubled every 1.57 years. Because of this, they predict that the trend is likely to continue into the foreseeable future.
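A doubling every 1.57 years compounds dramatically. A minimal sketch of the implied cumulative gain over roughly the period the study covers (the 1956-2011 span is used here purely for illustration):

```python
# Cumulative efficiency gain implied by a doubling every 1.57 years.
years = 2011 - 1956          # illustrative span, in years
doubling_period = 1.57       # years per doubling, from the study

factor = 2 ** (years / doubling_period)
print(f"Implied efficiency gain: {factor:.2e}x")
# On the order of 10^10, i.e. tens of billions of times more
# computation per kWh than the earliest machines.
```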

This is important as computing platforms have become more mobile and end users increasingly place more value on power efficiency (because it means longer battery life) than on how fast their smartphone or tablet can produce results. Thus, it’s possible that Koomey’s Law will become the rallying cry of the future, much as Moore’s Law has been in the past. Though hopefully, engineers won’t trade away raw performance to chase efficiency numbers, as that could lead to small devices that last for weeks on batteries alone but are sluggish.

More information: Implications of Historical Trends in the Electrical Efficiency of Computing, IEEE Annals of the History of Computing, vol. 33, no. 3, pp. 46-54, July-September 2011. doi.ieeecomputersociety.org/10.1109/MAHC.2010.28

Abstract
The electrical efficiency of computation has doubled roughly every year and a half for more than six decades, a pace of change comparable to that for computer performance and electrical efficiency in the microprocessor era. These efficiency improvements enabled the creation of laptops, smart phones, wireless sensors, and other mobile computing devices, with many more such innovations yet to come. The Web Extra appendix outlines the data and methods used in this study.

via Technology Review

User comments

Eikka
3.8 / 5 (8) Sep 15, 2011
Another null result due to a misunderstanding of the Moore's law.

Gordon Moore even came up with a law based on what he'd seen up to that point to predict how things would go in the future; that computing power would double every year and a half


He predicted that the number of transistors on an affordable chip would double. NOT computing power.

The number of transistors does not translate to computing power. Increasing the number of cores on a chip doesn't yield you linear increases in computing power, which is why supercomputers have to have hundreds of thousands of CPU cores to perform marginally better than the previous one.
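Eikka's point that core counts don't translate linearly into computing power is the familiar Amdahl's-law effect: the serial fraction of a workload caps the achievable speedup no matter how many cores are added. A minimal illustrative sketch (the 95% parallel fraction is an assumed example value, not a measurement):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Speedup of a workload whose parallel_fraction can use all cores
    while the rest must run serially (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Diminishing returns: even with 95% of the work parallelizable,
# the speedup can never exceed 1/0.05 = 20x.
for n in (2, 8, 64, 100000):
    print(f"{n:>6} cores -> {amdahl_speedup(0.95, n):.2f}x")
```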
Eikka
3 / 5 (4) Sep 15, 2011
Besides, since heat dissipation is a major issue in processors, it only makes sense that an increase in transistor density is accompanied by an equal increase in power efficiency, because otherwise the chips would simply burn up or require impractical cooling systems.

The power efficiency is therefore a requisite for increased transistor density. Without it, Moore's law wouldn't work.
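This argument can be sketched numerically with the standard dynamic-power relation P ≈ N·a·E·f (transistor count × activity factor × switching energy × frequency). All values below are illustrative round numbers, not real chip data:

```python
# If transistor count doubles each generation but the energy per
# switching event stays fixed, chip power doubles too -- hence the
# "chips would melt" argument. Illustrative values only.
transistors = 1e9             # assumed transistor count
activity = 0.1                # assumed fraction switching per cycle
energy_per_switch_j = 1e-15   # assumed energy per switching event (J)
freq_hz = 2e9                 # assumed clock frequency

power_w = transistors * activity * energy_per_switch_j * freq_hz
print(f"Power: {power_w:.0f} W")            # a plausible desktop-CPU figure
print(f"Doubled density: {2 * power_w:.0f} W")  # untenable without
# halving energy_per_switch_j, i.e. without a matching efficiency gain
```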
Nanobanano
3.3 / 5 (3) Sep 15, 2011
Eikka:

Multi-core really only helps if you are running applications or models that are multi-threaded.

Once you get beyond 1 processor core per thread, you really aren't going to see much benefit, if any, from adding additional cores.

Currently, weather models are limited by the resolution of data input, not computational power, nor even software limitations.

Some of the newer experimental models are actually run on massive arrays of video cards that they reprogram to do weather calculations, because video cards are usually cheaper than CPU processors, and they now have so much memory and hundreds or even thousands of stream processors.

But the weather is something that can be scaled almost indefinitely, and broken down into smaller packets in space and time almost indefinitely, so it's ideal for multi-core computing.
rawa1
3.7 / 5 (3) Sep 15, 2011
The power efficiency is therefore a requisite for increased transistor density. Without it, Moore's law wouldn't work.

You're right; the Koomey law can easily be derived from Moore's law in this way. Actually, we can see that power efficiency has limited the power dissipated by processors for a long time: they have all consumed about 150 - 200 W for years, because that is just the value that allows effective cooling per unit of chip area with passive heat transfer. The so-called green processors (mobile platform, etc.) are just smaller and less powerful, but not actually more efficient than the energy-hungry ones.
El_Nose
4 / 5 (4) Sep 15, 2011
Moore's Law was an off-the-cuff comment made in an interview that he had not even thought out fully... He was just trying to make conversation, and he has stated several times that he feels really lucky he was close to being right, because his name is attached to one sentence in an interview and it's the only thing remembered about his legacy to computing.
Eikka
2.7 / 5 (7) Sep 15, 2011
Plus, the Moore's law has been revised every time reality didn't follow it.

The doubling rate has actually been slowing down over time, but don't tell that to the singularity geeks. They'll just get mad at you.
fmfbrestel
5 / 5 (1) Sep 15, 2011
Eikka -- that is just objectively not true. Moore's "law" states that transistors will double every two years. The 18-months figure was a prediction about the power of chips, not the transistor density. The faster doubling has to do with increases in transistor speed in conjunction with density.

The graph at the top of this page is the objective reality concerning transistor density, and it is clearly not slowing down (although it did for a short while during the Pentium line, it has since caught up).
fmfbrestel
5 / 5 (4) Sep 15, 2011
The "law" doesn't matter, only what actually gets built, and what actually gets built has been a doubling of transistors every two years. But yes, back to the point, claiming that power efficiency also follows a doubling law is not news worthy. Of course it does, otherwise (as others have already pointed out) the chips would melt.
Eikka
2.2 / 5 (6) Sep 15, 2011
Eikka -- that is just objectively not true. Moore's "law" states that transistors will double every two years.


He originally stated it would double every year, and then revised his prediction in 1975.

The problem of the prediction is, that the industry has set a target on following it, so it doesn't really have any predictive power. I can make a prediction that I will walk to the corner shop to buy some beer in the next 30 minutes, and what do you know, it comes true!

As a side effect though, the capital cost of semiconductor factories is also following an exponential curve to keep pace with the Moore's law.
Eikka
2 / 5 (5) Sep 15, 2011
And as the capital costs of making the chips increases, so does the number of customers buying the chips have to increase in order to keep them affordable.

That means, if the industry can't keep pushing the processors into exponentially more and more devices, Moore's law will grind to a halt very quickly.
fmfbrestel
5 / 5 (1) Sep 15, 2011
the capital cost of semiconductor factories is also following an exponential curve to keep pace with the Moore's law.


...and now you're just making stuff up.

For the last five years, Intel's capital expenditures (couldn't find it going back further):

http://www.fabtec...in_2011/

Big jump in 2011 because they added a brand-new fab, which they haven't done in a while. But outside the jump for 2011, their capital expenditures were actually trending DOWN from 2006-2010.

Please quit making up stuff to sound smart.
fmfbrestel
5 / 5 (2) Sep 15, 2011
The smaller chips get the CHEAPER they get because the cost is a function of how many they can print per wafer. They get cheaper first, and THEN they get put into more and more things (because they are cheaper), not the other way around.

That said, OF COURSE Moore's law will grind to a halt. Probably 5-6 years from now, actually. Beyond a certain limit, quantum tunneling will render a transistor incapable of shutting off.

Intel isn't building better chips because they want to keep pace with Moore's law, they are building better chips because there is competition in the marketplace and if they stand still they might as well liquidate the company. Demand for better chips pushes Moore's law, not Intel's desire to be trendy.
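The print-more-per-wafer economics described above can be sketched with a toy cost model (the wafer size and wafer cost are assumed illustrative values; real die-per-wafer math accounts for edge loss and yield):

```python
import math

# Toy model: cost per chip = wafer processing cost / dies per wafer.
# Shrinking the die multiplies how many fit on a wafer, so cost per
# chip falls even as fab (and wafer) costs rise. Illustrative numbers.
WAFER_DIAMETER_MM = 300       # standard wafer size
WAFER_COST_USD = 5000.0       # assumed processing cost per wafer

def cost_per_die(die_area_mm2):
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    dies = wafer_area // die_area_mm2   # ignores edge loss and yield
    return WAFER_COST_USD / dies

# Halving the die area roughly halves the cost per chip.
print(f"200 mm^2 die: ${cost_per_die(200):.2f}")
print(f"100 mm^2 die: ${cost_per_die(100):.2f}")
```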

Eikka
1 / 5 (1) Sep 15, 2011

...and now you're just making stuff up.


Nope.

http://en.wikiped...%27s_law
fmfbrestel
not rated yet Sep 15, 2011
Except that Rock's law failed 8 years ago. It's just someone's guess, and it DIDN'T hold. You are wrong.
Eikka
1 / 5 (1) Sep 15, 2011
They get cheaper first, and THEN they get put into more and more things (because they are cheaper), not the other way around.


Nope. They are designed to be cheap.

Nobody builds a fab just to see how cheap they can make the chips, or to put bets on how many they can sell. Their intention all along is to make a certain number of chips at a cost that they believe can be sold. Without a very good reason to believe that they can sell N units at M price, nobody would fund the factory.

So, as the factories become more expensive with more advanced machinery to make more advanced chips, they can only be built if there's a market for the products. If there aren't enough potential buyers, then the price goes up, and if the price is unpalatable to the potential buyers then the factory won't go up.

If somebody builds the factory regardless, they risk going bankrupt with a loss that rivals the GDP of a small nation because the products didn't sell.
fmfbrestel
not rated yet Sep 15, 2011
Without a very good reason to believe that they can sell N units at M price, nobody would fund the factory.


exactly.

If there aren't enough potential buyers, then the price goes up, and if the price is unpalatable to the potential buyers then the factory won't go up.


Read that again, really. The logic is so circular it hurts to think about. Intel isn't building new fabs just because they have preorders for chips in 2013. They are building fabs because they expect the market to exist, and yeah, if the market disappears then they go bankrupt. It is expensive and risky.

And yeah, they are designed to be cheap. Despite the mammoth costs of a chip fab, each chip is ridiculously cheap. The cost per transistor has fallen faster than transistor density has risen. That's the objective reality you are missing.

gimpypoet
1 / 5 (1) Sep 16, 2011
And yeah, they are designed to be cheap. Despite the mammoth costs of a chip fab, each chip is ridiculously cheap. The cost per transistor has fallen faster than transistor density has risen. That's the objective reality you are missing.
Chip failure due to heat damage could be controlled, but if they lasted longer, turnover rates would decrease profits. Profits drive corporations, not the dependability of their products. People usually work their computers until they fail; most will try repairs before replacement. Some do buy because of advances in tech, and manufacturers know this. I have a tube stereo from the fifties that still works fine, and have killed many transistor-based ones over the last thirty years. They don't build things to last anymore, as that would cut profit. They (computer manufacturers) don't give away older tech, they sell it in third-world markets, and that proves they don't care about anything but profits.

fmfbrestel
5 / 5 (2) Sep 16, 2011
people usually work their computers until they fail


That's at least the third wrong thing you said, but I have a limited amount of time for trolls like you.

The vast majority of computers are thrown away working perfectly fine. The desire to keep up with the Joneses is what drives turnover. If your CPU fries, you either overclocked it without knowing what you were doing, blocked the air vents on your case, never opened the case to dust things off, or failed to use a decent surge protector. Computers get slow and die because users click on a million malware links and their computer is too busy being part of a botnet to run simple tasks.

I have had 7 home computers and 4 work computers. One of them died to a lightning strike, and another had a hard drive failure. The other 9 were discarded because something new and shiny came around the corner.
hikenboot
not rated yet Sep 28, 2011
hikenboot's law: Weather predictions will become more and more accurate on an exponential scale... yippee do da... that was really hard. Of course I have nothing to back it up with! The weather still seems as unpredictable as it ever was!
powerup1
1 / 5 (1) Oct 10, 2011
Plus, the Moore's law has been revised every time reality didn't follow it.

The doubling rate has actually been slowing down over time, but don't tell that to the singularity geeks. They'll just get mad at you.


Eikka, please cite your references, otherwise it is just gossip and of little value.
Eikka
1 / 5 (1) Oct 13, 2011

And yeah, they are designed to be cheap. Despite the mammoth costs of a chip fab, each chip is ridiculously cheap. The cost per transistor has fallen faster than transistor density has risen. That's the objective reality you are missing.


The cost per transistor has gone down because the process capacity has gone up faster than the investment costs, but the investment costs have still risen. That's the whole point.

That means each time you build a new fab, you need to sell more chips to offset the investment, and at some point you simply don't have the markets for such an ungodly number of chips that would pay for the new fabs, and the prices will start to increase and the demand for the new chips will vanish with the increasing price.

And that's where the upgrade cycle stops. It no longer makes sense to build the next better fab.