Researcher finds Moore's Law and Wright's Law best predict how tech improves

Credit: Wikipedia/Casey Fleser

Researchers at MIT and the Santa Fe Institute have found that some widely used formulas for predicting how rapidly technology will advance—notably, Moore's Law and Wright's Law—offer superior approximations of the pace of technological progress. The new research is the first to directly compare the different approaches in a quantitative way, using an extensive database of past performance from many different industries.

Some of the results were surprising, says Jessika Trancik, an assistant professor at MIT. The findings could help industries decide where to focus their research efforts, help investors pick high-growth sectors, and help regulators more accurately predict the effects of policy changes.

The report is published in the online open-access journal PLoS ONE. Its other authors are Bela Nagy of the Santa Fe Institute, J. Doyne Farmer of the University of Oxford and the Santa Fe Institute, and Quan Bui of St. John's College in Santa Fe, N.M.

The best-known of the formulas is Moore's Law, originally formulated by Intel co-founder Gordon Moore in 1965 to describe the rate of improvement in the power of computer chips. That law, which predicts that the number of components on integrated-circuit chips will double every 18 months, has since been generalized as a principle that can be applied to any technology; in its general form, it simply states that technologies improve exponentially over time. The actual rate of improvement (the exponent in the equation) varies depending on the technology.
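As a minimal sketch of the generalized form, a Moore's-law cost trajectory can be written as exponential decay in time. The function name and the 18-month doubling time below are illustrative choices, not parameters taken from the paper:

```python
def moore_cost(t_years, initial_cost, doubling_time_years=1.5):
    """Generalized Moore's-law projection: capability per dollar doubles
    every `doubling_time_years`, so unit cost halves at that same rate."""
    return initial_cost * 0.5 ** (t_years / doubling_time_years)

# After 3 years with an 18-month doubling time, cost falls to one quarter.
print(moore_cost(3.0, 100.0))  # 25.0
```

The exponent (here set by the doubling time) is the technology-specific rate the article refers to; fitting it to historical data is what distinguishes one industry's trajectory from another's.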

The analysis indicates that Moore's Law is one of the two formulas that best match actual technological progress over past decades. The top performer, called Wright's Law, was first formulated in 1936: It holds that progress increases with experience; specifically, each percent increase in cumulative production in a given industry results in a fixed percentage improvement in production efficiency.
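Wright's Law is conventionally written as a power law in cumulative production, cost(x) = c0 · x^(−b), so each doubling of cumulative output cuts cost by a fixed fraction (the "learning rate"). The sketch below illustrates that relationship; the exponent value 0.32 is an illustrative assumption, not a figure from the study:

```python
def wright_cost(cumulative_units, initial_cost, exponent):
    """Wright's law: cost falls as a power law in cumulative production,
    cost(x) = c0 * x**(-b). Doubling output multiplies cost by 2**(-b)."""
    return initial_cost * cumulative_units ** (-exponent)

# With b = 0.32, each doubling of cumulative output cuts cost about 20%.
b = 0.32
ratio = wright_cost(2.0, 100.0, b) / wright_cost(1.0, 100.0, b)
print(round(1 - ratio, 3))  # 0.199
```

Note the key contrast with Moore's Law: here the driver is cumulative production (experience), not elapsed time, which is why the two formulas can disagree when production volumes change pace.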

To carry out the analysis, the researchers amassed an extensive set of data on actual costs and production levels over time for 62 different industry sectors; these ranged from commodities such as aluminum, manganese and beer to more advanced products like computers, communications systems, solar cells, aircraft and cars.

"There are lots of proposals out there," Trancik says, for predicting the rate of advances in technologies. "But the data to test the hypotheses is hard to come by."

The research team scoured government reports, market-research publications, research reports and other published sources to compile their database. They only used sources for which at least a decade's worth of consistent data was available, and which contained metrics for both the rate of production and for some measure of improvement. They then analyzed the data by using the different formulas in "hindcasting": assessing which of the formulas best fit the actual pace of technological advances in past decades.
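The hindcasting step described above can be sketched as fitting a model on an early window of the historical series and scoring its extrapolation error on the held-out later years. This is a simplified illustration (ordinary least squares on log cost, a Moore's-law-style fit); the function and the evaluation details are assumptions for illustration, not the paper's exact procedure:

```python
import math

def hindcast_rmse(years, log_costs, split):
    """Fit log-cost vs. time on the first `split` points by least squares,
    then return the RMS extrapolation error on the remaining points."""
    xs, ys = years[:split], log_costs[:split]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    held_out = list(zip(years[split:], log_costs[split:]))
    return math.sqrt(sum((intercept + slope * x - y) ** 2
                         for x, y in held_out) / len(held_out))

# A series that declines by a constant factor each year is fit exactly,
# so the hindcast error is essentially zero.
years = list(range(6))
log_costs = [math.log(100 * 0.8 ** t) for t in years]
print(hindcast_rmse(years, log_costs, split=4))
```

Comparing this error across candidate formulas and across the 62 industry datasets is, in spirit, how the different proposals were ranked.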

"We didn't know what to expect when we looked at the performance of these equations relative to one another," Trancik says, but "some of the proposals do markedly better than others."

Knowing which models work best in forecasting technological change can be very important for business leaders and policymakers. "It could be useful in things like climate-change mitigation," Trancik says, "where you want to know what you'll get out of your investment."

The rates of change vary greatly among different technologies, the team found.

"Information technologies improve the fastest," Trancik says, "but you also see the sustained exponential improvement in many energy technologies. Photovoltaics improve very quickly. … One of our main interests is in examining the data to gain insight into how we can accelerate the improvement of technology."

Erin Baker, an associate professor of mechanical and industrial engineering at the University of Massachusetts who was not connected with this work, says, "This is a very nice paper. The result that Wright's Law and Moore's Law both fit past data equally well is surprising and useful."

Journal information: PLoS ONE

This story is republished courtesy of MIT News, a popular site that covers news about MIT research, innovation and teaching.

Citation: Researcher finds Moore's Law and Wright's Law best predict how tech improves (2013, March 6), retrieved 26 June 2019.
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


User comments

Mar 06, 2013
OOOOOOOh!!! Where can I trade my ASUS U36S i7 for one of those sleek little numbers at the top of this article?
No, not the little black thing, that one with the nice CRT in the middle of it.

Mar 06, 2013
Interesting study. Glad it's been done. My problem with any such empirical rule is that eventually exponentials reach a point where the laws of the universe interfere with further continuation. I'm a bit concerned that the decrease in PV costs over the last 8 years has flattened, and this despite major private venture-capital investments and substantial government subsidies. Yes, this may just be normal statistical variation, or it may be an indication of the onset of diminishing returns. With all the research ongoing, I find the latter hard to believe, though possible.

Mar 06, 2013
Has nobody *noticed* that had Moore's law not petered out very suddenly about six years ago at 3GHz, we would now be using 30GHz processors instead of 3.6GHz ones?

Mar 06, 2013
No time for that. They are too busy pontificating about the future through the employ of Moore's law.

"Has nobody *noticed*" - Nik

You know, in 6 years we are going to be using 300 GHz CPUs.

Mar 06, 2013
Moore's law doesn't say it will get faster; it says it will have more integrated components. So, do you have any idea how many transistors are now in your modern 8-core CPU? Guess what: it's keeping up with Moore's law rather well.

Mar 06, 2013
Intel CPU advances chart:


Mar 06, 2013
"You know, in 6 years we are going to be using 300 GHz CPUs."

I don't know about 6 years, but we'll get there with some manner of photonics and spintronics.

The bigger question is will developers make efficient use of the power, or will they just write sloppier and sloppier code for operating systems, applications, and games?!
