Which technologies get better faster?

May 17, 2011 by David L. Chandler

Some forms of technology — think, for example, of computer chips — are on a fast track to constant improvements, while others evolve much more slowly. Now, a new study by researchers at MIT and other institutions shows that it may be possible to predict which technologies are likeliest to advance rapidly, and therefore may be worth more investment in research and resources.

In a nutshell, the researchers found that the greater a technology's complexity, the more slowly it changes and improves over time. They devised a way of mathematically modeling complexity, breaking a system down into its individual components and then mapping all the interconnections among those components.
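The decomposition described above can be pictured as a directed graph. This is a minimal sketch, not the authors' actual formalism; the component names and the choice of maximum out-degree as a complexity proxy are illustrative assumptions:

```python
# A hypothetical design, represented as a directed graph: an edge
# i -> j means that redesigning component i also forces a change
# to component j. (Component names are invented for illustration.)
design = {
    "cell":     ["wiring", "inverter"],
    "wiring":   ["inverter"],
    "inverter": [],
    "mount":    [],
}

# One simple proxy for design complexity: the largest number of other
# components that any single redesign drags along with it.
out_degree = {part: len(deps) for part, deps in design.items()}
complexity = max(out_degree.values())
print(out_degree)
print("complexity proxy:", complexity)
```

In this toy design, redesigning the cell touches two other components, so the complexity proxy is 2; a fully modular design would score 0.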

“It gives you a way to think about how the structure of the technology affects the rate of improvement,” says Jessika Trancik, assistant professor of engineering systems at MIT and co-author of a paper explaining the findings. The paper’s lead author is James McNerney, a graduate student at Boston University (BU); other co-authors are Santa Fe Institute Professor Doyne Farmer and BU physics professor Sid Redner. The paper appears online this week.

The team was inspired by the complexity of energy-related technologies ranging from tiny transistors to huge coal-fired power plants. The researchers have tracked how these technologies improve over time, whether through reduced cost or better performance, and in this paper they develop a model that relates that progress to the complexity of a design and the degree of connectivity among its components.

The authors say the approach they devised for comparing technologies could, for example, help policymakers mitigate climate change: By predicting which low-carbon technologies are likeliest to improve rapidly, their strategy could help identify the most effective areas to concentrate research funding. The analysis makes it possible to pick technologies “not just so they will work well today, but ones that will be subject to rapid development in the future,” Trancik says.

Besides the importance of overall design complexity in slowing the rate of improvement, the researchers also found that certain patterns of interconnection can create bottlenecks, causing the pace of improvements to come in fits and starts rather than at a steady rate.
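The effect of interconnection on the pace of improvement can be illustrated with a toy simulation in the spirit of the model described above. This is a sketch under invented assumptions, not the authors' actual model: redesigning a component forces fresh (random) costs on every component it links to, and a redesign is kept only if it lowers the total cost of the affected set:

```python
import random

def simulate(adj, steps=2000, seed=0):
    """Toy cost-reduction walk over a component dependency graph.

    adj[i] lists the components whose designs must also change when
    component i is redesigned. Returns the total-cost trajectory.
    """
    rng = random.Random(seed)
    n = len(adj)
    cost = [1.0] * n                  # every component starts at unit cost
    trajectory = [sum(cost)]
    for _ in range(steps):
        i = rng.randrange(n)          # pick a component to redesign
        affected = [i] + adj[i]       # the redesign propagates along links
        proposal = {j: rng.random() for j in affected}
        # keep the redesign only if it lowers the cost of the affected set
        if sum(proposal.values()) < sum(cost[j] for j in affected):
            for j, c in proposal.items():
                cost[j] = c
        trajectory.append(sum(cost))
    return trajectory

# A modular design: redesigning one part touches nothing else.
independent = {i: [] for i in range(6)}
# A highly interconnected design: every part touches every other part.
coupled = {i: [j for j in range(6) if j != i] for i in range(6)}

lo = simulate(independent)
hi = simulate(coupled)
print(f"modular final cost: {lo[-1]:.3f}")
print(f"coupled final cost: {hi[-1]:.3f}")
```

In the modular design each component can lock in improvements independently, so total cost falls quickly; in the coupled design every gain in one component risks being undone by the changes it forces elsewhere, so the same number of redesign attempts buys far less improvement.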

“In this paper, we develop a theory that shows why we see the rates of improvement that we see,” Trancik says. Now that they have developed the theory, she and her colleagues are moving on to do empirical analysis of many different technologies to gauge how effective the model is in practice. “We’re doing a lot of work on analyzing large data sets” on different products and processes, she says.

For now, she suggests, the method is most useful for comparing two different technologies “whose components are similar, but whose design complexity is different.” For example, the analysis could be used to compare different approaches to next-generation solar photovoltaic cells, she says. The method can also be applied to processes, such as improving the design of supply chains or infrastructure systems. “It can be applied at many different scales,” she says.

Koen Frenken, professor of economics of innovation and technological change at Eindhoven University of Technology in the Netherlands, says this paper “provides a long-awaited theory” for the well-known phenomenon of learning curves. “It has remained a puzzle why the rates at which humans learn differ so markedly among technologies. This paper provides an explanation by looking at the complexity of technology, using a clever way to model design complexity.”

Frenken adds, “The paper opens up new avenues for research. For example, one can verify their theory experimentally by having human subjects solve problems with different degrees of complexity.” In addition, he says, “The implications for firms and policymakers [are] that R&D should not only be spent on invention of new technologies, but also on simplifying existing technologies so that humans will learn faster how to improve these technologies.”

Ultimately, the kind of analysis developed in this paper could become part of the design process — allowing engineers to “design for rapid innovation,” Trancik says, by using these principles to determine “how you set up the architecture of your system.”

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.


User comments

May 17, 2011
Computer chips are among the most complex human-made things on the planet, and yet they have been evolving very fast. Therefore the researchers must be missing important factors.

One factor is how much improvement is theoretically possible, and another is how much money/time/effort is being thrown at improving a particular technology.

Computers have improved so fast because they had so far to go and so much money being spent on improving them. Although they have fewer orders of magnitude left to improve, the amount of money being spent on improvements has kept increasing.
May 18, 2011
This is not really new. Kurzweil and others have been publishing such relationships, including interdependencies, for decades. Acknowledgement of the earlier work should be stated explicitly. I don't see anything new here, but perhaps the devil is in the details.
May 18, 2011
Further, I agree with Realscience. Biotechnology, bioengineering, improvements to computer systems and nanotechnology are among the fastest-growing fields and also some of the most complex, and all are improving exponentially in price/performance, some doubling every 1.2 to 2 years (e.g., genome sequencing and understanding).
