Which technologies get better faster?

May 17, 2011 by David L. Chandler

Some forms of technology — think, for example, of computer chips — are on a fast track to constant improvements, while others evolve much more slowly. Now, a new study by researchers at MIT and other institutions shows that it may be possible to predict which technologies are likeliest to advance rapidly, and therefore may be worth more investment in research and resources.

In a nutshell, the researchers found that the greater a technology's complexity, the more slowly it changes and improves over time. They devised a way of mathematically modeling complexity, breaking a system down into its individual components and then mapping all the interconnections between these components.
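
The component-and-interconnection picture can be made concrete with a small sketch. This is illustrative only, not the paper's actual model: the component names and link structure below are invented, and out-degree is used here as just one crude measure of how interconnected a design is.

```python
# Illustrative sketch (not the published model): represent a technology as
# components plus directed dependency links, and measure each component's
# connectivity as the number of components a change to it would affect
# (counting the component itself).

def out_degrees(components, links):
    """links: set of (a, b) pairs meaning 'changing a affects b'."""
    degree = {c: 1 for c in components}  # each component affects itself
    for a, _b in links:
        degree[a] += 1
    return degree

# Hypothetical four-component design: "core" feeds three other components.
components = ["core", "x", "y", "z"]
links = {("core", "x"), ("core", "y"), ("core", "z")}
deg = out_degrees(components, links)
print(deg["core"])  # 4: a change to "core" must also suit x, y and z
print(deg["x"])     # 1: x can be redesigned in isolation
```

On this picture, a highly connected component is harder to improve, since any redesign of it must remain compatible with everything that depends on it.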

“It gives you a way to think about how the structure of the technology affects the rate of improvement,” says Jessika Trancik, assistant professor of engineering systems at MIT and co-author of a paper explaining the findings. The paper’s lead author is James McNerney, a graduate student at Boston University (BU); other co-authors are Santa Fe Institute Professor Doyne Farmer and BU physics professor Sid Redner. The paper appears online this week.

The team was inspired by the complexity of energy-related technologies ranging from tiny transistors to huge coal-fired power plants. They have tracked how these technologies improve over time, whether through reduced cost or better performance, and in this paper they develop a model that compares that progress to the complexity of the design and the degree of connectivity among its different components.

The authors say the approach they devised for comparing technologies could, for example, help policymakers mitigate climate change: By predicting which low-carbon technologies are likeliest to improve rapidly, their strategy could help identify the most effective areas to concentrate research funding. The analysis makes it possible to pick technologies “not just so they will work well today, but ones that will be subject to rapid development in the future,” Trancik says.

Besides the importance of overall design complexity in slowing the rate of improvement, the researchers also found that certain patterns of interconnection can create bottlenecks, causing the pace of improvements to come in fits and starts rather than at a steady rate.
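
As a rough illustration of how interconnection can slow improvement, consider a toy simulation in the spirit of the description above. This is a simplified sketch, not the authors' published model: a proposed redesign of one component is accepted only if it lowers the combined cost of that component and every component it affects, so tightly coupled designs accept improvements far less often.

```python
# Toy simulation (illustrative, not the authors' model): each component has a
# cost; at each step we pick a random component, redraw new candidate costs
# for it and everything it affects, and accept the redesign only if the
# group's total cost goes down.
import random

def simulate(affects, steps, seed=0):
    """affects[i]: set of components whose cost a change to i also redraws."""
    n = len(affects)
    rng = random.Random(seed)
    cost = [1.0] * n
    totals = []
    for _ in range(steps):
        i = rng.randrange(n)
        group = sorted({i} | set(affects[i]))
        candidate = {j: rng.random() for j in group}
        if sum(candidate.values()) < sum(cost[j] for j in group):
            for j, c in candidate.items():
                cost[j] = c
        totals.append(sum(cost))
    return totals

# Five independent components vs. five fully coupled ones.
loose = simulate([set() for _ in range(5)], 2000)
tight = simulate([set(range(5)) for _ in range(5)], 2000)
print(round(loose[-1], 3), round(tight[-1], 3))
```

In the coupled design, long stretches with no accepted change are punctuated by occasional whole-system improvements, which is one way improvements can come in fits and starts rather than at a steady rate.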

“In this paper, we develop a theory that shows why we see the rates of improvement that we see,” Trancik says. Now that they have developed the theory, she and her colleagues are moving on to do empirical analysis of many different technologies to gauge how effective the model is in practice. “We’re doing a lot of work on analyzing large data sets” on different products and processes, she says.

For now, she suggests, the method is most useful for comparing two different technologies “whose components are similar, but whose design complexity is different.” For example, the analysis could be used to compare different approaches to next-generation solar photovoltaic cells, she says. The method can also be applied to processes, such as improving the design of supply chains or infrastructure systems. “It can be applied at many different scales,” she says.

Koen Frenken, professor of economics of innovation and technological change at Eindhoven University of Technology in the Netherlands, says this paper “provides a long-awaited theory” for the well-known phenomenon of learning curves. “It has remained a puzzle why the rates at which humans learn differ so markedly among technologies. This paper provides an explanation by looking at the complexity of technology, using a clever way to model design complexity.”

Frenken adds, “The paper opens up new avenues for research. For example, one can verify their theory experimentally by having human subjects solve problems with different degrees of complexity.” In addition, he says, “The implications for firms and policymakers [are] that R&D should not only be spent on invention of new technologies, but also on simplifying existing technologies so that humans will learn faster how to improve these technologies.”

Ultimately, the kind of analysis developed in this paper could become part of the design process — allowing engineers to “design for rapid innovation,” Trancik says, by using these principles to determine “how you set up the architecture of your system.”


This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.


User comments: 3

RealScience
5 / 5 (1) May 17, 2011
Computer chips are among the most complex human-made things on the planet, and yet they have been evolving very fast. Therefore the researchers must be missing important factors.

One factor is how much improvement is theoretically possible, and another is how much money/time/effort is being thrown at improving a particular technology.

Computers have improved so fast because they had so far to go and had so much money being spent on improving them. Although they have fewer orders of magnitude left to improve, the rate at which money is being spent on improvements has been increasing.
Sierracafe
not rated yet May 18, 2011
This is not really new. Kurzweil and others have been publishing such relationships, including interdependencies, for several decades and possibly earlier. Acknowledgement of the earlier work should really be stated explicitly. I don't see anything new here, but perhaps the devil is in the details.
Sierracafe
5 / 5 (1) May 18, 2011
Further, I agree with RealScience. Biotechnology, bioengineering, improvements to computer systems and nanotechnology are amongst the fastest-growing fields and also some of the most complex, and all are on exponential improvement curves in price/performance, some doubling every 1.2 to 2 years (e.g., genome sequencing and understanding).
