Silicon Valley marks 50 years of Moore's Law

Moore's Law: plot of CPU transistor counts against dates of introduction. Note the logarithmic vertical scale; the line corresponds to exponential growth, with the transistor count doubling every two years. Credit: Wikipedia

Computers were the size of refrigerators when an engineer named Gordon Moore laid the foundations of Silicon Valley with a vision that became known as "Moore's Law."

Moore, then the 36-year-old head of research at Fairchild Semiconductor, predicted in a trade magazine article published 50 years ago Sunday that computer chips would double in complexity every year, at little or no added cost, for the next 10 years. In 1975, based on industry developments, he updated the prediction to doubling every two years.

And for the past five decades, chipmakers have proved him right - spawning scores of new companies and shaping Silicon Valley to this day.

"If Silicon Valley has a heartbeat, it's Moore's Law. It drove the valley at what has been a historic speed, unmatched in history, and allowed it to lead the rest of the world," said technology consultant Rob Enderle.

Moore's prediction quickly became a business imperative for companies. Those that ignored the timetable went out of business. Companies that followed it became rich and powerful, led by Intel, the company Moore co-founded.

Thanks to Moore's Law, people carry smartphones in their pocket or purse that are more powerful than the biggest computers made in 1965 - or 1995, for that matter. Without it, there would be no slender laptops, no computers powerful enough to chart a genome or design modern medicine's lifesaving drugs. Streaming video, social media, search, the cloud - none of that would be possible on today's scale.

"It fueled the information age," said Craig Hampel, chief scientist at Rambus, a Sunnyvale semiconductor company. "As you drive around Silicon Valley, 99 percent of the companies you see wouldn't be here" without cheap computer processors due to Moore's Law.

Moore was asked in 1964 by Electronics magazine to write about the future of integrated circuits for the magazine's April 1965 edition.

The basic building blocks of the digital age, integrated circuits are chips of silicon that hold tiny switches called transistors. More transistors meant better performance and capabilities.

Taking stock of how semiconductor manufacturing was shrinking transistors and regularly doubling the number that would fit on an integrated circuit, Moore got some graph paper and drew a line for the predicted annual growth in the number of transistors on a chip. It shot up like a missile, with a doubling of transistors every year for at least a decade.
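The exponential growth Moore plotted can be sketched in a few lines. This is an illustrative projection (not from the article), using a two-year doubling cadence and the transistor count of Intel's first microprocessor as a starting point:

```python
# Minimal sketch of Moore's Law as an exponential projection.
def transistors(start_count, start_year, year, doubling_period_years=2):
    """Project a chip's transistor count, assuming a fixed doubling cadence."""
    doublings = (year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

# Intel's first microprocessor (1971) held 2,300 transistors; a two-year
# doubling cadence projects roughly 2.4 billion by 2011 (20 doublings).
count_2011 = transistors(2_300, 1971, 2011)
print(f"{count_2011:,.0f} transistors")  # 2,411,724,800 transistors
```

Real chips did not track the line exactly, but the billion-transistor processors mentioned later in the article are broadly what 40 years of two-year doublings predict.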

It seemed clear to him what was coming, if not to others.

"Integrated circuits will lead to such wonders as home computers - or at least terminals connected to a central computer - automatic controls for automobiles, and personal portable communications equipment," he wrote.

California Institute of Technology professor Carver Mead coined the name Moore's Law, and as companies competed to produce the most powerful chips, it became a law of survival - double the transistors every year or die.

"In the beginning, it was just a way of chronicling the progress," Moore, now 86, said in an interview conducted by Intel. "But gradually, it became something that the various industry participants recognized. ... You had to be at least that fast or you were falling behind."

Moore's Law also held prices down because advancing technology made it inexpensive to pack chips with increasing numbers of transistors. If transistors hadn't gotten cheaper as they grew in number on a chip, computers would still be a niche product for the military and others able to afford a very high price. Intel's first microprocessor, or computer on a chip, with 2,300 transistors, cost more than $500 in current dollars. Today, an Intel Core i5 microprocessor has more than a billion transistors - and costs $276.
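The article's own figures make the cost collapse concrete. A rough back-of-the-envelope comparison (the prices and counts are the ones quoted above):

```python
# Cost per transistor, then and now, using the article's numbers:
# first Intel microprocessor: 2,300 transistors at > $500 (current dollars);
# Core i5: > 1 billion transistors at $276.
cost_early = 500 / 2_300            # ~ $0.22 per transistor
cost_i5 = 276 / 1_000_000_000       # ~ $0.0000003 per transistor
ratio = cost_early / cost_i5        # roughly 800,000x cheaper per transistor
print(f"early: ${cost_early:.2f}  i5: ${cost_i5:.10f}  ratio: ~{ratio:,.0f}x")
```

Since both figures are lower bounds ("more than $500", "more than a billion"), the real improvement is at least this large.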

"That was my real objective - to communicate that we have a technology that's going to make electronics cheap," Moore said.

The reach of Moore's Law extends beyond personal tech gadgets.

"The really cool thing about it is it's not just iPhones," said G. Dan Hutcheson of VLSI Research, a technology market research company based in Santa Clara. "Every drug developed in the past 20 years or so had to have the computing power to get down and model molecules. They never would have been able to without that power. DNA analysis, genomes, wouldn't exist - you couldn't do the genetic testing. It all boils down to transistors."

Hutcheson says what Moore predicted was much more than a self-fulfilling prophecy. He had foreseen that optics, chemistry and physics would be combined to shrink transistors over time without substantial added cost.

As transistors become vanishingly small, it's harder to keep Moore's Law going.

About a decade ago, the shrinking of the physical dimensions led to overheating and stopped major performance boosts for every new generation of chips. Companies responded by introducing so-called multicore designs, which put several processor cores on a single chip.

"What's starting to happen is people are looking to other innovations on silicon to give them performance" as a way to extend Moore's Law, said Spike Narayan, director of science and technology at IBM's Almaden Research Center.

Then, about a year and a half ago, "something even more drastic started happening," Narayan said. The wires connecting transistors became so small that they became more resistant to electrical current. "Big problem," he said.

"That's why you see all the materials research and innovation," he said of new efforts to find alternative materials and structures for chips.

Another issue confronting Moore's Law is that the energy consumed by chips has begun to rise as transistors shrink. "Our biggest challenge" is energy efficiency, said Alan Gara, chief architect of the Aurora supercomputer Intel is building for Argonne National Laboratory near Chicago.

Intel says it sees a path to continue the growth predicted by Moore's Law through the next decade. The next generation of processors is in "full development mode," said Mark Bohr, an Intel senior fellow who leads a group that decides how each generation of Intel chips will be made. Bohr is spending his time on the generation after that, in which transistors will shrink to 7 nanometers. The average human hair is 25,000 nanometers wide.

At some point the doubling will slow down, says Chenming Hu, an electrical engineering and computer science professor at the University of California, Berkeley. Hu is a key figure in the development of a new transistor structure that's helping keep Moore's Law going.

"It's totally understandable that a company, in order to gain more market share and beat out all competitors, needs to double and triple if you can," Hu said. "That's why this scaling has been going on at such a fast pace. But no exponential growth can go on forever."

Hu says what's likely is that at some point the doubling every two years will slow to every four or five years.

"And that's probably a better thing than to flash and fizzle out. You really want to have the same growth at a lower pace."


©2015 San Jose Mercury News (San Jose, Calif.)
Distributed by Tribune Content Agency, LLC

Citation: Silicon Valley marks 50 years of Moore's Law (2015, April 24) retrieved 16 September 2019 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


User comments

Apr 24, 2015
Uh, there may be a physical limit on transistors, but software bloat has a long, long way to go...

Apr 24, 2015
There is also the concept or architecture to consider too!!!

Apr 24, 2015
Since 2003 there has been no doubling of processor (core) speed but rather stagnation, somewhat before the physical limit. Modern computer chips turned into microwave chips requiring separate water cooling systems, taking hundreds of watts of power from the system for heat dissipation at a 3.6 GHz clock. Adding cores did not help except for 64-bit OSes and specialized applications or servers, and only with costly IO issues resolved. Also, the promising (1990s) RISC architecture has been mostly abandoned. Moore's law is long dead.
Where are the optical chips that have no clock limitations? Where is X-ray lithography? After 40 years, nothing.

Apr 25, 2015
Remember the INMOS Transputer from ~1987? And its nimble Occam language that let you parallel and/or pipeline process tasks, re-allocating multiple CPUs' assorted resources on the fly?

Just dug my remaining books on this lost-chance out of storage, had a quiet cry...

Apr 27, 2015
When I was hired by National Semiconductor in 1972, I thought Moore's Law was in for a revision: I really doubted they could do it.


May 14, 2015
It should really be called "Mead's Law", since Carver Mead is the one Gordon Moore consulted to find out whether it would be practical to continue miniaturizing transistors, and it was Mead who did the hard work of examining the underlying physics to find that semiconductors worked better the smaller you made the circuit elements. Without that work, Moore wouldn't have been able to tell that the '60s advances in integrated circuits weren't just the typical quick improvement of a new industry, destined to quickly plateau.

Mead also wrote the foundational textbook of the chip-making field and trained many of the early semiconductor engineers. He is still friends with Moore, who endowed Mead's chair at Caltech.
