Moore's Law is 50 years old, but will it continue?

July 20, 2015 by Jonathan Borwein and David H. Bailey, The Conversation

It's been 50 years since Gordon Moore, one of the founders of the microprocessor company Intel, gave us Moore's Law. This says that the complexity of computer chips ought to double roughly every two years.

Now the current CEO of Intel, Brian Krzanich, is saying the days of Moore's Law may be coming to an end, as the time between new generations of chip technology appears to be lengthening:

The last two technology transitions have signalled that our cadence today is closer to two and a half years than two.

So is this the end of Moore's Law?

Moore's Law has its roots in an article by Moore written in 1965, in which he observed that the complexity of components on integrated circuits was doubling each year. This was later modified to become:

The number of transistors incorporated in a chip will approximately double every 24 months.

This rate was later modified again, to a doubling roughly every 18 months.

In its 24-month guise, Moore's Law has continued unabated for 50 years, with an overall advance of a factor of roughly 2³¹, or 2 billion. That means memory chips today store around 2 billion times as much data as in 1965. Or, in more general terms, computer hardware today is around 2 billion times as powerful for the same cost.
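How big that cumulative factor is depends sharply on the assumed doubling period. A minimal back-of-the-envelope sketch in Python (purely illustrative) shows that the quoted 2³¹, about 2 billion, corresponds to a cadence between the 18- and 24-month versions:

```python
# Cumulative growth factor for 50 years of doubling at different cadences.
YEARS = 50

for months in (18, 24):
    doublings = YEARS * 12 / months
    print(f"{months}-month cadence: {doublings:.1f} doublings, "
          f"factor ~{2 ** doublings:.1e}")
# 18-month cadence: 33.3 doublings, factor ~1.1e+10
# 24-month cadence: 25.0 doublings, factor ~3.4e+07
```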

It is hard to comprehend the scale of Moore's Law. Imagine airline technology having advanced from 1965 to 2015 to the point where a jet travels at nearly the speed of light (1,080 million km/h, or 670 million mph), yet is capacious enough to carry the entire world's population. Or imagine the cost of a jet airliner dropping from US$100 million to one dollar. Even these analogies fall far short of a factor of 2 billion.

Moore was originally embarrassed by his eponymous "law". This is in part because it is not a law in the sense of a law of physics, but merely an empirical observation. But by its 40th anniversary, Intel was happy to celebrate it, and Moore was pleased to note that it still seemed to be accurate.

The end is nigh?

A few months ago, though, Moore observed:

The original prediction was to look at 10 years, which I thought was a stretch […] The fact that something similar is going on for 50 years is truly amazing. […] But someday it has to stop. No exponential like this goes on forever.

There have been numerous other predictions that Moore's Law was soon to end.

In 1999, physicist and best-selling author Michio Kaku declared that the "Point One barrier" (meaning chip features 0.1 micron or 100 nanometers in size) would soon halt progress.

Yet the semiconductor industry sailed through the 0.1-micron level like a jetliner passing through a wispy cloud. Devices currently in production have feature sizes as small as 14 or even 10 nanometers, and IBM has just announced a chip with 7-nanometer features.

By comparison, a helical strand of DNA is just 2.5 nanometers in diameter, so commercial semiconductor technology is now entering the molecular and atomic realm.

A speed barrier

Not all is roses, though. By one measure – a processor's clock speed – Moore's Law has already stalled.

Today's state-of-the-art production microprocessors typically have clock rates of around 3 GHz, compared with the 2 GHz rates of five or ten years ago – not a big improvement.

But the industry has instead increased the number of processor "cores" and the amount of on-chip cache memory, so that aggregate performance continues to track or exceed Moore's Law projections. Keeping this relevant poses many software challenges: programs must be explicitly written to spread their work across multiple cores.
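To illustrate the kind of software change this requires, here is a minimal sketch (the prime-counting task and all names are invented for the example) that runs the same workload serially and then splits it across worker processes, one per core:

```python
# Minimal sketch: the same workload run serially, then split across
# cores with multiprocessing. Task and names are illustrative only.
from multiprocessing import Pool
import time

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately naive)."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        i = 2
        while i * i <= n:
            if n % i == 0:
                return False
            i += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    N, CHUNKS = 200_000, 8
    step = N // CHUNKS
    chunks = [(i * step, (i + 1) * step) for i in range(CHUNKS)]

    t0 = time.perf_counter()
    serial = sum(count_primes(c) for c in chunks)
    t1 = time.perf_counter()
    with Pool() as pool:  # defaults to one worker per CPU core
        parallel = sum(pool.map(count_primes, chunks))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s  parallel: {t2 - t1:.2f}s")
```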

Hewlett Packard Laboratories is hard at work developing new approaches for microelectronics. Its nanotechnology research group has developed a "crossbar architecture", a design in which a set of parallel "wires" a few nanometers in width is crossed by a second set of "wires" at right angles. Each intersection of the "wires" forms an electronic switch, which can be configured either for logic or for memory storage.
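As a very loose software analogy (not HP's actual design, and with invented names), the crossbar can be pictured as a grid of switches, each addressed by the pair of wires that cross there:

```python
# Toy model of a crossbar memory: one switch per wire intersection.
# Illustrative only; real crossbar devices are analog memristive
# elements, not Python integers.

class Crossbar:
    def __init__(self, rows, cols):
        # One switch state per (row wire, column wire) intersection.
        self.state = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        # Configure the switch where the two wires cross.
        self.state[row][col] = bit

    def read(self, row, col):
        return self.state[row][col]

xbar = Crossbar(4, 4)
xbar.write(2, 3, 1)
print(xbar.read(2, 3))  # -> 1
```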

The lab is also investigating nanoscale photonics (light-based devices), which can be deployed either in conventional electronic devices or in emerging quantum computing devices.

Moore's Law is a gift to science

Moore's Law has been a great blessing to science and mathematics research. Modern laboratories are loaded with high-tech measurement and analysis devices, which become more powerful and cheaper every year.

In addition, a broad range of modern science, mathematics and engineering has benefited from Moore's Law in the form of scientific supercomputers, which are used for applications ranging from supernova simulation and protein folding to product design and the processing of the cosmic microwave background radiation.

Software running on these computers has advanced in step with Moore's Law.

For example, the fast Fourier transform (FFT) algorithm, which is used extensively in scientific computation, and magnetic resonance imaging (MRI) both involve substantial computation that would not be practical without Moore's Law advances.

It is not entirely coincidental that both of these algorithmic advances arose roughly 50 years ago, around the same time Moore's Law was first observed.
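For the curious, here is a compact sketch of the radix-2 Cooley-Tukey FFT, the 1965 algorithm referred to above, which cuts the discrete Fourier transform from O(n²) to O(n log n) operations. It handles power-of-two lengths only; real applications would use a tuned library routine:

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

print(fft([1, 2, 3, 4]))  # ≈ [(10+0j), (-2+2j), (-2+0j), (-2-2j)]
```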

How much more for Moore's Law?

Intel's CEO, Brian Krzanich, said the company would "strive to get back to two years" between technology transitions to keep Moore's Law on track.

If Moore's Law does continue for just two or three more decades, typical handheld devices may well exceed the human brain in intelligence. Some, such as author James Barrat, declare that artificially intelligent computers will be the "final invention" of mankind, after which humans may become irrelevant.

We do not subscribe to such pessimism. Rather we see a promising future with scientific knowledge, among other things, increasing at an exponential rate.

Time will tell. As physicist Richard Feynman wrote in 1959, referring to the potential for ever finer control of nature at the microscopic level, there still appears to be plenty of room at the bottom.


Comments

Returners, Jul 20, 2015
If the number of processors and the amount of memory you could put in a computer increased ten-fold over today, it would not significantly help cosmology or meteorology or climate science, because the limiting factors in those sciences reside in our ability to measure real-world phenomena.

You can't model a phenomenon more precisely than you can measure it, even if you had a computer theoretically capable of doing so, because you don't know what you're actually looking at any better.

A ten times more powerful supercomputer would allow only sqrt(10) times (about 3.16) more precision in cosmology models, because the number of computations required for a correct n-body calculation scales as the square of the number of particles. So it really takes 9 times more computing power to improve precision by a factor of 3, given the same amount of computing time.
Returners, Jul 20, 2015
If you wanted to model 10 times as many particles you'd need 100 times the computing power.

If you also want to sample model-time 10 times as often, then you need a further multiple of 10 times computing power.

So to model 10 times as many particles with 10 times the sampling precision in model-time requires a computer 1000 times as powerful in processing power, assuming the same amount of processing time.

To put this in perspective, NASA and other agencies currently model the entire universe using a billion particles or so. That is nearly a quadrillion times fewer particles than the number of hydrogen atoms in a single gram of hydrogen, and between 100 and 200 times fewer than the number of galaxies in the observable universe... never mind the alleged dark matter.
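A minimal sketch of the scaling arithmetic in the two comments above, assuming the O(n²) per-step cost of a direct n-body calculation (function names invented for illustration):

```python
# Relative computing power needed, assuming O(n^2) work per time step.
def relative_cost(particle_factor, time_sampling_factor=1):
    return particle_factor ** 2 * time_sampling_factor

print(relative_cost(10))      # 10x particles              -> 100x power
print(relative_cost(10, 10))  # 10x particles + 10x steps  -> 1000x power
print(f"{10 ** 0.5:.2f}")     # precision gain from 10x power -> ~3.16x
```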

dirk_bruere, Jul 21, 2015
The only form of Moore's Law worth talking about is the rate at which the cost of a given amount of computing power falls per year, because for the vast majority of people and applications that is what matters.
PhysicsMatter, Jul 21, 2015
Simulations are a crude and primitive tool, without the finesse of analytical calculus, and hence are used for nonlinear or otherwise difficult-to-solve problems. Unfortunately, this means they are prone to computational instabilities, as was pointed out by von Neumann, which if not controlled result in the simulation solving a different problem from the one it is supposed to approximate.

We need exact solutions to the biggest problems of physics, like the multi-body gravitational/electrodynamic problem, or we need to come up with new mathematics to deal with those problems.

Otherwise even a 100-to-1 increase in computing power (X-ray lithography) won't do, since brute force is blind and leads nowhere.

We may need new metaphysics as well.
Eikka, Jul 26, 2015
"In its 24 month guise, Moore's Law has continued unabated for 50 years"

No it hasn't, because that "guise" is not the correct version. It was the doubling of transistors on an /affordable/ chip, which meant the maximum number of transistors per chip when optimizing for cost.

It was NOT how many transistors you could or would actually fit on a single chip, which is arbitrary since the size or purpose of the chip isn't defined.

All these articles that talk about Moore's Law get it wrong one way or another, deliberately, in order to keep arguing that it has "held" for so-and-so long; but they're all just fitting some exponential to some arbitrarily cherry-picked set of data and then calling that Moore's Law.

That's called lying.
