Computing faces an energy crunch unless new technologies are found

November 28, 2018, The Conversation
The tools on our smartphones are enabled by a huge network of mobile phone towers, Wi-Fi networks and server farms. Credit: Shutterstock

There's little doubt the information technology revolution has improved our lives. But unless we find a new form of electronic technology that uses less energy, computing will become limited by an "energy crunch" within decades.

Even the most common events in our daily life – making a phone call, sending a text message or checking an email – use computing power. Some tasks, such as watching videos, require a lot of processing, and so consume a lot of energy.

Because of the energy required to power the massive, factory-sized data centres and networks that connect the internet, computing already consumes 5% of global electricity. And that electricity load is doubling every decade.
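To see why that trend is alarming, here is a back-of-the-envelope projection (illustrative only, not a forecast from the article): if computing's share of global electricity is 5% today and the load doubles every decade while total generation stays flat, the share doubles each decade too.

```python
def share_after_decades(n, start=5.0):
    """Illustrative projection: computing's share of global electricity (%)
    after n decades, assuming the load doubles every decade and total
    generation stays flat. The 5% starting figure is from the article."""
    return start * 2 ** n

for n in range(4):
    print(f"after {n} decades: {share_after_decades(n):.0f}%")
```

On these simple assumptions the share reaches 20% in two decades and 40% in three, which is why efficiency gains, not just more generation, are needed.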

Fortunately, there are new areas of physics that offer promise for massively reduced energy consumption.

The end of Moore's Law

Humans have an insatiable demand for computing power.

Smartphones, for example, have become one of the most important devices in our lives. We use them to access the internet, plot the best route through traffic, and watch the latest season of our favourite series.

And we expect our smartphones to become even more powerful in the future. We want them to translate language in real time, transport us to new locations via virtual reality, and connect us to the "Internet of Things".

The computing required to make these features a reality doesn't actually happen in our phones. Rather it's enabled by a huge network of mobile phone towers, Wi-Fi networks and massive, factory-sized data centres known as "server farms".

For the past five decades, our increasing need for computing was largely satisfied by incremental improvements in conventional, silicon-based computing technology: ever-smaller, ever-faster, ever-more efficient chips. We refer to this constant shrinking of silicon components as "Moore's Law".

Moore's Law is named after Intel co-founder Gordon Moore, who observed that "the number of transistors on a chip doubles every year while the costs are halved."

But as we hit the limits of basic physics and economics, Moore's Law is winding down. We could see the end of efficiency gains in current, silicon-based technology as soon as 2020.

Our growing demand for computing capacity must be met by gains in computing efficiency; otherwise, the information revolution will be slowed by its own hunger for power.

Achieving this sustainably means finding a new technology that uses less energy in computation. This is referred to as a "beyond CMOS" solution, in that it requires a radical shift from the silicon-based CMOS (complementary metal–oxide–semiconductor) technology that has been the backbone of computing for the last five decades.

Why does computing consume energy at all?

Processing of information takes energy. When using an electronic device to watch TV, listen to music, model the weather or any other task that requires information to be processed, there are millions and millions of binary calculations going on in the background. There are zeros and ones being flipped, added, multiplied and divided at incredible speeds.

The fact that a microprocessor can perform these calculations billions of times a second is exactly why computers have revolutionised our lives.

But information processing doesn't come for free. Physics tells us that every time we perform an operation – for example, adding two numbers together – we must pay an energy cost.

And the cost of doing calculations isn't the only energy cost of running a computer. In fact, anyone who has ever used a laptop balanced on their legs will attest that most of the energy gets converted to heat. This heat comes from the resistance that electricity meets when it flows through a material.

It is this wasted energy due to electrical resistance that researchers are hoping to minimise.
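The article doesn't spell out the physics, but the heat described above is ordinary Joule heating: a current $I$ flowing through a material with resistance $R$ dissipates power as heat at a rate

```latex
P = I^{2} R
```

so a material in which current flows with effectively zero resistance would, in principle, dissipate no heat at all – which is exactly the property the researchers below are chasing.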

Recent advances point to solutions

Running a computer will always consume some energy, but we are a long way (several orders of magnitude) away from computers that are as efficient as the laws of physics allow. Several recent advances give us hope for entirely new solutions to this problem via new materials and new concepts.
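To put a rough number on that "several orders of magnitude" claim: the thermodynamic floor for irreversibly processing information is Landauer's limit, $k_B T \ln 2$ per bit erased (the article doesn't name it, but this is the standard physical bound). The femtojoule-per-operation figure below is an assumed, illustrative value for modern silicon logic, not a figure from the article.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer limit: minimum energy to erase one bit of information
landauer = k_B * T * math.log(2)  # roughly 2.9e-21 J

# Assumed, illustrative energy for one bit operation in modern silicon
# (on the order of a femtojoule) -- not a figure from the article.
modern_per_bit = 1e-15  # J

orders_of_magnitude = math.log10(modern_per_bit / landauer)
print(f"Landauer limit at 300 K: {landauer:.2e} J per bit")
print(f"Gap to the assumed modern figure: ~{orders_of_magnitude:.1f} orders of magnitude")
```

Even with generous assumptions, today's electronics sit five to six orders of magnitude above the physical floor, so there is enormous headroom for more efficient technologies.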

Very thin materials

One recent step forward in physics and materials science is the ability to build and control materials that are only one or a few atoms thick. When a material forms such an atomically thin sheet, and the movement of electrons is confined to that sheet, it is possible for electricity to flow without resistance.

There are a range of different materials that show this property (or might show it). Our research at the ARC Centre for Future Low-Energy Electronics Technologies (FLEET) is focused on studying these materials.

The study of shapes

There is also an exciting conceptual leap that helps us understand this property of electricity flow without resistance.

This idea comes from a branch of mathematics called "topology". Topology tells us how to compare shapes: what makes them the same and what makes them different.

Imagine a coffee cup made from soft clay. You could slowly squish and squeeze this shape until it looks like a donut. The hole in the handle of the cup becomes the hole in the donut, and the rest of the cup gets squished to form part of the donut.

Topology tells us that donuts and coffee cups are equivalent because we can deform one into the other without cutting it, poking holes in it, or joining pieces together.

It turns out that the strange rules that govern how electricity flows in thin layers can be understood in terms of topology. This insight was the focus of the 2016 Nobel Prize in Physics, and it's driving an enormous amount of current research in physics and engineering.

We want to take advantage of these new materials and insights to develop the next generation of low-energy electronics devices, which will be based on topological science to allow electricity to flow with minimal resistance.

This work creates the possibility of a sustainable continuation of the IT revolution – without the huge energy cost.
