'We should stop designing perfect circuits'

Oct 02, 2013

Are integrated circuits "too good" for current technological applications? Christian Enz, the new Director of the Institute of Microengineering, backs the idea that perfection is overrated.

Christian Enz, the head of the Integrated Circuits Laboratory (ICLAB), explains why we should build our future devices with unreliable circuits and adopt the "good enough engineering" trend. Circuits that are not fully reliable can substantially reduce energy consumption. Even better, they will allow scientists to stay in the miniaturization race, which has been compromised of late. The size of the transistors that make up circuits cannot be reduced indefinitely: as they get smaller and smaller, they produce more and more errors. Extra hardware must therefore be added and additional design margins taken, which cancels out the benefits of miniaturization and increases energy consumption. Imperfect circuits require less silicon area, and are therefore less energy-hungry and less expensive. But industry remains to be convinced where giving up perfection is concerned.

How is it that sloppy chips don't adversely affect the performance of the device they're in?

Circuits are generally resilient to a certain statistically small proportion of errors, with only a negligible impact on the final output. Of course, this isn't true for all applications, but you can take a "good enough" approach for "perceptual" uses like audio and video playback. For instance, the screen on a smartphone: here, any impact on image quality will be too small to be perceived. Human sight is an extremely robust system, one that automatically corrects any small errors.
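The claim that rare, small errors are imperceptible on a screen can be quantified with peak signal-to-noise ratio (PSNR). The sketch below is my own illustration, not from the interview: it flips the least significant bit of about 1% of the pixel values in a synthetic image and measures the damage. The commonly cited rule of thumb is that distortion above roughly 40 dB PSNR is visually transparent.

```python
import math
import random

def psnr(ref, noisy, peak=255):
    """Peak signal-to-noise ratio in dB between two pixel sequences."""
    mse = sum((r - n) ** 2 for r, n in zip(ref, noisy)) / len(ref)
    return float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)

random.seed(1)
pixels = [random.randrange(256) for _ in range(10000)]  # synthetic 8-bit image

# "Sloppy" hardware: flip the least significant bit of ~1% of the pixels.
noisy = [p ^ 1 if random.random() < 0.01 else p for p in pixels]

print(round(psnr(pixels, noisy), 1))  # far above the ~40 dB visibility threshold
```

With errors this small and this rare, the PSNR lands well above 60 dB, which is why such circuits can be "good enough" for perceptual applications.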

How do you build "inexact" circuits?

The crucial step is determining where you have room for error. We start by looking for spots on the circuit that are underutilized. For example, if you have a circuit dedicated to adding numbers and the numbers being added don't carry many decimal places, we can try to get rid of the part of the circuit that handles the decimal places and see what happens. This sort of "inexact" approach will of course lead to lower numbers on quality metrics like signal-to-noise ratio or image quality, but the result will still be "good enough." This technique is known as "inexact arithmetic," while more generally, the approach is known as "good-enough engineering."
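To make the idea concrete, here is a minimal sketch of an inexact adder, my own illustration rather than the ICLAB design. Hardware works in binary, so the sketch drops low-order bits rather than decimal digits: in the style of a lower-part-OR adder, the carry chain for the k least significant bits is replaced by a cheap bitwise OR, so that logic can be removed from the silicon at the cost of a small, bounded error.

```python
import random

def exact_add(a, b):
    return a + b

def inexact_add(a, b, k=4):
    """Approximate adder: the k low-order bits are combined with a
    bitwise OR instead of a full carry chain, so the low-order carry
    logic can be removed from the hardware."""
    mask = (1 << k) - 1
    high = ((a >> k) + (b >> k)) << k   # exact addition of the high parts
    low = (a & mask) | (b & mask)       # cheap, carry-free approximation
    return high | low

# Since x + y == (x | y) + (x & y), the error is exactly a_low & b_low,
# so it always lies in [0, 2**k - 1] and the result never overshoots.
random.seed(0)
worst = max(exact_add(a, b) - inexact_add(a, b)
            for a, b in ((random.randrange(1 << 16),
                          random.randrange(1 << 16)) for _ in range(10000)))
print(worst)  # at most 15 for k=4
```

The point of the sketch is the trade: the low-order carry chain disappears from the circuit, and in exchange every result may be low by at most 2^k - 1, which is exactly the kind of bounded quality loss a "good enough" application can absorb.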

What are the main advantages of these kinds of circuits?

Initially, we focused on the possibility of reducing energy consumption. Imperfect circuits do make a certain number of errors, but they nevertheless ultimately deliver almost the same performance as "perfect" circuits. We therefore simply tried to replace fault-free circuits with "good-enough" circuits that meet the minimum requirements of a given application. This reduced both energy consumption and size; the latter, by decreasing the amount of silicon needed, also drove costs down.

We then realized that the robustness of imperfect circuits could actually help us deal with some of the problems inherent in modern technology, and get past the limits we're currently seeing in miniaturization.

Why has the miniaturization of perfect circuits become so difficult?

For the last four decades, the semiconductor industry has doubled the number of transistors that fit onto a given silicon chip every two years, in line with Gordon Moore's 1965 prediction. Miniaturization has driven the development of computers, tablets and smartphones that are at once powerful, energy efficient and increasingly compact. Today's transistors measure around 20 nanometers (i.e. 20 millionths of a millimeter). The circuits have become so dense that 100% error-free operation is simply no longer possible, given the growing impact of manufacturing variability. This means you have to add extra circuitry to correct the errors and widen the design margins, but that of course cancels out the space gains you get from miniaturization, and the energy savings with them. In fact, you can actually end up using more energy this way. In a word, we are beginning to hit a wall on miniaturization.

Is it difficult to promote "good enough" engineering in Swiss society?

Yes, because the average product designer hates the idea of removing parts of a circuit and intentionally generating errors. The standard approach today is to make sure the manufactured circuits correspond exactly to the design specs. Our approach is thus completely new. It's a paradigm change, one that is difficult to push through in a society that values perfect technology. That said, the "good enough" approach has been gaining traction in the corporate sector, because chip-makers can't see any real alternative. Intel, for example, is interested in "good enough" engineering, and there are teams of researchers working on it all over the world.

Christian Enz's team is working on "inexact" circuits as part of an SNSF project called "IneSoC." The group is building on work done at CSEM (the Swiss Center for Electronics and Microtechnology) as part of an international project with Rice University in Houston, Texas, and Nanyang Technological University in Singapore.

EPFL's Andreas Burg also works on defective chips, but takes a somewhat different approach. Rather than deliberately introducing errors into circuits, Prof. Burg studies ways of "making do" with errors that naturally occur during wireless communication.




User comments : 2


5 / 5 (1) Oct 02, 2013
The person who wrote this article clearly has no understanding of technology as evidenced by the ridiculous comment
"if you have a circuit dedicated to adding numbers and there aren't too many decimals in the numbers being added, we can try to get rid of the part of the circuit that handles decimal places and see what happens."
"Decimal places" don't actually exist until a number is translated into a human-readable representation; computers use floating-point representation, so decimal places do not literally exist in the hardware processing.
5 / 5 (1) Oct 02, 2013
Reading this gave me flashes of "Harrison Bergeron", a short story by Kurt Vonnegut, where anything of quality is dumbed down to some comfortable expectation of unimpressive averageness. The idea of being told what is "good enough" for you is demeaning and dismissive of those who have a developed taste and recognize quality. People who know better now cringe at the deficiencies of lossy compression in video and audio. I have trouble listening to MP3s, having spent a lot of time in the 80's designing music sound effects and monitoring the results on an oscilloscope.

I certainly don't think this would be acceptable to NASA, or any realm of physics research either.

Have we finally become exceptional at designing average?
