Study: Error prevention, rather than correction, best for future of nanoelectronic devices

May 26, 2011 By Wileen Wong Kromhout

The move toward smarter, lighter and more powerful electronics, computers and smartphones depends on whether transistor circuits, the building blocks of such devices, can process large amounts of information. As these circuits get faster and smaller, the number of errors they generate in processing information -- errors arising from heat dissipation, noise and structural disorder -- increases, which can impede further development.

Experts have debated which of two error-suppressing processes is more efficient and effective as devices are reduced to the nanoscale: (1) physical fault tolerance, in which the device is scaled down in size (and in electron number) only to the point at which it can still prevent the generation of logical errors, or (2) architectural fault tolerance, in which the device is scaled down continuously and robust algorithms are used to correct the errors it generates.

In a new study, Vwani Roychowdhury, professor of electrical engineering at the UCLA Henry Samueli School of Engineering and Applied Science and a member of the California NanoSystems Institute at UCLA, and Thomas Szkopek, professor of electrical and computer engineering at McGill University, and colleagues quantified these error-suppressing processes for model nanoelectronic systems for the first time and estimated the minimum number of electrons necessary for reliable circuit logic. They found that physical fault tolerance in transistor circuits suppresses the error rate exponentially in electron number, while even the most efficient architectural fault tolerance suppresses the error rate only subexponentially. They conclude that error prevention through physical fault tolerance outperforms error correction through architectural fault tolerance.
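The practical difference between the two scaling laws can be made concrete with a toy model. The sketch below is purely illustrative: the functional forms (a pure exponential versus a stretched exponential) follow the qualitative claim in the study, but the constants `a`, `b` and the subexponential power `beta` are arbitrary assumptions, not values from the paper.

```python
import math

# Illustrative model ONLY. The paper reports exponential suppression of the
# error rate in electron number for physical fault tolerance and subexponential
# suppression for architectural fault tolerance; the constants below are
# assumed for demonstration.
def physical_error_rate(n_electrons, a=0.1):
    """Exponential suppression: error rate ~ exp(-a * N)."""
    return math.exp(-a * n_electrons)

def architectural_error_rate(n_electrons, b=0.1, beta=0.5):
    """Subexponential suppression: error rate ~ exp(-b * N**beta), 0 < beta < 1."""
    return math.exp(-b * n_electrons ** beta)

def min_electrons(rate_fn, target):
    """Smallest electron count at which the model error rate falls below target."""
    n = 1
    while rate_fn(n) > target:
        n += 1
    return n

target = 1e-9  # assumed reliability target for the comparison
n_phys = min_electrons(physical_error_rate, target)
n_arch = min_electrons(architectural_error_rate, target)
print(n_phys, n_arch)  # the physical scheme reaches the target with far fewer electrons
```

Under these assumed constants, the architectural scheme needs orders of magnitude more electrons to reach the same error target, which is the qualitative point of the comparison: the gap widens as the reliability requirement tightens.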

The study contributes a fundamental insight into the reliability and scaling of nanoscale transistor device technologies and may impose a minimum limit on device size. The findings are of immediate relevance both to researchers working on transistor scaling and to scientists developing new device concepts.


More information: The research was recently published in the peer-reviewed Physical Review Letters and is available online at: prl.aps.org/abstract/PRL/v106/i17/e176801

Abstract
The error rate in complementary transistor circuits is suppressed exponentially in electron number, arising from an intrinsic physical implementation of fault-tolerant error correction. Contrariwise, explicit assembly of gates into the most efficient known fault-tolerant architecture is characterized by a subexponential suppression of error rate with electron number, and incurs significant overhead in wiring and complexity. We conclude that it is more efficient to prevent logical errors with physical fault tolerance than to correct logical errors with fault-tolerant architecture.
