New design techniques enable extremely reliable medical devices

Mar 12, 2012
Implantable deep brain stimulators will benefit from the new DeSyRe approach for extremely reliable chips.

For pacemakers and other implantable medical devices, three factors are key: extreme reliability, small size, and a long operating life. In the EU project DeSyRe, researchers tackle these demands with a new approach: building a reliable system out of unreliable components.

To counter the rising fault rates expected in the next technology generations, DeSyRe develops new design techniques for future Systems-on-Chips that improve reliability while reducing the power and performance overheads associated with fault tolerance. Ioannis Sourdis, Assistant Professor at Chalmers, is the project leader of DeSyRe (on-Demand System Reliability).

“We focus on the design of future highly reliable Systems-on-Chips that consume far less power than other designs for high-reliability systems,” he says. “This approach allows us to design devices that combine high reliability with small batteries and state-of-the-art performance. It is perfect for safety-critical applications such as implantable medical devices, for example pacemakers or deep brain stimulators that treat Parkinson’s disease.”

Research in reliable systems typically focuses on fail-safe mechanisms built on redundancy schemes, in which sensitive subsystems are duplicated in their entirety. Continuously checking all of these subsystems for faults costs time and energy, increasing the chip’s power consumption and reducing its performance.
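The overhead of such conventional redundancy can be illustrated with a minimal sketch of dual modular redundancy, in which every computation runs twice and the results are compared. All names here are hypothetical illustrations, not part of DeSyRe:

```python
# Hypothetical sketch of conventional dual modular redundancy (DMR):
# each computation runs on two duplicated subsystems and the results
# are compared, so energy and time roughly double.

def run_with_dmr(task, data):
    """Execute `task` twice and compare the outputs for faults."""
    result_a = task(data)  # primary subsystem
    result_b = task(data)  # duplicated subsystem
    if result_a != result_b:
        raise RuntimeError("fault detected: subsystem outputs disagree")
    return result_a

# Example: a simple computation executed redundantly
print(run_with_dmr(lambda x: x * 2, 21))  # 42
```

The point of the sketch is the cost model: the duplicated run and the comparison happen on every operation, whether or not a fault ever occurs.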

The DeSyRe consortium takes a different approach and separates the System-on-Chip (SoC) into two areas: one that is extremely resistant to faults, and one that contains fault-prone processing cores. The cores in the fault-prone area are interchangeable, so the task of one core can easily be transferred to any of the others if a malfunction is diagnosed. The fault-free part of the chip monitors the operation of the fault-prone part by performing sanity checks on the processing cores, and ensures that each core correctly handles its assigned sub-task.
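The division of labor described above can be sketched as follows. This is an illustrative toy model, not DeSyRe’s actual design: a monitor standing in for the fault-free region sanity-checks a pool of interchangeable cores and moves a sub-task away from any core diagnosed as faulty. All class and method names are hypothetical:

```python
# Illustrative sketch (assumed names, not from DeSyRe): a fault-free
# monitor oversees interchangeable, fault-prone cores and reassigns
# work when a core is diagnosed as malfunctioning.

class Core:
    def __init__(self, core_id):
        self.core_id = core_id
        self.healthy = True  # flips to False on a diagnosed malfunction

    def run(self, subtask):
        return subtask()


class Monitor:
    """Stands in for the fault-free region that checks the cores."""

    def __init__(self, cores):
        self.cores = cores

    def sanity_check(self, core):
        # A real design would run diagnostic tests; here we just
        # consult the recorded health status.
        return core.healthy

    def execute(self, subtask):
        # Assign the sub-task to any core that passes its sanity check;
        # a failed core is simply skipped and the work migrates.
        for core in self.cores:
            if self.sanity_check(core):
                return core.run(subtask)
        raise RuntimeError("no healthy cores remain")


cores = [Core(i) for i in range(4)]
monitor = Monitor(cores)
cores[0].healthy = False              # core 0 is diagnosed as faulty
print(monitor.execute(lambda: "ok"))  # the work migrates to core 1
```

In contrast to full duplication, only the lightweight monitoring logic must be fault-free; the bulk of the silicon is allowed to fail, which is where the power savings come from.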

“It sounds perhaps counterintuitive to design a highly reliable System-on-Chip on the basis of components that may fail, and yet this is exactly what we propose to do. Since our subsystems consist of small, interchangeable processing cores, we can test and exclude individual cores while the function of the whole system stays intact,” says Gerard Rauwerda, CTO of Recore Systems, one of the industry partners in DeSyRe. “The beauty of the DeSyRe approach is that the system continues to do its job reliably, even if one or more cores fail, extending chip longevity.”

The researchers expect this type of fault tolerance to reduce energy consumption by at least ten to twenty percent compared with other redundancy schemes, while minimizing the performance penalty.

“People who need implantable devices will also benefit from this, as it pays off in longer battery life and postponed device replacement, without any compromise in reliability,” Ioannis Sourdis concludes.

Ioannis Sourdis explains the DeSyRe approach on Monday, March 12, in a tutorial on “Hardware and software design and verification for safety critical electronic systems” during the DATE 2012 conference in Dresden, Germany.


Provided by Chalmers University of Technology
