New design techniques enable extremely reliable medical devices

Mar 12, 2012
Implantable deep brain stimulators will benefit from the new Desyre approach for extremely reliable chips.

For pacemakers and other implantable medical devices, three factors are key: extreme reliability, small size, and long lifetime. In the EU project Desyre, researchers tackle these issues with a new approach: building a reliable system out of unreliable components.

To counter the increasing fault rates expected in the next technology generations, Desyre develops new design techniques for future Systems-on-Chip that improve reliability while at the same time reducing the power and performance overheads associated with fault tolerance. Ioannis Sourdis, Assistant Professor at Chalmers, is the project leader of DeSyRe (on-Demand System Reliability).

“We focus on the design of future highly reliable Systems-on-Chip that consume far less power than other designs for high-reliability systems,” he says. “This approach allows, by design, devices that combine high reliability with small size and small batteries. It is perfect for safety-critical applications such as implantable medical devices, for example deep brain stimulators that treat Parkinson’s disease.”

Research in reliable systems typically focuses on fail-safe mechanisms that use various redundancy schemes, in which sensitive subsystems are entirely duplicated as a fail-safe. Continuously checking all subsystems for faults costs time and energy, which increases a chip's power consumption and decreases its performance.
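The duplication scheme described above is classic dual modular redundancy (DMR). A minimal sketch (illustrative only, not project code) shows why it is expensive: every computation runs twice, doubling the work just to detect a disagreement.

```python
# Sketch of dual modular redundancy (DMR): run the computation on two
# redundant units and compare the results. Any mismatch signals a fault,
# but the price is roughly double the energy for every operation.
def run_with_dmr(task, data):
    """Execute 'task' on two redundant units and cross-check the outputs."""
    result_a = task(data)  # primary unit
    result_b = task(data)  # duplicated unit (second full execution)
    if result_a != result_b:
        raise RuntimeError("fault detected: redundant results disagree")
    return result_a

# Usage: any pure function can be guarded this way.
print(run_with_dmr(lambda x: x * x, 7))  # prints 49 when both units agree
```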

The Desyre consortium takes a different approach and separates the System-on-Chip (SoC) into two different areas: one that is extremely resistant to faults, and one with fault-prone processing cores. The cores in the fault-prone area are interchangeable, so the task of one core can easily be transferred to any of the other cores in case of a diagnosed malfunction. The fault-free part of the SoC is responsible for monitoring the operation of the fault-prone part by performing sanity checks on the processing cores, and for ensuring that each core correctly handles its assigned sub-task.
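The division of labor described above can be sketched in a few lines. This is a hypothetical illustration, not DeSyRe project code: a fault-free monitor runs sanity checks on a pool of interchangeable, fault-prone cores and reassigns work when a core fails its check.

```python
# Illustrative model of the DeSyRe idea: a trusted monitor schedules tasks
# onto interchangeable cores that may fail, migrating work away from any
# core that fails a sanity check. All names here are assumptions.

class Core:
    def __init__(self, core_id, healthy=True):
        self.core_id = core_id
        self.healthy = healthy  # models a diagnosed hardware fault

    def run(self, task, data):
        return task(data)

def sanity_check(core):
    """Monitor-side test: run a known computation and verify the answer."""
    return core.healthy and core.run(lambda x: x + 1, 41) == 42

def execute_reliably(cores, task, data):
    """Assign the task to the first core that passes its sanity check."""
    for core in cores:
        if sanity_check(core):
            return core.run(task, data)
    raise RuntimeError("no healthy core available")

# Usage: core 0 has a diagnosed fault, so the task migrates to core 1
# and the system as a whole keeps producing correct results.
cores = [Core(0, healthy=False), Core(1), Core(2)]
print(execute_reliably(cores, lambda x: x * 2, 21))  # prints 42
```

The point of the sketch is that only the lightweight sanity checks need trusted, fault-resistant hardware; the bulk of the computation runs on ordinary cores that are allowed to fail.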

“It sounds perhaps counterintuitive to design a highly reliable System-on-Chip on the basis of components that may fail, and yet this is exactly what we propose to do. Since our subsystems consist of small, interchangeable processing cores, we can test and exclude individual cores while the function of the whole system stays intact,” says Gerard Rauwerda, CTO of Recore Systems, one of the industry partners of Desyre. “The beauty of the Desyre approach is that the system continues to do its job reliably, even if one or more cores fail, extending chip longevity.”

The researchers expect this type of fault tolerance to reduce energy consumption by ten to twenty percent compared with other redundancy schemes, while at the same time minimizing the performance penalty.

“People who need implantable medical devices will also benefit from this, as it pays off in longer battery life and postponed device replacement without any compromise on reliability,” Ioannis Sourdis concludes.

Ioannis Sourdis explains the Desyre approach on Monday, March 12, in a tutorial on “Hardware and software design and verification for safety critical electronic systems” during the DATE 2012 conference in Dresden, Germany.


Provided by Chalmers University of Technology
