Code breakthrough delivers safer computing

Sep 25, 2009

(PhysOrg.com) -- Computer researchers at UNSW and NICTA have achieved a software breakthrough that will deliver significant increases in security and reliability, and has the potential to be a major commercialisation success.

Professor Gernot Heiser, the John Lions Chair in Computer Science in the School of Computer Science and Engineering and a senior principal researcher with NICTA, said for the first time a team had been able to prove with mathematical rigour that an operating-system kernel - the core software at the heart of any computer or microprocessor - was 100 per cent bug-free and therefore immune to crashes and failures.

The breakthrough has major implications for improving the reliability of critical systems such as medical machinery, military systems and aircraft, where failure due to an error could have disastrous results.

“A rule of thumb is that reasonably engineered software has about 10 bugs per thousand lines of code, with really high quality software you can get that down to maybe one or three bugs per thousand lines of code,” Professor Heiser said.

“That can mean there are a lot of bugs in a system. What we’ve shown is that it’s possible to make the lowest level, the most critical, and in a way the most dangerous part of the system provably fault free.”

“I think it’s not an exaggeration to say that this really opens up a completely new world with respect to building systems that are highly trustworthy, highly secure and safe.”
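The bug-density rule of thumb quoted above can be turned into a quick back-of-the-envelope calculation. The figures below (a one-million-line code base, the 7,500-line kernel) are illustrative assumptions, not numbers from the article beyond the kernel size; real defect densities vary widely.

```python
# Back-of-the-envelope bug estimates from the rule of thumb quoted above:
# ~10 bugs per 1,000 lines for reasonably engineered software,
# ~1-3 bugs per 1,000 lines for very high quality software.

def estimated_bugs(lines_of_code: int, bugs_per_kloc: float) -> float:
    """Expected bug count for a code base at a given defect density."""
    return lines_of_code / 1000 * bugs_per_kloc

# An OS-scale code base of one million lines (illustrative figure):
typical = estimated_bugs(1_000_000, 10)  # "reasonably engineered"
high_q = estimated_bugs(1_000_000, 1)    # very high quality

print(typical)  # 10000.0 expected bugs
print(high_q)   # 1000.0 expected bugs

# Even at high quality, the 7,500-line kernel alone could be expected
# to contain a handful of bugs - unless, as here, it is proved bug-free.
print(estimated_bugs(7_500, 1))  # 7.5 expected bugs
```

Even under the most optimistic density, the arithmetic suggests dozens of latent bugs in any non-trivial system, which is why a proof of zero bugs in the most critical layer matters.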

Verifying the kernel - known as the seL4 microkernel - involved mathematically proving the correctness of about 7,500 lines of code, in a project that took an average of six people more than five years.
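The seL4 proof itself was carried out in the Isabelle/HOL theorem prover. To illustrate the general idea of a machine-checked correctness proof - not seL4's actual proof, and at a vastly smaller scale - the following toy Lean 4 sketch defines a tiny function and proves theorems about it that the proof assistant checks mechanically; the names here (`myMax` and the theorems) are invented for illustration:

```lean
-- Toy illustration of machine-checked correctness (not part of seL4's proof).
def myMax (a b : Nat) : Nat := if a ≤ b then b else a

-- The checker accepts these theorems only if the proofs are airtight:
-- myMax really is an upper bound on both of its arguments.
theorem myMax_ge_left (a b : Nat) : a ≤ myMax a b := by
  unfold myMax; split <;> omega

theorem myMax_ge_right (a b : Nat) : b ≤ myMax a b := by
  unfold myMax; split <;> omega
```

The seL4 result applies this kind of reasoning, at far greater effort, to every behaviour of a real operating-system kernel written in C.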

“The NICTA team has achieved a landmark result which will be a game changer for security-and-safety-critical software,” Professor Heiser said.

“The verification provides conclusive evidence that bug-free software is possible, and in the future, nothing less should be considered acceptable where critical assets are at stake.”

Provided by University of New South Wales




User comments : 1


RayCherry - Sep 25, 2009
Seriously good news for computer systems development labs. But what about the rest of us?

Considering the efforts of this team, how much effort would be required to verify a complete operating system, and then the productivity software that we use to do our day-to-day work?

It took 30 work-years to verify just this 7500 line module - albeit the most critical module. But that number of lines is laughable in consideration of a single software product, such as an office application. How can consumers be informed of the quality of the software they purchase, if the verification process renders the software obsolete before it can reach the market?

Or is this kernel exclusively for laboratories, and its verification process a once-only event?