Code breakthrough delivers safer computing

Sep 25, 2009

Computer researchers at UNSW and NICTA have achieved a software breakthrough that will deliver significant increases in security and reliability, and that has the potential to be a major commercialisation success.

Professor Gernot Heiser, the John Lions Chair in Science in the School of Computer Science and Engineering and a senior principal researcher with NICTA, said for the first time a team had been able to prove with mathematical rigour that an operating-system kernel - the software at the heart of any computer or microprocessor - was 100 per cent bug-free and therefore immune to crashes and failures.

The breakthrough has major implications for improving the reliability of critical systems such as medical machinery, military systems and aircraft, where failure due to an error could have disastrous results.

“A rule of thumb is that reasonably engineered software has about 10 bugs per thousand lines of code; with really high-quality software you can get that down to maybe one or three bugs per thousand lines of code,” Professor Heiser said.

“That can mean there are a lot of bugs in a system. What we’ve shown is that it’s possible to make the lowest level, the most critical, and in a way the most dangerous part of the system provably fault free.”

“I think it’s not an exaggeration to say that this really opens up a completely new world with respect to building new systems that are highly trustworthy, highly secure and safe.”

Verifying the kernel - known as the seL4 microkernel - involved mathematically proving the correctness of about 7,500 lines of code, in a project that took an average of six people more than five years.

“The NICTA team has achieved a landmark result which will be a game changer for security-and-safety-critical software,” Professor Heiser said.

“The verification provides conclusive evidence that bug-free software is possible, and in the future, nothing less should be considered acceptable where critical assets are at stake.”
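To give a flavour of what a machine-checked correctness proof looks like, here is a deliberately tiny sketch in the Lean theorem prover. It is purely illustrative - the seL4 proofs were written in the Isabelle/HOL system, and `clampPriority` is an invented stand-in function, not seL4 code - but it shows the key idea: the theorems below are checked by the proof assistant and hold for every possible input, not just the inputs a test suite happens to exercise.

```lean
-- A made-up "kernel operation": clamp a requested thread priority
-- into the valid range 0..255. (Illustrative only, not seL4 code.)
def clampPriority (p : Nat) : Nat :=
  if p ≤ 255 then p else 255

-- Specification, part 1: the result never exceeds 255, for ANY input.
theorem clamp_le (p : Nat) : clampPriority p ≤ 255 := by
  unfold clampPriority
  split <;> omega

-- Specification, part 2: already-valid priorities pass through unchanged.
theorem clamp_id (p : Nat) (h : p ≤ 255) : clampPriority p = p := by
  unfold clampPriority
  split <;> omega
```

Scaling this style of reasoning from a two-line function to a 7,500-line kernel - with interrupts, memory management and concurrency - is what made the seL4 effort such a large undertaking.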

Provided by University of New South Wales


User comments: 1

Sep 25, 2009
Seriously good news for computer systems development labs. But what about the rest of us?

Considering the efforts of this team, how much effort would be required to verify a complete operating system, and then the productivity software that we use to do our day-to-day work?

It took 30 work-years to verify just this 7500 line module - albeit the most critical module. But that number of lines is laughable in consideration of a single software product, such as an office application. How can consumers be informed of the quality of the software they purchase, if the verification process renders the software obsolete before it can reach the market?

Or is this kernel exclusively for laboratories, and its verification process a once-only event?