Machine-learning revolutionises software development

Mar 10, 2010

(PhysOrg.com) -- Automation technology has revolutionised the fine-tuning needed to maximise software performance on devices such as mobile phones.

Application developers for software on mobile phones and other embedded devices can achieve acceptable performance levels ten times faster thanks to a breakthrough by European researchers.

Human-readable software code needs to be translated into binary code by a compiler if it is to run on hardware. When hardware is upgraded, the software’s compiler usually needs to be tweaked or ‘tuned’ to optimise its performance. If compilers are not optimised for the hardware, doubling the processor size or increasing processor speed can actually result in a loss of software performance, not an improvement.

But hardware is changing so quickly that compiler developers can’t keep up, and compiler optimisation has become a bottleneck in the development process.
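One way to see why hand-tuning cannot keep up is to count the choices involved: a compiler exposes many independent on/off optimisations, and the number of combinations grows exponentially. The sketch below is illustrative only; the flag names are real GCC-style options, but the five-flag set is a placeholder, not Milepost's actual search space.

```python
from itertools import product

# A small, illustrative subset of compiler optimisation flags.
# Real compilers expose hundreds of such options.
FLAGS = ["-funroll-loops", "-finline-functions", "-ftree-vectorize",
         "-fomit-frame-pointer", "-fschedule-insns"]

def flag_combinations(flags):
    """Yield every on/off combination of the given flags."""
    for bits in product([False, True], repeat=len(flags)):
        yield [f for f, on in zip(flags, bits) if on]

combos = list(flag_combinations(FLAGS))
print(len(combos))  # 2**5 = 32 combinations for just five flags
```

With the hundreds of options in a production compiler, exhaustively compiling and timing every combination for every new processor is infeasible, which is exactly what motivates learning to predict good settings instead.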

Using machine-learning technology, researchers on the Milepost project have developed an automatic way to optimise compilers for re-configurable embedded processors. Whether it is mobile phones, laptop computers or entire systems, the technology automatically learns how to get the best performance from the hardware and the software will run faster and use less energy.

Industry revolution

“All the compiler teams at the big companies are rethinking the way they do things as a result of this,” says Professor Michael O’Boyle, from the University of Edinburgh, and project coordinator for Milepost.

“Automation provides compiler developers with leverage to be more experimental. They can try new ideas, new analyses and new optimisations. The machine-learning technology analyses whether it works and when it works. It opens up a whole new area of research and a whole area of performance gains that we couldn’t try before. For instance, we were able to deliver a portable compiler that can work across any future architecture configuration.”

The Milepost GCC technology learns to predict the best compiler settings for any new program by analysing the execution time and code size produced by various compiler options on a set of training programs.

The key technical challenge for the Milepost team was to describe programs and hardware in ways that machine-learning technology could use. That also meant completely redesigning compilers to enable them to use the new machine-learning technology.
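The idea of describing programs in a form machine learning can use might be sketched as follows: each training program is summarised as a vector of static features, paired with the flag set that timing runs found fastest for it, and a new program inherits the flags of its most similar neighbour. This is a minimal one-nearest-neighbour sketch; the feature counts and flag sets are invented placeholders, not Milepost's actual feature vectors or models.

```python
import math

# Toy training data: (feature vector, best flag set found by timing runs).
# Features here are illustrative static counts:
# (instruction count, branch count, loop count).
TRAINING = [
    ((1200, 300, 45), "-O2 -funroll-loops"),
    ((8000, 150, 5),  "-O3 -finline-functions"),
    ((500,  400, 60), "-Os"),
]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict_flags(features, training=TRAINING):
    """Return the flag set of the nearest training program (1-NN)."""
    _, flags = min(training, key=lambda item: distance(features, item[0]))
    return flags

print(predict_flags((1100, 280, 50)))  # nearest to the first training program
```

In practice the features would be normalised so that no single count dominates the distance, and a richer model than 1-NN would be trained, but the principle is the same: similar programs tend to benefit from similar optimisations.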

Better software performance can open up new opportunities for product suppliers, explains O’Boyle. “If you can run things faster and more energy efficiently, you may be able to choose a different piece of hardware than before - perhaps a cheaper option for the same performance. Alternatively, you could add more functionality without increasing energy usage. You get more for your money.”

French company CAPS Enterprise SAS, one of the participants in the EU-funded Milepost project, planned to include Milepost technologies in its new set of tools. Other participants, including IBM, are using Milepost GCC to get better performance from their processors, making their products more attractive to customers.

Open source auto-tuner

The Milepost team has launched a code tuning website for the compiler development community. Developers can upload their code to the site and automatically get feedback on how to tune it so it runs faster.

“This is one of the most successful projects I have been involved in,” says Michael O’Boyle. He and his fellow researchers are now seeking to apply the lessons of Milepost to help solve the challenges of next-generation computer technologies.

“We can use machine-learning technologies to look at multi-core and heterogeneous platforms and we will be looking at dynamic online adaptation,” he says.

But as workloads change, can we reconfigure hardware and software to make them adaptable to the fine-grained and large-scale challenges we will face when we move from 2, 4 or 8 cores to thousands of cores on a chip?

This is the big question facing developers of the future. And the smart money will be on the Milepost researchers to answer it.


More information: Milepost project: www.milepost.eu/


User comments: 5


maxcypher
3 / 5 (2) Mar 10, 2010
How many generations of computers designing computers will it take to create computers that work in ways completely beyond human comprehension? Figure that out and we'll know our deadline for improving our own intelligence in order to stay on the evolutionary cutting edge.
gennoveus
1 / 5 (1) Mar 10, 2010
Excellent comment maxcypher.

Hopefully neural/cybernetic implants, BCIs, etc. will start to mature at around about the same time as the singularity. People seem to be scared of the idea of "intelligent robots vs humans," but by the time we have these A.I.s the line between what is a human and what is a machine will be blurred. (as will the line between physical and virtual, when a memory/experience can be artificially generated)

... probably. Maybe.
Arikin
not rated yet Mar 10, 2010
maxcypher. This is just a real-time optimization tool. It doesn't design the code in the first place...

Suggesting that computers design computers from scratch is sort of like throwing a random sample of cell cultures into a lake and expecting them to evolve into a new species...
trekgeek1
not rated yet Mar 10, 2010
"Suggesting that computers design computers from scratch is sort of like throwing a random sample of cell cultures into a lake and expecting them to evolve into a new species..."

Isn't that basically how Nylonase was created?
Arikin
not rated yet Mar 12, 2010
Yes the problem with my analogy is that the cells are already alive. And yes they could change their digestive enzymes to digest nylon...

But the computer code still isn't alive.
