New tool enables powerful data analysis

Jan 08, 2009
After using the algorithm to determine the filament structure of an aerogel -- a lightweight foam used to shield electronic equipment in satellites -- the researchers were able to compute changes to its structural integrity caused by the simulated impact of a micrometeorite traveling at 10,000 miles per hour (red sphere at left). Image: Attila Gyulassy/UC Davis. Copyright UC Regents

(PhysOrg.com) -- A powerful computing tool that allows scientists to extract features and patterns from enormously large and complex sets of raw data has been developed by scientists at the University of California, Davis, and Lawrence Livermore National Laboratory. The tool -- a set of problem-solving calculations known as an algorithm -- is compact enough to run on computers with as little as two gigabytes of memory.

The team that developed this algorithm has already used it to probe a slew of phenomena represented by billions of data points, including analyzing and creating images of flame surfaces; searching for clusters and voids in a virtual universe experiment; and identifying and tracking pockets of fluid in a simulated mixing of two fluids.

"What we've developed is a workable system of handling any data in any dimension," said Attila Gyulassy, who led the five-year development effort while pursuing a PhD in computer science at UC Davis. "We expect this algorithm will become an integral part of a scientist's toolbox to answer questions about data."

A paper describing the new algorithm was published in the November-December issue of IEEE Transactions on Visualization and Computer Graphics.

This image of an early moment in the simulated mixing of two fluids was created by researchers using a powerful new algorithm they developed to extract features and patterns from massive data sets. In the image, blue and red spheres and the lines between them represent the branching of pockets of fluid. Image: Attila Gyulassy/UC Davis. Copyright UC Regents

Computers are widely used to perform simulations of real-world phenomena and to capture results of physical experiments and observations, storing this information as collections of numbers. But as the size of these data sets has burgeoned, hand-in-hand with computer capacity, analysis has grown increasingly difficult.

A mathematical tool to extract and visualize useful features from data sets has existed for nearly 40 years -- in theory. Called the Morse-Smale complex, it partitions a data set into regions with similar features and encodes them in mathematical terms. But working with the Morse-Smale complex is not easy. "It's a powerful language. But a cost of that is that using it meaningfully for practical applications is very difficult," Gyulassy said.
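To make the idea concrete, the sketch below (a hypothetical Python illustration, not code from the paper) approximates a Morse-Smale partition on a small two-dimensional grid: each grid point is labeled by the local minimum its steepest-descent path ends at and the local maximum its steepest-ascent path ends at, and points sharing both labels form one cell of the complex. All function names and the test field are invented for this example.

    import numpy as np

    # Illustrative sketch only -- a discrete approximation of the
    # Morse-Smale partition, not the implementation from the paper.
    # Each grid point is assigned to the (minimum, maximum) pair its
    # gradient flow connects; points sharing both extrema form one cell.

    def neighbors(i, j, shape):
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < shape[0] and 0 <= nj < shape[1]:
                yield ni, nj

    def flow_to_extremum(field, i, j, ascending):
        # Follow the steepest ascending/descending neighbor until no
        # neighbor improves the value, i.e., a local extremum is reached.
        while True:
            best = (i, j)
            for ni, nj in neighbors(i, j, field.shape):
                if (field[ni, nj] > field[best]) if ascending else (field[ni, nj] < field[best]):
                    best = (ni, nj)
            if best == (i, j):
                return best
            i, j = best

    rng = np.random.default_rng(1)
    field = rng.standard_normal((32, 32))
    field = (field + np.roll(field, 1, 0) + np.roll(field, 1, 1)) / 3  # smooth slightly

    cells = {}
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            key = (flow_to_extremum(field, i, j, False),
                   flow_to_extremum(field, i, j, True))
            cells.setdefault(key, []).append((i, j))
    print(len(cells), "Morse-Smale cells in a 32x32 test field")

A full Morse-Smale implementation tracks the complete combinatorial structure, including saddle points and the arcs connecting critical points; this sketch conveys only the partitioning idea.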

Gyulassy's algorithm divides a data set into parcels of cells, then analyzes each parcel separately using the Morse-Smale complex. The results of those computations are then merged. As new parcels are created from merged parcels, they are analyzed and merged again. At each step, data that no longer need to be held in memory are discarded, drastically reducing the memory required to run the calculations.
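The paper's actual data structures are more involved, but the divide-analyze-merge pattern itself can be sketched in a few lines. In this toy Python example (an assumed simplification, not the authors' code), the per-parcel "analysis" just finds the critical points of a one-dimensional signal; only the current parcel and the merged list of results need to stay live between steps.

    import numpy as np

    # Toy sketch of the divide-analyze-merge idea -- NOT the paper's
    # Morse-Smale algorithm. The signal is processed in small overlapping
    # parcels; each parcel's local extrema (the points a Morse-Smale
    # complex is built from) are merged into a running result, and the
    # parcel itself is then discarded.

    def local_extrema(values, offset):
        # Return global indices of strict interior extrema of this parcel.
        found = []
        for i in range(1, len(values) - 1):
            if (values[i] > values[i - 1] and values[i] > values[i + 1]) or \
               (values[i] < values[i - 1] and values[i] < values[i + 1]):
                found.append(offset + i)
        return found

    def analyze_in_parcels(signal, parcel_size):
        merged = set()
        for start in range(0, len(signal) - 1, parcel_size):
            lo = max(start - 1, 0)                      # overlap one sample
            hi = min(start + parcel_size + 1, len(signal))
            parcel = signal[lo:hi]  # out-of-core, this would be read from disk
            merged.update(local_extrema(parcel, lo))
            # the parcel can be dropped here; only `merged` persists
        return sorted(merged)

    rng = np.random.default_rng(0)
    signal = np.cumsum(rng.standard_normal(100_000))    # stand-in for real data
    print(len(analyze_in_parcels(signal, 512)), "critical points found")

The property mirrored here is that working memory scales with the parcel size rather than the total data size, which is the kind of streamlining that let the real algorithm handle a billion-point data set in two gigabytes of RAM.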

One of Gyulassy's tests of the algorithm was to use it to analyze and track the formation and movement of pockets of fluid in the simulated mixing of two fluids: one dense, one light. The complexity of this data set is so vast -- it consists of more than one billion data points on a three-dimensional grid -- that it challenges even supercomputers, Gyulassy said. Yet the new algorithm, with its streamlining features, was able to perform the analysis on a laptop computer with just two gigabytes of memory. Although Gyulassy had to wait nearly 24 hours for the little machine to complete its calculations, at the end of the process he could pull up images in mere seconds to illustrate phenomena he was interested in, such as the branching of fluid pockets in the mixture.

Two main factors are driving the need for analysis of large data sets, said co-author Bernd Hamann: a surge in the use of powerful computers that can produce huge amounts of data, and an upswing in affordability and availability of sensing devices that researchers deploy in the field and lab to collect a profusion of data.

"Our data files are becoming larger and larger, while the scientist has less and less time to understand them," said Hamann, a professor of computer science and associate vice chancellor for research at UC Davis. "But what are the data good for if we don't have the means of applying mathematically sound and computationally efficient computer analysis tools to look for what is captured in them?"

Gyulassy is currently developing software that will allow others to put the algorithm to use. He expects the learning curve to be steep for this open-source product, "but if you just learn the minimal amount about what a Morse-Smale complex is," he said, "it will be pretty intuitive."

Source: University of California - Davis
