Powerful new metric quickly reveals network structure at multiple scales

August 19, 2016, Santa Fe Institute
The authors tested their new Onion Decomposition on, among other networks, the stanford.edu World Wide Web network domain. In the first layer (left), they discovered an unexpected subnetwork structure — some 8,500 nodes with two connections (~3% of the entire network) joined in chain-like fashion, a deviation from the expected centralized tree-like structure; such surprising structures seemed to occur on all scales within the Stanford network, they note. In the sixth layer they found a more typical, highly centralized community structure suggesting many groups governed by only a few nodes. Credit: Hébert-Dufresne, Allard, and Grochow

What does a network look like? It typically depends on what scale you're analyzing.

Researchers often want to know what hidden structures lie within data representing real-world networks, from power grids to the internet. To do this, they employ a variety of methods and metrics, but these methods are limited. Approaches that identify microscopic features miss the big structural picture. Methods that reveal macroscopic organization don't reliably show how the network is constructed, and they also tend to be computationally intensive.

"We don't have a good toolbox to get a quick understanding of the network structure," says Laurent Hébert-Dufresne, a James S. McDonnell Fellow at the Santa Fe Institute. The recent explosion in data available to researchers, both in the variety of sources and in the sheer size of the datasets, emphasizes the need for improved methods to analyze and synthesize structural information about .

Hébert-Dufresne's frustration with the current state of network analysis inspired him, together with Antoine Allard of the University of Barcelona and Joshua Grochow, an SFI Omidyar Fellow, to devise a new tool. The trio recently developed a new metric that hopscotches over the limitations of other methods, revealing network structure at the microscopic, mesoscopic, and macroscopic levels in one clean sweep. They introduced their approach, which they dubbed the "Onion Decomposition" as a metaphor for peeling back the layers of an onion, this week in Scientific Reports.

Complex networks are usually depicted as nodes, or dots, connected by edges, or lines. The researchers' new tool analyzes a network by taking it apart, peeling away "layers" of nodes that have the same number of connections, starting with the nodes that have the fewest connections. That's not new, strictly speaking; the same approach is used by another powerful algorithm called "k-core decomposition."

However, as it peels away the layers, k-core decomposition loses valuable information about those layers, says Hébert-Dufresne. His group's goal was to build an algorithm that capitalizes on the benefits of k-core decomposition but also uses layer-level information to provide insights about the network at multiple scales.
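The peeling procedure described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the authors' reference implementation: alongside each node's k-core value (which k-core decomposition already produces), it records the layer in which the node was peeled, which is the per-layer information the onion decomposition retains. The adjacency-dict representation and the function name are my own choices.

```python
def onion_decomposition(adj):
    """Return (layer, core) dicts for an undirected graph given as
    {node: set_of_neighbors}. Each pass of the inner loop peels one
    "layer": every node whose remaining degree is at most the
    current core value."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    layer_of, core_of = {}, {}
    layer = 0
    while adj:
        # the current core value is the smallest remaining degree
        core = min(len(nbrs) for nbrs in adj.values())
        while adj and min(len(nbrs) for nbrs in adj.values()) <= core:
            layer += 1
            peel = [v for v, nbrs in adj.items() if len(nbrs) <= core]
            for v in peel:
                layer_of[v], core_of[v] = layer, core
                for u in adj[v]:
                    if u in adj:  # neighbor not yet removed
                        adj[u].discard(v)
                del adj[v]
    return layer_of, core_of

# A 4-node path 1-2-3-4: the endpoints form layer 1, the middle
# pair layer 2, and every node sits in the 1-core.
layers, cores = onion_decomposition({1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}})
```

A long chain of degree-2 nodes peels away only two nodes per pass, so it produces many consecutive layers that all share the same core value, which is how chain-like structures such as those the authors found in the stanford.edu web graph stand out.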

In their paper, the researchers report successful test drives of their method on a handful of real-world datasets, including the Northwestern U.S. power grid and the road system of Pennsylvania. In both cases, the onion delivered accurate snapshots of the network structures at different scales, and the authors were able to draw some interesting conclusions. Removing just a few nodes from the power grid network, for example, would quickly collapse its overall connectivity; the road network proved more robust, remaining more or less well connected despite the removal of a few nodes.
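That robustness contrast can be reproduced in miniature with a quick connectivity check. The sketch below is my own illustration, not code from the paper: it measures what fraction of a network's nodes remain in the largest connected component after a chosen set of nodes is deleted. A hub-dependent star fragments completely when its hub is removed, while a redundant ring routes around any single loss.

```python
def giant_component_fraction(adj, removed=()):
    """Fraction of the original nodes left in the largest connected
    component after deleting the nodes in `removed`.
    `adj` is {node: set_of_neighbors}, undirected."""
    if not adj:
        return 0.0
    removed = set(removed)
    seen, best = set(), 0
    for start in adj:
        if start in removed or start in seen:
            continue
        # depth-first search over the surviving nodes
        stack, size = [start], 0
        seen.add(start)
        while stack:
            v = stack.pop()
            size += 1
            for u in adj[v]:
                if u not in removed and u not in seen:
                    seen.add(u)
                    stack.append(u)
        best = max(best, size)
    return best / len(adj)

# A star depends entirely on its hub; a ring survives any single loss.
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
ring = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
print(giant_component_fraction(star, removed={0}))  # 0.25
print(giant_component_fraction(ring, removed={0}))  # 0.75
```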

Hébert-Dufresne sees the new metric as a valuable first step for network analysis, one that "allows us to understand, at a glance, whether a network is tree-like or grid-like, how heterogeneous it is, and even identify surprising subgraphs."

For instance, the paper demonstrates the easy detection of long chains of websites that do nothing but hold hands in subsets of the web. The method could improve models of disease spread by considering not only how connected individuals are, but also how central they are (that is, how deep they are in the layers of a network).

Since creating the tool, he and his collaborators have been using it in their own research in a variety of settings, from social networks to food webs.

"The best testament for why we think this will be useful, even now that the project is done, is that this is a tool that we use," says Hébert-Dufresne.

More information: Laurent Hébert-Dufresne et al. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition, Scientific Reports (2016). DOI: 10.1038/srep31708
