Understanding collective animal behavior may be in the eye of the computer

This shows snapshots of video data of collective behavior (collected by NYU undergraduate student Gozde Ustuner) from experiments with (a) ants, (b) fish, (c) frogs, (d) chickens, and (e) humans. Human faces have been obscured to protect privacy. A machine learning system nearly matched human observation of the video in its assessment of collective behavior of each of the species. Credit: NYU School of Engineering

No machine is better at recognizing patterns in nature than the human brain. It takes mere seconds to recognize the order in a flock of birds flying in formation, schooling fish, or an army of a million marching ants. But computer analyses of collective animal behavior are limited by the need for constant tracking and measurement data for each individual; hence, the mechanics of social animal interaction are not fully understood.

An international team of researchers led by Maurizio Porfiri, associate professor of mechanical and aerospace engineering at NYU Polytechnic School of Engineering, has introduced a new paradigm in the study of social behavior in animal species, including humans. Their work is the first to successfully apply machine learning toward understanding collective animal behavior from raw data such as video, images, or sound, without tracking each individual. The findings stand to significantly impact the field of ethology—the objective study of animal behavior—and may prove as profound as the breakthroughs that allowed robots to learn to recognize obstacles and navigate their environment. The paper was published online today in Scientific Reports.

Starting with the premise that humans have an innate ability to recognize behavior patterns almost subconsciously, the researchers created a framework for applying that instinctive understanding to machine learning techniques. Machine learning algorithms are widely used in applications such as the analysis of weather trend data, and they allow researchers to understand and compare complex data sets through simple visual representations.

Human interaction captured on video: People exhibited more social interaction than frogs but far less than ants, whether measured by the ISOMAP machine learning method or by humans viewing the videos of the 10-day experiments. Credit: NYU School of Engineering

A human viewing a flock of flying birds discerns both the coordinated behavior and the formation's shape—a line, for example—without measuring and plotting a dizzying number of coordinates for each bird. For these experiments, the researchers deployed an existing machine learning method called isometric mapping (ISOMAP) to determine if the algorithm could analyze video of that same flock of birds, register the aligned motion, and embed the information on a low-dimensional manifold to visually display the properties of the group's behavior. Thus, a high-dimensional quantitative data set would be represented in a single dimension—a line—mirroring human observation and indicating a high degree of organization within the group.
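As a rough sketch of the kind of dimensionality reduction described here (not the authors' exact pipeline), scikit-learn's Isomap implementation can embed high-dimensional frame vectors onto a low-dimensional manifold. The array sizes and neighbor count below are illustrative assumptions, with random data standing in for real video frames:

```python
import numpy as np
from sklearn.manifold import Isomap

# Illustrative stand-in for video data: 500 frames, each flattened
# into a 1,024-dimensional pixel vector (sizes chosen arbitrarily).
rng = np.random.default_rng(0)
frames = rng.random((500, 1024))

# Embed the high-dimensional frame sequence onto a one-dimensional
# manifold, analogous to collapsing a flock's coordinated motion
# onto a single line.
isomap = Isomap(n_neighbors=10, n_components=1)
manifold = isomap.fit_transform(frames)  # shape: (500, 1)
```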

"We wanted to put ISOMAP to the test alongside human observation," Porfiri explained. "If humans and computers could observe social animal species and arrive at similar characterizations of their behavior, we would have a dramatically better quantitative tool for exploring collective animal behavior than anything we've seen," he said.

The team captured video of five social species—ants, fish, frogs, chickens, and humans—under three sets of conditions—natural motion, the presence of one stimulus, and the presence of two stimuli—over 10 days. They subjected the raw video to ISOMAP analysis, producing manifolds representing each group's behavior and motion. The researchers then tasked a group of observers with watching the videos and assigning a measure of collective behavior to each group under each condition. Human rankings were scaled to be visually comparable with the ISOMAP manifolds.
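A minimal sketch of such a video-to-manifold pipeline, assuming OpenCV for frame extraction; the file name, frame size, and ISOMAP parameters are hypothetical rather than the values used in the study:

```python
import cv2
import numpy as np
from sklearn.manifold import Isomap

def video_to_manifold(path, n_neighbors=12, n_components=2):
    """Flatten each downsampled grayscale frame of a video into a
    vector, then embed the frame sequence with ISOMAP."""
    cap = cv2.VideoCapture(path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Downsample to keep ISOMAP's pairwise distance graph tractable.
        small = cv2.resize(gray, (64, 64))
        frames.append(small.ravel().astype(np.float64))
    cap.release()
    return Isomap(n_neighbors=n_neighbors,
                  n_components=n_components).fit_transform(np.array(frames))

# Hypothetical file name: one manifold per species and condition.
ant_manifold = video_to_manifold("ants_natural_motion.avi")
```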

Whether measured by the ISOMAP machine learning system or by humans viewing videos like this from the 10 days of experiments, frogs exhibited the least collective behavior of the five species studied. The researchers' findings were in line with known qualities of each species. Credit: NYU School of Engineering

The similarities between the human and machine classifications were remarkable. ISOMAP proved capable not only of accurately ascribing a degree of collective interaction that meshed with human observation, but of distinguishing between species. Both humans and ISOMAP ascribed the highest degree of interaction to ants and the least to frogs—analyses that hold true to known qualities of the species. Both were also able to distinguish changes in the animals' collective behavior in the presence of various stimuli.
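One simple way to quantify that kind of agreement (the paper's own scaling procedure may differ) is a rank correlation between machine-derived and human-assigned collectiveness scores. The numbers below are invented placeholders that merely preserve the ordering reported above, with ants most collective and frogs least:

```python
from scipy.stats import spearmanr

# Hypothetical placeholder scores, not data from the study.
isomap_scores = {"ants": 0.92, "fish": 0.74, "chickens": 0.61,
                 "humans": 0.55, "frogs": 0.21}
human_scores = {"ants": 0.95, "fish": 0.70, "chickens": 0.58,
                "humans": 0.60, "frogs": 0.18}

species = list(isomap_scores)
rho, p = spearmanr([isomap_scores[s] for s in species],
                   [human_scores[s] for s in species])
print(f"Spearman rank correlation: {rho:.2f} (p = {p:.3f})")
```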

The researchers believe that this breakthrough is the beginning of an entirely new way of understanding and comparing the behaviors of social animals. Future experiments will focus on expanding the technique to more subtle aspects of collective behavior; for example, the chirping of crickets or the synchronized flashing of fireflies.

Journal information: Scientific Reports

Citation: Understanding collective animal behavior may be in the eye of the computer (2014, January 16) retrieved 4 July 2024 from https://phys.org/news/2014-01-animal-behavior-eye.html
