DARPA advances video analysis tools

Jun 24, 2011

A massive amount of data from video sensors is collected in theater, and there are not enough analysts or hours available to review it all. Reducing the amount of data or the number of sensors isn't the answer, and there will never be enough analysts. The solution lies in better automated capabilities that can identify areas and activities that require a human analyst's attention.

DARPA’s Video and Image Retrieval and Analysis Tool (VIRAT) and Persistent Stare Exploitation and Analysis System (PerSEAS) programs may soon enable better warfighter analysis of huge amounts of data generated from multiple types of sensors. “Bad guys do bad things, such as all the actions involved in burying an IED – so it is activity that matters. This is especially true when bad guys look, dress, and drive vehicles like those around them,” said Mita Desai, program manager for VIRAT and PerSEAS. “The analysis tools to find these activities and the underlying actions just don’t exist, which is why there is such interest in activity-based analysis and exploitation.”

Text searching and algorithms for facial and other object recognition already exist. Up to now, finding actions of interest within previously untagged, raw video has been a resource drain and such a technical challenge as to seem ‘impossible’. Desai explained, “The objectives of VIRAT and PerSEAS are NOT to replace human analysts, but to make them more effective and efficient by reducing their cognitive load and enabling them to search for activities and threats quickly and easily.”

[Video: DARPA Video and Image Retrieval and Analysis Tool (VIRAT)]

With VIRAT’s ability to identify and highlight key actions and PerSEAS’s ability to ‘see’ dangerous combinations of actions as activities, analysts will soon be able to concentrate on more detailed reviews and understanding of the data.

The initial prototype of VIRAT works in both forensic mode (finding activities in thousands of hours of archived, untagged data) and streaming mode (highlighting actions as the video download is being viewed). Additional research and development work is underway for PerSEAS prior to deployment for field testing.
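The two modes described above can share the same underlying detector; only the data source differs. The sketch below illustrates that idea in miniature, assuming nothing about VIRAT's actual implementation: the "detector", the object-tag frames, and the action cues are all invented stand-ins for a trained classifier running over real video.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List, Set

# Toy stand-in for an action detector. In a real system a trained classifier
# scores short windows of video; here a "frame" is just a set of object tags,
# and an action fires when all of its cue tags co-occur in a frame.
# These cue definitions are illustrative assumptions, not VIRAT's.
ACTION_CUES = {
    "digging": {"person", "shovel"},
    "loading": {"person", "vehicle", "package"},
}

@dataclass
class Detection:
    frame_index: int
    action: str

def detect(frame: Set[str], index: int) -> List[Detection]:
    """Run the (toy) detector on a single frame."""
    return [Detection(index, action)
            for action, cues in ACTION_CUES.items() if cues <= frame]

def forensic_search(archive: List[Set[str]], action: str) -> List[int]:
    """Forensic mode: scan an untagged archive for past occurrences."""
    return [d.frame_index
            for i, frame in enumerate(archive)
            for d in detect(frame, i) if d.action == action]

def streaming_monitor(feed: Iterable[Set[str]]) -> Iterator[Detection]:
    """Streaming mode: flag actions as each new frame arrives."""
    for i, frame in enumerate(feed):
        yield from detect(frame, i)
```

The point of the sketch is the architecture: one detection routine, invoked either over a finite archive or lazily over a live feed, which is why algorithms built for one mode can serve the other.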

VIRAT is focused on full-motion video from platforms such as Predator or aerostats, allowing analysts either to monitor the live downlink for specific actions of interest or to search an existing archive for past occurrences. These searches are conducted using a video clip as the input query. VIRAT finds actions that are short in duration and occur in small geographic areas. PerSEAS focuses on wide-area coverage, such as data from Constant Hawk, Gorgon Stare, ARGUS-IS and other persistent sensors. PerSEAS observes multiple actions over a long duration and large geographic regions to postulate complex threat activities. Algorithms from VIRAT provide some of the underlying capabilities within PerSEAS.
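One plausible way to "postulate" a complex activity from primitive actions, as described above, is to match a configured sequence of actions against a time-ordered event stream, discarding candidates that take too long to complete. The pattern, window length, and action names below are illustrative assumptions, not PerSEAS's actual threat models.

```python
# Hedged sketch: flag a complex activity when a configured sequence of
# primitive actions completes within a time window. A real system would
# also reason over geography and many competing hypotheses.
THREAT_PATTERN = ["vehicle_stop", "digging", "vehicle_depart"]  # assumed
WINDOW = 600  # seconds the full pattern may span (assumed)

def find_threats(events):
    """events: time-sorted (timestamp, action) pairs.
    Returns the timestamps at which the full pattern just completed."""
    hits = []
    pending = []  # candidate matches: (start_time, next_pattern_index)
    for t, action in events:
        # Drop candidates that have aged out of the window.
        pending = [(s, i) for s, i in pending if t - s <= WINDOW]
        advanced = []
        for s, i in pending:
            if action == THREAT_PATTERN[i]:
                if i + 1 == len(THREAT_PATTERN):
                    hits.append(t)          # pattern completed
                else:
                    advanced.append((s, i + 1))
            else:
                advanced.append((s, i))     # unrelated action; keep waiting
        pending = advanced
        if action == THREAT_PATTERN[0]:
            pending.append((t, 1))          # a new candidate begins
    return hits
```

For example, a stop, digging, and departure within ten minutes completes the pattern, while the same three actions spread over a longer span do not, which is the distinction between isolated actions and a threat activity that the article draws.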



User comments: 1


Jun 27, 2011
I suppose burying an IED is one, carrying a rifle is another, lying on the ground and looking through binoculars is another, peeking around buildings and scurrying to the next concealment is another... sounds like this technology could be expansive and quite impressive.