New study challenges long-accepted views on human-autonomy interaction

August 17, 2017
Army scientists and engineers have challenged long-held views in the area of human-autonomy interaction. Credit: U.S. Army Research Laboratory

A team of Army scientists and engineers has challenged long-held views in the area of human-autonomy interaction, aiming to change the way science involves people, especially in the development of advanced technical systems that rely on artificial intelligence and autonomy.

As part of a research program initially funded in 2013 by the Office of the Secretary of Defense, U.S. Army Research Laboratory researchers led a multi-disciplinary team of Department of Defense, industry, and academic researchers to develop a novel, general-purpose principled framework.

The research team proposes what they've named the Privileged Sensing Framework, conceived to leverage recent advances in human sensing technologies to dynamically integrate human and autonomous agents on the basis of their individual characteristics. For example, humans tend to adapt easily to changes in the environment or task, while autonomous agents can typically process large amounts of data more quickly than humans, Marathe explained.

The focus of this research was to demonstrate how the PSF preserves the human as a primary, critical and central authority while also enabling autonomous agents, such as robots, to detect and mitigate cases where people's decisions or actions would lead to dysfunction or even catastrophe, said Dr. Amar Marathe, a researcher in ARL's Real-World Soldier Quantification Branch.
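
The article does not spell out how that detection and mitigation would work. As a minimal sketch, assuming a decision gate in which the human remains the primary authority and the autonomy intervenes only when a human action appears headed toward failure, the idea might look like the following; the function name, thresholds, and inputs are hypothetical, not ARL's implementation.

```python
# Minimal, hypothetical decision gate illustrating "human as central authority,
# autonomy as safeguard."  Illustrative only; not ARL's actual PSF implementation.

def arbitrate(human_action: str,
              autonomy_action: str,
              risk_of_human_action: float,
              human_state_reliability: float,
              risk_threshold: float = 0.9,
              reliability_threshold: float = 0.5) -> str:
    """Return the action to execute.

    The human's choice is honored by default.  The autonomy overrides only when
    it judges the human's action very likely to cause a failure AND human
    sensing suggests the operator's current state is unreliable.
    """
    if (risk_of_human_action > risk_threshold
            and human_state_reliability < reliability_threshold):
        return autonomy_action   # mitigate a likely dysfunction or catastrophe
    return human_action          # preserve the human as the primary authority


# Example: a drowsy operator steering toward an obstacle; the autonomy brakes.
print(arbitrate("continue", "brake",
                risk_of_human_action=0.97, human_state_reliability=0.3))
```

Under this hypothetical gate, an override requires both a high autonomy-estimated risk and sensed evidence that the operator's current state is degraded; otherwise the human's decision stands, consistent with keeping the human as the central authority.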

"The research was fundamentally enabled by a critical move towards a novel control systems framework that can account for dynamic interactions among information components that impact the value of that information and yet appropriately propagates into robust overall decisions. The PSF provides an evolved approach to HAI that treats the human as a special class of sensor rather than as the ultimate and absolute command arbiter.

The PSF is based on the concept of appropriately 'privileging' information during integration: providing special rights to specific agents based on their capabilities within the current task context and the performance goals.
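
The article gives no implementation details for privileging. A rough sketch, under the assumption that privileges amount to context-dependent weights applied when agent reports are fused, is shown below; the class names, weighting rules, and example numbers are illustrative only and not drawn from ARL's PSF.

```python
# Hypothetical sketch of "privileged" information fusion.  Agent reports are
# combined with context-dependent privilege weights; none of these names or
# rules come from ARL's actual Privileged Sensing Framework.

from dataclasses import dataclass

@dataclass
class AgentReport:
    name: str          # "human" or "autonomy"
    estimate: float    # e.g., estimated probability that a target is present
    confidence: float  # the agent's self-assessed reliability, 0..1

def privilege_weight(report: AgentReport, context: dict) -> float:
    """Assign a weight from the agent's capability in the current context (illustrative)."""
    weight = report.confidence
    if report.name == "human":
        # Sensed human state (attention, fatigue) modulates trust in the human.
        weight *= context.get("human_alertness", 1.0)
    else:
        # Autonomy gains privilege when the data volume favors machine processing.
        weight *= context.get("data_volume_factor", 1.0)
    return max(weight, 0.0)

def fuse(reports: list[AgentReport], context: dict) -> float:
    """Privilege-weighted fusion of the agents' estimates."""
    weights = [privilege_weight(r, context) for r in reports]
    total = sum(weights) or 1.0
    return sum(w * r.estimate for w, r in zip(weights, reports)) / total

# Example: a fatigued human loses privilege, so the fused estimate shifts
# toward the autonomy's report.
reports = [AgentReport("human", estimate=0.9, confidence=0.8),
           AgentReport("autonomy", estimate=0.3, confidence=0.7)]
print(fuse(reports, {"human_alertness": 0.4, "data_volume_factor": 1.0}))
```

In this sketch the 'special rights' are simply multiplicative weights that shift between agents as the task context and the sensed human state change; an actual framework would presumably rely on richer, more principled models.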

"Through a series of simulation experiments, the PSF significantly improved joint human-autonomy performance without sacrificing the gains to be made from incorporating human strengths.

"Additional studies have extended this approach into a wide range of applications that include joint human-autonomy driving, human-autonomy target detection, and command and control. Overall, these efforts provide further evidence that the incorporation of the principles of the PSF can provide improved performance of joint human-autonomy systems across a wide range of applications," said Marathe.

He said future efforts will focus on developing novel methods for incorporating the PSF into experimental human-autonomy systems to enable further testing of the impact of this approach on human-autonomy system performance, and on generalizing the framework to accommodate a variety of tasks and scenarios.

The inception of a generalizable framework that incorporates dynamic estimates of human capabilities to facilitate and advance human-autonomy interaction, the researchers argue, provides a rich opportunity to revolutionize the capabilities of multi-agent cooperative teams across a broad range of applications, a shift Marathe estimates could take hold within the next 20 years or so.

Human-automation integration challenges were addressed in human-computer coupled visual search, real-time mitigation of mistrust in automation, advanced commander decision aids, and in-the-loop test and evaluation of human-robot systems.

Marathe said the research was motivated by persistent, fundamental issues that have thus far precluded the transition of advanced automation and autonomous technologies from the laboratory into the operational environment.

"Generally, humans readily adapt to varying task and environmental complexities during decision making and therefore are often treated as a failsafe for cases where autonomous technology underperforms. However, humans are constantly changing due to factors such as fatigue or shifts in attention, which means that even skilled humans sometimes make errors. The inherent variability in human performance makes the problem of integrating humans in the loop with extremely challenging," he said.

Until recently, most frameworks for human-autonomy integration (HAI) have preserved a central role for the human while neglecting the important role of human variability, Marathe noted. "As a result, human excellence has not been fully exploited, nor has human failure been fully offset, leaving joint human-autonomy systems fundamentally incapable of achieving their full potential."
