Student examines the issue of over-trusting robotic systems

Booth's robot, Gaia, waits outside the entrance to Quincy House. Credit: Serena Booth

If Hollywood is to be believed, there are two kinds of robots: the friendly and helpful BB-8s, and the sinister and deadly T-1000s. Few would suggest that "Star Wars: The Force Awakens" or "Terminator 2: Judgment Day" are scientifically accurate, but the two popular films raise the question: do humans place too much trust in robots?

The answer to that question is as complex and multifaceted as robots themselves, according to the work of Harvard senior Serena Booth, a computer science concentrator at the John A. Paulson School of Engineering and Applied Sciences. For her senior thesis project, she examined the concept of over-trusting robotic systems by conducting a human-robot interaction study on the Harvard campus. Booth, who was advised by Radhika Nagpal, Fred Kavli Professor of Computer Science, received the Hoopes Prize, a prestigious annual award presented to Harvard College undergraduates for outstanding scholarly research.

During her month-long study, Booth placed a robot outside several Harvard residence houses. While she controlled the machine remotely and watched its interactions unfold through a camera, the robot approached individuals and groups of students and asked to be let into the keycard-access dorm buildings.

When the robot approached lone individuals, they helped it enter the building in 19 percent of trials. When Booth placed the robot inside the building, and it approached individuals asking to be let outside, they complied with its request 40 percent of the time. Her results indicate that people may feel safety in numbers when interacting with robots, since the machine gained access to the building in 71 percent of cases when it approached groups.

"People were a little bit more likely to let the robot outside than inside, but it wasn't statistically significant," Booth said. "That was interesting, because I thought people would perceive the robot as a security threat."

In fact, only one of the 108 study participants stopped to ask the robot if it had card access to the building.

But the human-robot interactions took on a decidedly friendlier character when Booth disguised the robot as a cookie-delivering agent of a fictional startup, "RobotGrub." When approached by the cookie-delivery robot, individuals let it into the building 76 percent of the time.

"Everyone loved the robot when it was delivering cookies," she said.

The cookie delivery robot successfully gained entrance into the residence hall. Credit: Serena Booth

Whether they were enamored with the knee-high robot or terrified of it, people displayed a wide range of reactions during Booth's 72 experimental trials. One individual, startled when the robot spoke, ran away and called security, while another gave the robot a wide berth, ignored its request, and entered the building through a different door.

Booth had thought individuals who perceived the robot to be dangerous wouldn't let it inside, but after conducting follow-up interviews, she found that those who felt threatened by the robot were just as likely to help it enter the building.

"Another interesting result was that a lot of people stopped to take pictures of the robot," she said. "In fact, in the follow-up interviews, one of the participants admitted that the reason she let it inside the building was for the Snapchat video."

While Booth's robot was harmless, she is troubled that only one person stopped to consider whether the machine was authorized to enter the dormitory. If the robot had been dangerous—a robotic bomb, for example—the effects of helping it enter the building could have been disastrous, she said.

Serena Booth and her robot, Gaia, in its cookie-delivery disguise. Credit: Adam Zewe/SEAS Communications

A self-described robot enthusiast, Booth is excited about the many ways robots could potentially benefit society, but she cautions that people must be careful not to put blind faith in the motivations and abilities of the machines.

"I'm worried that the results of this study indicate that we trust robots too much," she said. "We are putting ourselves in a position where, as we allow robots to become more integrated into society, we could be setting ourselves up for some really bad outcomes."

Provided by Harvard University

