Robots that teach us about ourselves

Biomechanist Madhusudhan Venkadesan studies how each part of the human body functions together.

Janie, a quiet twelve-year-old girl, sits at the table, her hands resting casually in her lap. She doesn't turn to face you as you walk across the room and sit in the chair beside her. When you ask a question, she won't meet your eyes, and she repeats your words back to you, seemingly noncommittal and uninvolved in the conversation.

Janie is autistic.

But when you put the blue, fuzzy robot dinosaur on the table, the therapeutic session begins. Janie and the dinosaur bond within minutes as the dinosaur fearfully tries to cross an imaginary river that runs along the table. The dinosaur is scared of the river, but Janie is encouraging. "You can do it," she says, enthusiastically, sympathetically. "You can do it."

Brian Scassellati, professor of computer science and mechanical engineering & materials science, designed this session to teach autistic children how to use appropriate tone of voice—one of many ways Scassellati's Social Robotics Lab uses technology to study people and improve their lives. "I'm more interested in people than I am in machines, and the robots we build all serve a purpose," he says. "That purpose is to help kids."

In that sense, the most interesting new behavior that children in such sessions develop is a deeper connection to people. Viewing a recording of a session similar to the one with Janie, Scassellati notes how the child keeps glancing over and making contact with the therapist, a behavior known as social referencing. "Before this moment, we've never seen him do that," Scassellati says. "And just five minutes later, he talks to the therapist. He's still orienting away from the table, but despite two-and-a-half days together of one screening test after another, this is the first time he's actually had a conversation with his therapist." In other words, while the child has successfully learned more about tone of voice, his reaction is positive in ways outside of the lesson's intent.

Professor Brian Scassellati is using commercially available robots, such as the NAO humanoid robot, to test software programs designed to help youngsters with social difficulties.

Scassellati does not yet understand why such changes happen—just that they do. "Believe me," he says, "we've tried so many different things over the years to figure out how it works." His experiments, conducted over the past 13 years, have shown a robust and repeatable positive response to robots in a surprising number of difficult situations for children: teaching nutrition to first graders, and English to first and second graders who speak Spanish or Portuguese at home; presenting options for children who deal with bullying; working with teenagers who have behavioral disorders and anger management troubles. Using a $10 million grant from the National Science Foundation, Scassellati's lab has explored these expanding applications—and more—with a variety of robots, from the commercially available NAO humanoid robot to a custom-built dragonbot that sports wings fabricated by a Sesame Street puppeteer and a face displayed on the screen of a removable Android smartphone. "We need robots that can change and grow with the child," he says, "something that can be personalized to the particular child, something that can recognize what the child knows and doesn't know, and then something that can tailor the experience towards the parts they need. That's the goal."

As an example, he points to Keepon, a robot that looks like an 11-inch high, bright yellow rubber snowman. In one experiment, Keepon tells a story about the robot's imaginary dog, pausing to ask a child who speaks Spanish at home to translate a command for the dog from Spanish into English. Keepon analyzes the child's responses and can recognize which constructions are fully understood and which are not; Keepon then tailors the story to concentrate on the constructions the child doesn't yet fully grasp. "It's personalized tutoring," says Scassellati, adding that initially the child responds quickly because he wants the fun of interacting with the robot. However, after a week of similar sessions, the child learns where the difficult issues are. "And then," says Scassellati, "he works really hard, even if he still doesn't get how to overcome those issues. The excitement doesn't wear off, but he's willing to put that excitement on hold in order to work hard for the robot, which is what we want."
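Scassellati's actual tutoring software isn't published in this article, but the personalization principle he describes can be sketched in a few lines of Python: keep a running mastery estimate for each grammatical construction and always practice the one the child seems to understand least. The construction names, scores, and update rule below are illustrative assumptions, not the lab's method.

    import random

    # Hypothetical constructions and starting mastery estimates (0 = unknown, 1 = mastered).
    mastery = {"past tense": 0.5, "plural nouns": 0.5, "prepositions": 0.5}
    LEARNING_RATE = 0.2  # how strongly each answer shifts the estimate

    def pick_construction():
        """Choose the construction with the lowest estimated mastery."""
        return min(mastery, key=mastery.get)

    def update(construction, answered_correctly):
        """Nudge the mastery estimate toward 1 (correct) or 0 (incorrect)."""
        target = 1.0 if answered_correctly else 0.0
        mastery[construction] += LEARNING_RATE * (target - mastery[construction])

    # Simulated session: the robot always asks about its weakest construction,
    # observes the (here, randomly simulated) answer, and updates its estimate.
    for turn in range(10):
        construction = pick_construction()
        correct = random.random() < 0.6  # stand-in for the child's real response
        update(construction, correct)
        print(f"turn {turn}: practiced '{construction}', mastery now {mastery[construction]:.2f}")

Over repeated turns, constructions the simulated child keeps getting wrong stay at the front of the queue, which is the behavior Scassellati describes: the story keeps returning to whatever the child hasn't yet grasped.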

SEAS's newest roboticist, assistant professor of mechanical engineering & materials science Madhusudhan Venkadesan, also studies people, though as a biomechanist his particular interest is in how each part of the human body—muscles, tendons, joints, nerves—functions together. For example, one of his experiments explored how our fingers tap the surface of a tablet or smartphone. Mechanically, the task seems simple, but in actuality, touching the screen requires a tricky bit of muscle coordination to shift from pushing your finger forward to holding your fingertip still: switch too early and your finger lands in the wrong place; switch too late and it slips. "It turns out people are extraordinary estimators of when contact is going to happen, and they switch the strategy 60 milliseconds before the finger lands on the surface," says Venkadesan. "The timing is incredibly precise."
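The 60-millisecond figure describes human motor control, not software, but the anticipation strategy can be illustrated with a toy calculation in Python: predict time-to-contact from the finger's current distance and speed, and switch control strategies that long before the predicted landing. Everything except the 60 ms lead time is a made-up placeholder, not data from Venkadesan's experiment.

    DT = 0.001           # simulation step, seconds
    SWITCH_LEAD = 0.060  # the 60 ms anticipation window reported above

    position = 0.05   # assumed finger height above the screen, meters
    velocity = -0.3   # assumed downward approach speed, meters per second
    t = 0.0
    strategy = "push"

    while position > 0:
        # Simple constant-speed estimate of when the fingertip will hit the screen.
        time_to_contact = position / abs(velocity)
        if strategy == "push" and time_to_contact <= SWITCH_LEAD:
            strategy = "hold"
            print(f"switched to 'hold' at t = {t*1000:.0f} ms, "
                  f"{time_to_contact*1000:.0f} ms before predicted contact")
        position += velocity * DT
        t += DT

    print(f"contact at t = {t*1000:.0f} ms while in '{strategy}' strategy")

Switch too late in this little model and the fingertip is still in its "push" phase at impact; switch too early and it stops short of the intended spot, which is the same trade-off Venkadesan describes.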

As a larger part of his lab's research, Venkadesan has observed similarly precise neuromuscular coordination in the human ability to throw. Throwing well played an important role in shaping human evolution through our ability to hunt with a spear, and no species can throw as well as humans—even chimpanzees are incapable of throwing faster than 20 miles per hour. Venkadesan explores the foundations of our throwing ability, asking to what extent it's determined by our large brains or by the strength and flexibility of our musculature. "Understanding the mechanical, muscular, and neural basis of high speed throwing is clearly pertinent for sports such as baseball pitching," he says. "Pinpointing the ligaments, tendons, and muscles that experience high stresses during throwing could help us understand and perhaps reduce injuries suffered by even the most highly skilled pitchers."

At Yale, Venkadesan plans to build robots using such insights about human actions, human musculature, and even human evolution—creating machines that emulate human behaviors like tapping a screen or throwing a baseball, though without necessarily mimicking the human body's geometric structure. "By studying and distilling the complexities of human action," he says, "we can learn how the evolved and specialized morphology of humans makes us not only good at what we do, but also energy efficient while doing these things. Then we try to implement these principles on robots." His goal is twofold: create better, more useful robots by applying the design principles learned from studies of human subjects; and use insights from the mechanisms of successful robots to sharpen the understanding of how neural control and evolution work together to help humans move in efficient and stable ways. For example, the principles that enable a robotic finger to tap on a tablet surface—perhaps accomplished using a human-inspired approach—might be used in industry to apply stickers to fragile objects, in prosthetics to create a more responsive robotic hand, and in medicine to help regain dexterity after injury or disease. "It's humans helping robots helping humans," he says.

In addition to looking at hands and arms, a large focus of Venkadesan's research—and robotic inventions—centers on the foot. Almost a quarter of the human body's bones are located in the feet, which must be pliable enough to accommodate the shifting balance of walking and running on diverse and inconsistent terrain, yet rigid enough to support the body's weight without injury. Looking at this interplay of flexibility and stiffness, Venkadesan's research seeks to understand how the bones, muscles, tendons, and even signals from the nervous system contribute to maintaining stability during locomotion, especially while running at a marathoner's pace.

Robotic feet built by Venkadesan will contribute to his research by imitating select elements of the human structure and neuromuscular interactions while avoiding direct reconstruction of all the human foot's intricacies. "Each robot has to have a well-defined purpose, a single goal that answers a specific question," he says. "Although the structure of the human foot is incredibly flexible and versatile, trying to replicate it in complete detail would likely result in me tweaking parameters for the rest of my life—and still without getting anywhere." Instead, Venkadesan might build a robotic foot just to study how the internal structure of the foot and the way it lands on the ground affect its compliance and flexibility. Stability could then be examined by attaching this foot to simple robots that run on rough ground and soft ground, sand and cobblestone. A different robotic foot could undergo the same tests to show how changes in morphology or mechanical properties of the foot affect running. Venkadesan then starts the cycle over, each foot spurring new questions about human body mechanics.
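As an illustration of the kind of protocol such interchangeable feet make possible (not an actual experiment from Venkadesan's lab), one could sweep a single mechanical property over several terrains in a simple test loop. The terrain names echo the article, while the stiffness values and the trial function are placeholders that would, in reality, be physical measurements.

    import random

    terrains = ["rough ground", "soft ground", "sand", "cobblestone"]
    arch_stiffnesses = [0.2, 0.5, 0.8]  # hypothetical normalized values: 0 = floppy, 1 = rigid

    def run_trial(stiffness, terrain):
        """Stand-in for a physical running trial on a small legged robot.
        A real experiment would measure stability; here a random score
        is returned so the sketch stays runnable without implying results."""
        return round(random.uniform(0.0, 1.0), 2)

    # Same tests, different foot: one loop over morphology, one over terrain.
    for stiffness in arch_stiffnesses:
        for terrain in terrains:
            score = run_trial(stiffness, terrain)
            print(f"stiffness {stiffness:.1f} on {terrain:<12} -> stability score {score}")

Each sweep answers one narrow question, which matches Venkadesan's insistence that every robot have a single, well-defined purpose; new questions then get a new foot and a new sweep.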

"Building a robot is a more definitive test of a design principle than anything I can do in biology," Venkadesan says. "If I believe this ligament or that tendon is responsible for energy efficient running, or for stability, I can't remove the ligament in your body to test my theory. But with a , I can. I can do that, and I can use any insights from that to better understand you while you're running."

And as the number of recreational runners grows, Venkadesan hopes his insights can help people exercise more safely. "Every marathon I see, it's this huge mass of thousands of people running, and many of these runners will go on to suffer injuries that can affect locomotion and ultimately cause significant lifestyle problems," he says. "That's why I want to better understand the body, and perhaps suggest ways to prevent such injuries. I believe learning to design effective robots can teach us about ourselves."

Whether tapping touch screens or walking on cobblestone roads, teaching nutrition or commanding imaginary dogs, the robots created by Venkadesan and Scassellati are already teaching us about ourselves: They're enlivening our classrooms, demystifying our bodies, and ultimately showing us a path over our individual limitations, towards our best human selves.

Provided by Yale University
