“If I did not have a handsome face and two arms ... you would probably not be as inclined to interact with me,” Brian the robot tells visitors to Professor Goldie Nejat’s autonomous systems and biomechatronics lab.
While handsome may be overstating things a bit, Brian’s silicone rubber face and mechanical torso certainly draw a crowd eager to interact with him wherever he goes. After all, robot sightings are still rare in North America, unlike in Japan, where they are regularly used on assembly lines.
“Our research area focuses on trying to incorporate robots into everyday life and integrate them into society in applications beyond the manufacturing floor,” said Nejat, a mechanical engineering professor.
“We spent a few years designing the platform. He’s humanlike from the waist up, with actions and body language similar to a person’s, but you can pretty much tell it’s a robot; it doesn’t confuse anyone.”
That’s by design, of course. Studies have shown people are generally accepting of robots, but Nejat and her team want to make his limitations clear. Their aim this year is to integrate Brian into a healthcare team at Baycrest, in a nursing unit populated by seniors whose memories may need prompting. They’re hoping he’ll be accepted by staff and patients alike.
“We’re focusing on how he communicates with people,” Nejat said. “We’d like him to interact where activities are involved to provide cognitive or social stimuli to individuals.”
At present, Brian can help people by providing daily reminders and engaging a user in a simple memory card game, but the design team would also like to see the robot help people accomplish a wide variety of daily tasks, such as dressing and brushing their teeth, or alert the nurses when a patient is not feeling well. This requires his designers to understand how humans communicate, both verbally and non-verbally, so they can program him to react accordingly.
“Every year we add a new module to the robot,” said Nejat. “We have a camera for face detection and gaze tracking. Now, we want him to take in more information from the environment and the people in it so he can intelligently determine his assistive behaviour.
“By integrating different sensors we’ll try to get him to understand human emotions from body language; that will determine his behaviour. We’re doing the same thing with voice, trying to get him to understand human speech.”
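The article does not describe how the lab’s software actually works, but the idea of fusing several sensed cues to decide on an assistive behaviour can be sketched in miniature. The sketch below is purely illustrative: every cue name, label, and behaviour string is a hypothetical placeholder, not part of Brian’s real system.

```python
# Illustrative sketch only: fusing simple sensed cues (face detection,
# gaze tracking, body language, speech) to pick an assistive behaviour.
# All names and labels here are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class Cues:
    face_detected: bool    # e.g. output of a camera-based face detector
    gaze_on_robot: bool    # e.g. output of gaze tracking
    posture: str           # hypothetical labels: "open", "closed", "slumped"
    speech_negative: bool  # e.g. a simple negative-sentiment flag from speech

def choose_behaviour(cues: Cues) -> str:
    """Map fused cues to one assistive behaviour, most urgent first."""
    if not cues.face_detected:
        return "wait"                 # no one present to interact with
    if cues.speech_negative or cues.posture == "slumped":
        return "alert_staff"          # possible distress: notify the nurses
    if cues.gaze_on_robot and cues.posture == "open":
        return "start_memory_game"    # engaged user: offer cognitive stimulation
    return "give_reminder"            # default: a daily reminder prompt

print(choose_behaviour(Cues(True, True, "open", False)))     # start_memory_game
print(choose_behaviour(Cues(True, False, "slumped", False))) # alert_staff
```

A real system would replace these hand-written rules with learned models over camera and microphone data, but the control flow — sensed cues in, one behaviour out — is the same shape.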
The entire effort is interdisciplinary, requiring knowledge of mechatronics, artificial intelligence, mechanics, psychology and neuroscience, among other fields.
“One of our main motivations was the aging baby boomers,” Nejat said. “He’s an assistive robot. Robots are not here to replace healthcare workers; they will work in a team with them to help provide care.”
Nejat and her graduate students, Jeanie Chan and Derek McColl, see Brian as a long-term project, able to take on new tasks and characteristics as necessary.
“In North America, we’re not used to these types of robots, so I hope we can use him as a prototype to show what robots are capable of doing in order to improve our quality of life.”