Making robots more human

Most people are naturally adept at reading facial expressions—from smiling and frowning to brow-furrowing and eye-rolling—to tell what others are feeling. Now scientists have developed ultra-sensitive, wearable sensors that can do the same thing. Their technology, reported in the journal ACS Nano, could help robot developers make their machines more human.

Nae-Eung Lee and colleagues note that one way to make interactions between people and robots more intuitive would be to endow machines with the ability to read their users' emotions and respond with a computer version of empathy. Most current efforts toward this goal analyze a person's feelings using visual sensors that can tell a smile from a frown, for example. But these systems are expensive, highly complex and don't pick up on subtle eye movements, which are important in human expression. Lee's team wanted to make simple, low-cost sensors to detect emotions, including slight changes in gaze.

The researchers created a stretchable and transparent sensor by layering a carbon nanotube film on two different kinds of electrically conductive elastomers. They found it could tell whether subjects were laughing or crying and where they were looking. In addition to applications in robotics, the sensors could be used to monitor heartbeats, breathing, dysphagia (difficulty swallowing) and other health-related cues.

More information: Stretchable, Transparent, Ultrasensitive, and Patchable Strain Sensor for Human-Machine Interfaces Comprising a Nanohybrid of Carbon Nanotubes and Conductive Elastomers, ACS Nano, Article ASAP, DOI: 10.1021/acsnano.5b01613

Abstract
Interactivity between humans and smart systems, including wearable, body-attachable, or implantable platforms, can be enhanced by realization of multifunctional human–machine interfaces, where a variety of sensors collect information about the surrounding environment, intentions, or physiological conditions of the human to which they are attached. Here, we describe a stretchable, transparent, ultrasensitive, and patchable strain sensor that is made of a novel sandwich-like stacked piezoresistive nanohybrid film of single-wall carbon nanotubes (SWCNTs) and a conductive elastomeric composite of polyurethane (PU)-poly(3,4-ethylenedioxythiophene) polystyrenesulfonate (PEDOT:PSS). This sensor, which can detect small strains on human skin, was created using environmentally benign water-based solution processing. We attributed the tunability of strain sensitivity (i.e., gauge factor), stability, and optical transparency to enhanced formation of percolating networks between conductive SWCNTs and PEDOT phases at interfaces in the stacked PU-PEDOT:PSS/SWCNT/PU-PEDOT:PSS structure. The mechanical stability, high stretchability of up to 100%, optical transparency of 62%, and gauge factor of 62 suggested that when attached to the skin of the face, this sensor would be able to detect small strains induced by emotional expressions such as laughing and crying, as well as eye movement, and we confirmed this experimentally.
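For readers unfamiliar with gauge factor, it is the ratio of a piezoresistive sensor's relative resistance change to the applied strain. The sketch below uses the gauge factor of 62 reported in the abstract to show how a resistance reading would be converted back into strain; it is a minimal illustration with hypothetical resistance values and function names, not code or data from the study.

```python
# Minimal sketch (not from the paper): turning a measured resistance change into
# strain via the reported gauge factor of 62. Resistance values below are
# illustrative assumptions, not the authors' measurements.

def strain_from_resistance(r_measured, r_unstrained, gauge_factor=62.0):
    """Estimate strain from a piezoresistive sensor reading.

    Gauge factor is defined as GF = (dR / R0) / strain,
    so strain = (dR / R0) / GF.
    """
    delta_r = r_measured - r_unstrained
    return (delta_r / r_unstrained) / gauge_factor

# Example: a 3.1% relative resistance increase with GF = 62 corresponds to
# roughly 0.05% strain, i.e. the scale of small skin deformations the
# article says the sensor can resolve.
if __name__ == "__main__":
    strain = strain_from_resistance(r_measured=1031.0, r_unstrained=1000.0)
    print(f"Estimated strain: {strain:.4%}")
```

The high gauge factor is what lets small facial movements, such as eye motion, produce resistance changes large enough to measure reliably.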

Journal information: ACS Nano

Citation: Making robots more human (2015, April 29) retrieved 29 March 2024 from https://phys.org/news/2015-04-robots-human.html
