Lillian Lee: Computers not yet able to understand human speech

Perhaps HAL from "2001: A Space Odyssey" was not wrong when he said: "I'm sorry, Dave, I'm afraid I can't do that." Machines, even Apple's Siri, cannot yet fully understand our natural language, a Cornell researcher says.

For the second installment of the School of Continuing Education and Summer Sessions lecture series, Cornell's Lillian Lee, professor of computer science, drew 225 faculty, students and guests to Kennedy Hall's Call Auditorium July 18. Lee detailed the progress in natural language processing (NLP) and machine learning, and the challenges that lie ahead.

"Understanding language is really hard, not just because of understanding the structure of language part ... it also involves understanding things about what human beings want," Lee explained. Scientists are trying to integrate the insight from linguistics into statistical models, but "we are not all the way there yet," Lee said.

What would happen if, in March 2012, you typed the query "Is Snooki on stork watch?" into Google, or put the question to "Watson," the machine that has beaten human champions in "Jeopardy!"? "Google didn't know the answer!" Lee said. "I've argued that we need a probabilistic approach; a data approach. ... How would Watson figure this out? We have a lot of data. We as human beings may notice what answers the first question. Watson doesn't understand 'Snooki and fiancé Jionni LaValle are expecting their first child together' when asked about 'stork watch.'"
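The gap Lee describes can be sketched in a few lines. This toy example (my illustration, not Watson's actual method) shows why surface word matching alone cannot connect "stork watch" to a sentence about expecting a child, and how even one learned paraphrase entry, the kind a data-driven system might extract from a large corpus, closes the gap. The paraphrase table here is a hypothetical stand-in.

```python
import re

def tokens(text):
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def overlap(a, b):
    """Count words shared between two texts."""
    return len(tokens(a) & tokens(b))

question = "Is Snooki on stork watch?"
evidence = ("Snooki and fiance Jionni LaValle are expecting "
            "their first child together")

# Surface matching alone: only "snooki" is shared, so the evidence
# sentence looks nearly unrelated to the question.
print(overlap(question, evidence))  # -> 1

# A toy paraphrase lexicon (hypothetical entry) linking the idiom
# to its literal meaning, as a data-driven system might learn:
paraphrases = {"stork watch": "expecting child"}

expanded = question
for idiom, literal in paraphrases.items():
    expanded = expanded.replace(idiom, literal)

# With the paraphrase applied, the match is much stronger.
print(overlap(expanded, evidence))  # -> 3
```

The point is not the trivial string replacement but where the table comes from: acquiring such mappings at scale is exactly the "probabilistic, data approach" Lee argues for.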

NLP seeks to create systems that can use human language as input or output. This includes speech-based interfaces, information retrieval (such as Google), automatic summarization of news, emails and postings, and automatic translation (such as Google Translate). According to Lee, the thrill of NLP is that it is "interdisciplinary, including fields of computer science, linguistics, psychology, communication, probability and statistics, and information theory."

"Why is understanding language so hard?" Lee answers her own question by providing the example: "I saw her duck with a telescope." According to Lee: "[This sentence] could mean a lot of things. If you look at the word 'duck,' it could mean I'm 'ducking' because people are throwing potatoes at me. Or the word duck could be the animal. In both cases, you have to ask who's holding the telescope … seven simple little words, and this sentence could mean a bazillion things."

According to Lee, somewhere between science fiction and new technological advancement there is a dream and a promise of computers that can understand what people are saying. As Alan Turing proposed, intelligence can be demonstrated by natural language conversation.

Even Siri has not been able to stand up to this test of intelligence. For example, Lee explained that telling Siri, "We can email you when you're back" generates "We can email you when you're fat."

The moral of Lee's story: "Today, we need to be careful before we hit, or now even say, the word 'send.'"

Provided by Cornell University

Citation: Lillian Lee: Computers not yet able to understand human speech (2012, July 24) retrieved 25 April 2024 from https://phys.org/news/2012-07-lillian-lee-human-speech.html
