Pittsburgh symposium answers: What is Watson?

Mar 30, 2011 By JOE MANDAK, Associated Press
Eric Brown, left, a researcher for IBM, hosts a version of the television game show Jeopardy! between IBM's Watson computer and students from Carnegie Mellon University and the University of Pittsburgh during a symposium on Watson, at Carnegie Mellon University in Pittsburgh, Wednesday, March 30, 2011. Watson is the question-answering IBM computer that made headlines by competing against humans on the actual television game show. (AP Photo/Keith Srakocic)

(AP) -- Six university students attempted to match wits with IBM's "Jeopardy!"-playing computer Wednesday and lost badly in a mock game show. But the competition was hardly the point of a daylong symposium meant to answer an appropriate question: What is Watson?

Watson is IBM's stab at advancing artificial intelligence by creating a machine that can recognize words using complex mathematical formulas. The algorithms can deduce probable characteristics and meanings of words based on the context in which they are used and other clues.

"Language is difficult for computers because they're not human," David Ferrucci, IBM's lead Watson researcher, told a packed auditorium at Carnegie Mellon University before the demonstration.

"People have this model of how computers work and often it's the model of 'looking something up,'" Ferrucci said. What Watson's creators didn't do - and couldn't do - was anticipate all of the possible answers-and-questions the machine might encounter in a game show.

As such, Watson isn't simply loaded with oodles of trivia that the computer sorts through at light speed. Instead, Watson looks at "Jeopardy!" answers and formulates correct questions based on contextual clues, including the category. It even has formulas that can recognize "pun relationships" between words, Ferrucci said.

When Watson comes up with an answer, the computer is not spitting out information but rather offering a percentage-based opinion that the words in its "Jeopardy!" question are "correct." That may be what makes Watson most valuable if and when the technology is applied to other problem-solving fields including law, business, computer science and engineering.
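The idea of a "percentage-based opinion" can be pictured with a toy sketch. This is not IBM's actual pipeline, and the candidate questions and confidence scores below are invented for illustration: the point is that the system ranks many candidate responses by confidence rather than looking one answer up.

```python
# Toy sketch (not IBM's actual system): pick the candidate "question"
# with the highest confidence score.

def best_response(candidates):
    """Return the candidate question with the highest confidence.

    candidates: dict mapping a candidate question to a confidence in [0, 1].
    """
    question, confidence = max(candidates.items(), key=lambda kv: kv[1])
    return question, confidence

# Hypothetical candidates for a "European Bodies of Water" clue.
scores = {
    "What is the Danube?": 0.92,
    "What is the Rhine?": 0.05,
    "What is the Volga?": 0.03,
}
question, confidence = best_response(scores)
print(f"{question} ({confidence:.0%} confident)")
```

The confidence attached to the winning candidate is what would carry over to fields like medicine or law, where knowing *how sure* the system is matters as much as the answer itself.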

Watson's creators believe its probability-based approach also holds promise as a medical diagnostic tool, and they've already signed a deal to do real-world tests at Columbia University Medical Center and the University of Maryland School of Medicine.

The stakes there are high, and that's another reason the computer was designed to play "Jeopardy!" - a highly competitive game in which Watson is rewarded for right answers and punished for wrong ones.

The computer also can track how much money it has won compared with its human opponents. Watson will sometimes pass on offering a question if it determines the amount of money it could lose isn't worth the computer's level of uncertainty about being correct.

That didn't prove to be much of a problem on Wednesday. The computer has won 71 percent of 55 games it has played against "Jeopardy!" TV champions, and made mincemeat out of three-member teams from Carnegie Mellon and the neighboring University of Pittsburgh.

Watson racked up $52,199 in pretend cash by providing questions to answers in categories including "International Food," "European Bodies of Water" and "Its Reigning Men" (which provided answers to questions about historical monarchs).

Pitt finished a distant second with $12,937, and Carnegie Mellon took third with $7,463.

Will Zhang, 20, a computer science major who headed Carnegie Mellon's team, was blown away by Watson's speed. The junior from Carmel, Ind., expressed frustration when the mock game show host asked the human contestants, "You guys feeling OK over there?" as Watson raced out to a large lead.

"Not cool!" Zhang replied, laughing.

"It was pretty difficult. I knew I had absolutely no chance against Watson," Zhang said afterward, before expressing admiration for the machine and for the language problem its creators are trying to solve.

"I think it's really amazing to be able to work on a problem that nobody's really tackled before," Zhang said.

User comments: 2

2.3 / 5 (3) Mar 31, 2011
I watched the Watson Jeopardy episodes (and the Nova special). Based on what I saw, I think I can formulate questions (well, answers) that would give the humans a big advantage: basically, a very short answer with significant ambiguity. Watson will take longer than the human and lose the buzzer race. If you watch the last show, you will see that happen over and over again in one category. The longer the host takes to read the question, the longer Watson is able to search before the buzzer is enabled (Watson gets a text file at the beginning of each question).
5 / 5 (1) Mar 31, 2011
The game changer is when its successor has computer science knowledge, can read a specification for a new program, or for a change to an existing program (for which you have the source), ask the user a few questions, and create source code that is easily understandable and editable by humans.
