Can a machine tell when you're lying? Research suggests the answer is 'yes'

Mar 26, 2012

Inspired by the work of psychologists who study the human face for clues that someone is telling a high-stakes lie, UB computer scientists are exploring whether machines can also read the visual cues that give away deceit.

Results so far are promising: In a study of 40 videotaped conversations, an automated system that analyzed eye movements correctly identified whether interview subjects were lying or telling the truth 82.5 percent of the time.

That's a better rate than expert human interrogators typically achieve in judgment experiments, said Ifeoma Nwogu, a research assistant professor at UB's Center for Unified Biometrics and Sensors (CUBS) who helped develop the system. In published results, even experienced interrogators average closer to 65 percent, Nwogu said.

"What we wanted to understand was whether there are signal changes emitted by people when they are lying, and can machines detect them? The answer was yes, and yes," said Nwogu, whose full name is pronounced "e-fo-ma nwo-gu."

The research was peer-reviewed, published and presented as part of the 2011 IEEE Conference on Automatic Face and Gesture Recognition. Nwogu's colleagues on the study included CUBS scientists Nisha Bhaskaran and Venu Govindaraju, and UB communication professor Mark G. Frank, a behavioral scientist whose primary area of research has been facial expressions and deception.

In the past, Frank's attempts to automate deceit detection have used systems that analyze changes in body heat or examine a slew of involuntary facial expressions.

The automated UB system tracked a different trait -- eye movement. The system employed a statistical technique to model how people moved their eyes in two distinct situations: during regular conversation, and while fielding a question designed to prompt a lie.

People whose pattern of eye movements changed between the first and second scenario were assumed to be lying, while those who maintained consistent eye movement were assumed to be telling the truth. In other words, when the critical question was asked, a strong deviation from normal eye movement patterns suggested a lie.
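A minimal sketch of that decision rule, assuming each conversational segment has already been reduced to per-window eye-movement feature vectors; the Gaussian baseline model, the feature choice and the threshold below are illustrative assumptions, not the published UB system:

```python
import numpy as np

# Hedged sketch, not the published UB system: fit a simple Gaussian to
# eye-movement feature windows from the baseline conversation, then flag the
# critical question when its windows drift too far from that baseline.
# Feature choice and threshold are illustrative assumptions.

def deviation_score(baseline, critical):
    """Mean Mahalanobis distance of critical-segment windows from the baseline."""
    baseline = np.asarray(baseline, dtype=float)
    critical = np.asarray(critical, dtype=float)
    mu = baseline.mean(axis=0)
    # Regularize the covariance so it stays invertible on small samples.
    cov = np.cov(baseline, rowvar=False) + 1e-6 * np.eye(baseline.shape[1])
    diffs = critical - mu
    dists = np.sqrt(np.einsum("ij,jk,ik->i", diffs, np.linalg.inv(cov), diffs))
    return float(dists.mean())

def classify_segment(baseline, critical, threshold=3.0):
    """Label the critical segment a likely lie if eye behavior shifts sharply."""
    return "lie" if deviation_score(baseline, critical) > threshold else "truth"

# Example with two features per window (e.g., blink rate, gaze-shift rate):
print(classify_segment([[0.3, 1.1], [0.4, 1.0], [0.35, 1.2]],
                       [[0.9, 2.5], [1.0, 2.7]]))   # prints "lie"
```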

Previous experiments in which human judges coded facial movements found documentable differences in eye contact at times when subjects told a high-stakes lie.

What Nwogu and fellow researchers did was create an automated system that could verify and improve upon information used by human coders to successfully classify liars and truth tellers. The next step will be to expand the number of subjects studied and develop automated systems that analyze body language in addition to eye contact.

Nwogu said that while the sample size was small, the findings are exciting.

They suggest that computers may be able to learn enough about a person's behavior in a short time to assist with a task that challenges even experienced interrogators. The videos used in the study showed people with various skin colors, head poses, lighting and obstructions such as glasses.

This does not mean machines are ready to replace human questioners, however -- only that computers can be a helpful tool in identifying liars, Nwogu said.

She noted that the technology is not foolproof: A very small percentage of subjects studied were excellent liars, maintaining their usual eye movement patterns as they lied. Also, the nature of an interrogation and interrogators' expertise can influence the effectiveness of the lie-detection method.

The videos used in the study were culled from a set of 132 videos that Frank recorded during a previous experiment. In Frank's original study, 132 interview subjects were given the option to "steal" a check made out to a political party or cause they strongly opposed.

Subjects who took the check but lied about it successfully to a retired law enforcement interrogator received rewards for themselves and a group they supported. Subjects caught lying incurred a penalty: they and their group received no money, but the group they despised did. Subjects who did not steal the check faced similar punishment if judged lying, but received a smaller sum for being judged truthful.

The interrogators opened each interview by posing basic, everyday questions. Following this mundane conversation, the interrogators asked about the check. At this critical point, the monetary rewards and penalties increased the stakes of lying, creating an incentive to deceive and do it well.

In their study on automated deceit detection, Nwogu and her colleagues selected 40 videotaped interrogations.

They used the mundane beginning of each interrogation to establish what normal, baseline eye movement looked like for each subject, focusing on the rate of blinking and the frequency with which people shifted their direction of gaze.

The scientists then used their automated system to compare each subject's baseline eye movements with eye movements during the critical section of each interrogation -- the point at which interrogators stopped asking everyday questions and began inquiring about the check.

If the machine detected unusual variations from baseline eye movements at this time, the researchers predicted the subject was lying.
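As a sketch of what that per-subject comparison could look like in code, the snippet below derives the two cited cues -- blink rate and gaze-shift frequency -- from hypothetical per-frame tracker output. The label names, blink flags, frame rate and change threshold are assumptions made for illustration, not the study's actual pipeline:

```python
def eye_features(gaze_labels, blink_flags, fps=30.0):
    """Return (blinks per second, gaze shifts per second) for one segment.

    gaze_labels: per-frame gaze directions, e.g. "left"/"right"/"center"
    blink_flags: per-frame booleans, True while the eyes are closed
    Both inputs are hypothetical tracker output used only for this sketch.
    """
    blink_flags = list(blink_flags)
    duration_s = len(gaze_labels) / fps
    # A blink event is a frame where the blink flag switches on.
    blinks = sum(1 for prev, cur in zip([False] + blink_flags[:-1], blink_flags)
                 if cur and not prev)
    # A gaze shift is a pair of consecutive frames with different directions.
    shifts = sum(1 for prev, cur in zip(gaze_labels, gaze_labels[1:]) if cur != prev)
    return blinks / duration_s, shifts / duration_s

def looks_deceptive(baseline_feats, critical_feats, max_relative_change=0.5):
    """Flag the critical section if either cue deviates sharply from baseline."""
    return any(abs(c - b) > max_relative_change * max(b, 1e-6)
               for b, c in zip(baseline_feats, critical_feats))
```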


User comments: 14


NotParker
1 / 5 (6) Mar 26, 2012
On this site, they already use a lie detecting machine.

All lies are labeled with "Vendicar_Decarian".
TheGhostofOtto1923
1 / 5 (4) Mar 26, 2012
With emerging technologies we will soon be able to ascertain the guilt or innocence of a defendant before he enters the courtroom. And with rfid networks implanted and imbedded within everyone and everything of value, crime will soon be impossible. Preventing and repairing damage to prenatal brains will make the compulsion rare. Lawyers and prisons will no longer be necessary.
Kedas
1 / 5 (1) Mar 26, 2012
Practical applications
Imagine having cameras connected to your PC in your house, and the PC checking for lies.
You get the following dialogue:
Woman: Do I look fat in this dress?
Man: No!
Computer Voice: THAT'S A LIE!
Turritopsis
1 / 5 (1) Mar 26, 2012
Due to genetic differences, neurochemistry in response to stimuli is not the same in any two individuals; these differences become more pronounced and apparent when examining a sociopath's neural sensory responses. By and large, though (the societal norm), a 2nd-tier sensory system can be devised (1st stimulus gets a 1st response from the human being; 2nd sensor (computer) is stimulated by the human and derives a conclusion based on said human's signals).

The program can be devised to pick up visual cues in the liar (e.g., sweating) or auditory cues (e.g., a cracking voice), and it can extrapolate neural activity indicating which brain region is active (if the creative region is active, the story is being fabricated).

The limitation rests on two factors:

1. The programmer's understanding of the lying process.
2. The individuality of the subject. No two humans are alike. Just because someone appears to be lying does not necessarily mean that they are.
Turritopsis
1 / 5 (1) Mar 26, 2012
In other words, no human derived machine can ever with 100% accuracy differentiate between a truth and a lie.
nkalanaga
4.5 / 5 (2) Mar 26, 2012
There is one problem with even perfect "lie detectors" in legal cases. If the witness truly believes what they say, it will be judged truthful, even though it may be false. One can tell if a person is knowingly making a false statement, but no machine can tell if the statement is actually true.

On the other hand, a human judge has the same problem, and even eliminating just the deliberate lies would be a big improvement.
TheGhostofOtto1923
1.4 / 5 (7) Mar 26, 2012
Due to genetic differences, neurochemistry in response to stimuli is not the same in any two individuals
Depends entirely upon the stimuli and the nature of the analysis.
these differences become more pronounced and apparent when examining a sociopath's neural sensory responses.
How do you figure?
2nd sensor (computer) is stimulated by the human and derives a conclusion based on said human's signals).
Huh? A series of tests will eliminate reasonable doubt.
In other words, no human derived machine can ever with 100% accuracy differentiate between a truth and a lie.
But they can and will do far better than lawyers, judges, and juries. Your argument is the old religionist-inspired 'We're all unique, unfathomable black boxes because we have a soul.' This is mumbo jumbo. We should even be able to play back the memory, conscious and unconscious, like a camcorder.

We will soon KNOW when people are lying. We will KNOW when they are guilty. Does this bother you?
TheGhostofOtto1923
1.4 / 5 (7) Mar 26, 2012
If the witness truly believes what they say, it will be judged truthful,
You are assuming you know the nature of the test. I assume scientists would be including such concerns as part of their work.

Technology should be able to discern between real memories and fabricated ones. Better than the defendant himself. Certainly better than any and all players in a courtroom.

Money can buy expert witnesses to cancel each other out. You're still left with credibility, lawyerly salesmanship and the politics of message-sending to determine justice. I would prefer machine judgment any day. Or throw me in the moat and see if god throws me back.
nkalanaga
5 / 5 (1) Mar 26, 2012
True, but I was thinking of a witness who believes, based on their experience and senses, that their statement is true.

Example: You see a person, carrying a gun, run into a building. You know that there is only one person supposed to be in that building. You then hear a gunshot, see the same person run out, and enter the building. And the person you know is supposed to be there is on the floor, dead, with a bullet hole in them. There is no other person in the building, and no other gun in sight. Would you testify that the person you saw was the shooter?

It would be reasonable for you to believe that, based on your own observations and knowledge, and the lack of evidence to the contrary. However, while very unlikely, the shooter could have been a third person, who entered and left the building on a side you weren't watching, or who fired through an open window out of your sight.

The difference is between a lie and an honest mistake. The mistakes will still get through.
Turritopsis
1 / 5 (1) Mar 26, 2012
The only true machine would have to be a quantum one, completely self-programming. The machine would have to take readings of every cell in the body and readings of the individual's entire genome, and in real time compare the individual's genome with his/her functioning. This would allow you access into the human being for extraction of experiences. Even if the individual's wiring is off (false memories or interpretations; in other words, even if the subject is deluded), such a computational system would collect the data, not the individual's accounts, interpretations or extrapolations. Therefore, as is the case for judgements, the system would analyze the TRUE data and leave the person's interpretations and reactions out of the event.

Picking up visual cues by analyzing the direction of eyerolls is very vague and generalized. A good judge should try to fully analyze all variables involved in the event, not just base his judgement on the direction of the eye roll when asking "did you do it?"
satyricon
1 / 5 (1) Mar 27, 2012
What happens to those who suffer from schizophrenia and other types of psychosis and hallucinations?
Birger
not rated yet Mar 27, 2012
I assume sociopaths are among the 17.5% that do not show up as liars...
TheGhostofOtto1923
1 / 5 (4) Mar 27, 2012
Example: You see a person, carrying a gun, run into a building. You know that there is only one person supposed to be in that building. You then hear a gunshot, see the same person run out, and enter the building. And the person you know is supposed to be there is on the floor, dead, with a bullet hole in them. There is no other person in the building, and no other gun in sight. Would you testify that the person you saw was the shooter?
Witness testimony would be far less important when science can examine the accused and determine what his memories are. Self-incrimination? Lawyers and lawmakers who insist on this argument would have to be made to yield to the independent objectivity of memory WITHOUT influence from its owner. Lawyers will need to find new work. Laws will be made by machines instead of lawyer politicians seeking to feed their brethren.

'In the place of justice - wickedness was there.' ecc3
TheGhostofOtto1923
1 / 5 (4) Mar 27, 2012
I assume sociopaths are among the 17.5% that do not show up as liars...
No, they will clearly show up as sociopaths, and their own peculiar responses will be well documented and discernible. Their memories should be as readable as anyone else's. These tests should ultimately circumvent the emotions entirely, as emotions can be controlled with practice and training.
