As facial recognition use grows, so do privacy fears

July 8, 2018 by Rob Lever
A display shows a facial recognition system for law enforcement during the NVIDIA GPU Technology Conference in 2017 in Washington

The unique features of your face can allow you to unlock your new iPhone, access your bank account or even "smile to pay" for some goods and services.

The same technology, using algorithms generated by a facial scan, can allow police to find a wanted person in a crowd or match the image of someone in police custody to a database of known offenders.

Facial recognition came into play last month when a suspect arrested for a shooting at a newsroom in Annapolis, Maryland, refused to cooperate with police and could not immediately be identified using fingerprints.

"We would have been much longer in identifying him and being able to push forward in the investigation without that system," said Anne Arundel County police chief Timothy Altomare.

Facial recognition is playing an increasing role in law enforcement, border security and other purposes in the US and around the world.

While most observers acknowledge the merits of some uses of this biometric identification, the technology evokes fears of a "Big Brother" surveillance state.

Heightening those concerns are studies showing facial recognition may not always be accurate, especially for people of color.

A 2016 Georgetown University study found that half of American adults, some 117 million people, are in facial recognition databases, with few rules on how these systems may be accessed.

A growing fear for civil liberties activists is that law enforcement will deploy facial recognition in "real time" through drones, body cameras and dash cams.

"The real concern is police on patrol identifying law-abiding Americans at will with body cameras," said Matthew Feeney, specialist in emerging technologies at the Cato Institute, a libertarian think tank.

"This technology is of course improving but it's not as accurate as science fiction films would make you think."

A Chinese police officer in Zhengzhou in China's central Henan province wearing high-tech sunglasses that can spot suspects in a crowded train station, the newest use of facial recognition that has raised concerns among human rights groups
'Aggressive' deployments

China is at the forefront of facial recognition, using the technology to fine traffic violators and "shame" jaywalkers, with at least one arrest of a criminal suspect.

Clare Garvie, lead author of the 2016 Georgetown study, said that in the past two years, "facial recognition has been deployed in a more widespread and aggressive manner" in the US, including for border security and at least one international airport.

News that Amazon had begun deploying its Rekognition software to police departments sparked a wave of protests from employees and activists calling on the tech giant to stay away from law enforcement applications.

Amazon is one of dozens of tech firms involved in facial recognition; Microsoft, among others, offers the technology, and the US state of Maryland uses systems from Germany-based Cognitec and Japanese tech firm NEC.

Amazon maintains that it does not conduct surveillance or provide any data to law enforcement, but simply enables them to match images to those in its databases.

The tech giant also claims its facial recognition system can help reunite lost or abducted children with their families and stem human trafficking.

'Slippery slope'

Nonetheless, some say facial recognition should not be deployed by law enforcement because of the potential for errors and abuse.

That was an argument made by Brian Brackeen, founder and chief executive of the facial recognition developer Kairos.

"As the black chief executive of a software company developing facial recognition services, I have a personal connection to the technology, both culturally and socially," Brackeen said in a blog post on TechCrunch.

"Facial recognition-powered government surveillance is an extraordinary invasion of the privacy of all citizens—and a slippery slope to losing control of our identities altogether."

The screen of a computer with an automatic facial recognition system shows German Interior Minister Thomas de Maiziere in December 2017 visiting the Suedkreuz train station, where automatic facial recognition technologies are tested

The Georgetown study found facial recognition algorithms were five to 10 percent less accurate on African Americans than Caucasians.

Policy questions

Microsoft announced last month it had made significant improvements for facial recognition "across skin tones" and genders.

IBM meanwhile said it was launching a large-scale study "to improve the understanding of bias in facial analysis."

While more accurate facial recognition is generally welcomed, civil liberties groups say specific policy safeguards should be in place.

In 2015, several consumer groups dropped out of a government-private initiative to develop standards for facial recognition use, claiming the process was unlikely to develop sufficient privacy protections.

Cato's Feeney said a meaningful move would be to "purge these databases of anyone who isn't currently incarcerated or wanted for violent crime."

Jennifer Lynch, an attorney with the Electronic Frontier Foundation, said that the implications for police surveillance are significant.

"An inaccurate system will implicate people for crimes they did not commit. And it will shift the burden onto defendants to show they are not who the system says they are," Lynch said in a report earlier this year.

Lynch said there are unique risks of breach or misuse of this data, because "we can't change our faces."

Evan Selinger, a philosophy professor at the Rochester Institute of Technology, says the technology is too dangerous for law enforcement to use.

"It's an ideal tool for oppressive surveillance," Selinger said in a blog post.

"It poses such a severe threat in the hands of law enforcement that the problem cannot be contained by imposing procedural safeguards."
