Big Brother facial recognition needs ethical regulations

Will facial recognition software make the world a safer place, as tech firms are claiming, or will it make the marginalized more vulnerable and monitored? Credit: Shutterstock

My mother always said I had a face for radio. Thank God, as radio may be the last place in this technology-enhanced world where your face won't determine your social status or potential to commit a crime.

RealNetworks, a global leader in technology that enables the seamless digital delivery of audio and video files across the internet, has just released its latest computer vision offering: a machine-learning software package. The hope is that this new software will detect, and potentially predict, suspicious behaviour through facial recognition.

Called SAFR (Secure, Accurate Facial Recognition), the toolset has been marketed as a cost-effective way to smoothly blend into existing CCTV video monitoring systems. It will be able to "detect and match millions of faces in real time," specifically within school environments.

Ostensibly, RealNetworks sees its technology as something that can make the world safer. The catchy branding, however, masks the real ethical issues surrounding the deployment of facial detection systems. Some of those issues include questions about the inherent biases embedded within the code and, ultimately, how that captured data is used.

The Chinese model

Big Brother is watching. No other country in the world has more video surveillance than China. With 170 million CCTV cameras and some 400 million new ones being installed, it is a country that has adopted and deployed facial recognition in an Orwellian fashion.

In the near future, its citizens, and those of us who travel there, will be exposed to a vast and integrated network of facial recognition systems monitoring everything from the use of public transportation, to speeding to how much toilet paper one uses in the public toilet.

The most disturbing element so far is the recent introduction of facial recognition to monitor school children's behaviour within Chinese public schools.

As part of China's full integration of its equally Orwellian social credit system (an incentive program that rewards each citizen's commitment to the state's dictated morals), this fully integrated digital network will automatically identify a person. It can then determine one's ability to progress in society, and by extension the economic and social status of one's immediate family, by monitoring behaviour the state has not sanctioned.

In essence, facial recognition is making it impossible for those exposed to have the luxury of having a bad day.

Facial recognition systems now being deployed within Chinese schools are monitoring everything from classroom attendance to whether a child is daydreaming or paying attention. It is a full-on monitoring system that determines, to a large extent, a child's future without considering that some qualities, such as abstract thought, can't be easily detected, let alone looked upon favourably, by facial recognition.

It also raises some very uncomfortable notions of ethics or the lack thereof, especially towards more vulnerable members of society.

Need for public regulation

RealNetworks' launch of SAFR comes hot on the heels of Microsoft president Brad Smith's impassioned manifesto on the need for public regulation and corporate responsibility in the development and deployment of facial recognition technology.

Smith rightly pointed out that facial recognition tools are still somewhat skewed and have "greater error rates for women and people of colour." The problem is twofold. First, the people who write the code may unconsciously embed their cultural biases.

Second, the data sets currently available may lack the robustness required to ensure that people's faces aren't being misidentified or, even worse, prejudged through encoded bias, as is now beginning to happen in the Chinese school system.

In an effort to address these and myriad other related issues, Microsoft established an AI and Ethics in Engineering and Research (AETHER) Committee. This committee is also meant to help the company comply with the European Union's newly enforced General Data Protection Regulation (GDPR) and prepare for its eventual adoption, in some form, in North America.

Smith's ardent appeal rightly questions the current and intended future use and deployment of facial recognition systems, yet fails to address how Microsoft, or by extension other AI technology leaders, can eliminate biases within their base code or data sets from the outset.

Minority report

The features of our face are hardly more than gestures which force of habit has made permanent.—Marcel Proust, 1919

Like many technologies, Pandora has already left the box. If you own a smartphone and use the internet, you have already opted out of any basic notion of personal anonymity within Western society.

With GDPR now fully in force in Europe, visiting a website requires you to "opt in" to the possibility that it might collect your personal data. Facial recognition systems have no comparable mechanism, so we as a society are automatically "opted in" and thus completely at the mercy of how our faces are recorded, processed and stored by governmental, corporate or even privately deployed CCTV systems.

Facial recognition trials held in England by the London Metropolitan Police have consistently yielded a 98 per cent failure rate. Tests in South West Wales have fared only slightly better, with less than 10 per cent success.

Conversely, University of California, Berkeley, scientists have concluded that substantive facial variation is an evolutionary trait unique to humans. So where is the disconnect?

If, as Marcel Proust suggested, our lives and thus our personalities are uniquely identifiable by our faces, why can't facial recognition systems easily return positive results?

The answer goes back to how computer programming is written and the data sets used by that code to return a positive match. Inevitably, code is written to support an idealized notion of facial type.

As such, outlying variations, whether naturally occurring or the result of physical or mental trauma, represent only a small fraction of the infinite possible facial variations in the world. The data sets assume we are homogeneous doppelgängers of one another, without addressing the micro-variations of people's faces.
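The matching step described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual system: every name, feature vector and threshold below is hypothetical. It shows how a matcher trained on a narrow, tightly clustered gallery can confidently "identify" someone who was never enrolled at all.

```python
# Illustrative sketch of a generic nearest-neighbour matcher of the
# kind facial recognition systems build on. All values are hypothetical.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def best_match(probe, gallery, threshold=1.0):
    """Return the closest enrolled identity, or None if too far away."""
    name, vec = min(gallery.items(), key=lambda item: distance(probe, item[1]))
    return name if distance(probe, vec) < threshold else None

# A gallery enrolled from a narrow, homogeneous population occupies a
# tight cluster of the feature space.
gallery = {
    "person_a": [0.9, 1.0, 1.1],
    "person_b": [1.0, 1.1, 1.0],
}

# Someone never enrolled, whose features merely fall near the cluster,
# is confidently (and wrongly) identified as person_b.
print(best_match([1.2, 1.2, 1.2], gallery))  # person_b  (a false match)

# Only a face far outside the cluster is rejected.
print(best_match([5.0, 5.0, 5.0], gallery))  # None
```

The design flaw the article points to lives in the gallery and the threshold: if the enrolled data under-represents real facial variation, the distance test cannot distinguish "same person" from "merely similar to the cluster."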

If that's the case, we are all subject to the possibility that our faces, as interpreted by the ever-increasing deployment of immature facial recognition systems, will betray the reality of who we are.


Provided by The Conversation

This article was originally published on The Conversation. Read the original article.

Citation: Big Brother facial recognition needs ethical regulations (2018, July 23) retrieved 21 October 2019 from https://phys.org/news/2018-07-big-brother-facial-recognition-ethical.html