Privacy fears over artificial intelligence as crimestopper

November 12, 2017 by Rob Lever
A display shows a vehicle and person recognition system for law enforcement during the NVIDIA GPU Technology Conference, which showcases artificial intelligence, deep learning, virtual reality and autonomous machines

Police in the US state of Delaware are poised to deploy "smart" cameras in cruisers to help authorities detect a vehicle carrying a fugitive, missing child or straying senior.

The video feeds will be analyzed using artificial intelligence to identify vehicles by license plate or other features and "give an extra set of eyes" to officers on patrol, says David Hinojosa of Coban Technologies, the company providing the equipment.

"We are helping officers keep their focus on their jobs," said Hinojosa, who touts the new technology as a "dashcam on steroids."

The program is part of a growing trend of using vision-based AI to thwart crime and improve public safety, a trend that has stirred concerns among privacy and civil liberties activists who fear the technology could lead to secret "profiling" and misuse of data.

US-based startup Deep Science is using the same technology to help retail stores detect in real time if an armed robbery is in progress, by identifying guns or masked assailants.

Deep Science has pilot projects with US retailers, enabling automatic alerts in the case of robberies, fire or other threats.
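A pipeline like the one Deep Science describes can be pictured as a loop that runs a detector over each video frame and raises an alert whenever a threat class appears with high confidence. The sketch below is purely illustrative: the detector is a stub, and the threat labels, confidence threshold and frame format are all assumptions, not Deep Science's actual system.

```python
# Illustrative sketch of a vision-based alert loop. In a real system,
# detect_objects() would run a trained neural network on each frame;
# here it is a stub that reads pre-labeled test data.

THREAT_LABELS = {"gun", "masked_person", "fire"}  # assumed threat classes

def detect_objects(frame):
    """Stub for a per-frame object detector; returns (label, confidence) pairs."""
    return frame.get("objects", [])

def scan_feed(frames, threshold=0.8):
    """Yield an alert for every frame containing a high-confidence threat."""
    for i, frame in enumerate(frames):
        for label, confidence in detect_objects(frame):
            if label in THREAT_LABELS and confidence >= threshold:
                yield {"frame": i, "label": label, "confidence": confidence}

if __name__ == "__main__":
    feed = [
        {"objects": [("shopper", 0.95)]},                      # benign frame
        {"objects": [("gun", 0.91), ("masked_person", 0.85)]}, # threat frame
    ]
    for alert in scan_feed(feed):
        print(alert)
```

In practice the interesting engineering is in the detector itself and in suppressing false alarms; the loop structure, though, is this simple.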

The technology can monitor for threats more efficiently and at a lower cost than human security guards, according to Deep Science co-founder Sean Huver, a former engineer for DARPA, the Pentagon's long-term research arm.

"A common problem is that security guards get bored," he said.

Until recently, most predictive analytics relied on inputting numbers and other data to interpret trends. But advances in visual recognition are now being used to detect firearms, specific vehicles or individuals to help law enforcement and private security.

Elliot Hirsch of Deep Science holds a fake gun as he demonstrates the company's security system to automatically detect firearms and thieves

Recognize, interpret the environment

Saurabh Jain is a product manager in the computer graphics group at Nvidia, which makes computer chips for such systems and which held a recent conference in Washington with its technology partners.

He says the same computer vision technologies are used for self-driving vehicles, drones and other autonomous systems, to recognize and interpret the surrounding environment.

Nvidia has some 50 partners who use its supercomputing module called Jetson or its Metropolis software for security and related applications, according to Jain.

One of those partners, California-based Umbo Computer Vision, has developed an AI-enhanced security monitoring system which can be used at schools, hotels or other locations, analyzing video to detect intrusions and threats in real-time, and sending alerts to a security guard's computer or phone.

Israeli startup Briefcam meanwhile uses similar technology to interpret video surveillance footage.

"Video is unstructured, it's not searchable," explained Amit Gavish, Briefcam's US general manager. Without artificial intelligence, he says, "you had to go through hundreds of hours of video with fast forward and rewind."

"We detect, track, extract and classify each object in the video. So it becomes a database."

This can enable investigators to quickly find targets from video surveillance, a system already used by law enforcement in hundreds of cities around the world, including Paris, Boston and Chicago, Gavish said.
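The "video becomes a database" idea can be illustrated with a toy index: every detected object is stored as a structured record that investigators can query by attribute instead of scrubbing through footage. This is a hypothetical sketch; the class names, record fields and query interface are invented for illustration and are not Briefcam's API.

```python
# Toy index illustrating how classified video detections become queryable
# records. Timestamps, classes and attributes are invented example data.

from collections import defaultdict

class VideoIndex:
    def __init__(self):
        self.records = []
        self.by_class = defaultdict(list)  # fast lookup per object class

    def add_detection(self, timestamp, obj_class, attributes):
        """Store one detection as a structured record."""
        record = {"t": timestamp, "class": obj_class, **attributes}
        self.records.append(record)
        self.by_class[obj_class].append(record)

    def query(self, obj_class, **filters):
        """Return records of a class matching every given attribute filter."""
        return [r for r in self.by_class[obj_class]
                if all(r.get(k) == v for k, v in filters.items())]

index = VideoIndex()
index.add_detection(12.5, "vehicle", {"color": "red", "type": "sedan"})
index.add_detection(40.0, "vehicle", {"color": "blue", "type": "truck"})
index.add_detection(41.2, "person", {"clothing": "dark hoodie"})

# Instead of rewatching footage, ask: "show me every red vehicle"
red_vehicles = index.query("vehicle", color="red")
```

A query like `index.query("vehicle", color="red")` returns the matching records with their timestamps, which is what lets investigators jump straight to the relevant moments.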

"It's not only saving time. In many cases they wouldn't be able to do it because people who watch video become ineffective after 10 to 20 minutes," he said.

Facial recognition can be useful for law enforcement and public safety but raises questions about secret profiling

'Huge privacy issues'

Russia-based startup Vision Labs employs the Nvidia technology for facial recognition systems that can be used to identify potential shoplifters or problem customers in casinos or other locations.

Vadim Kilimnichenko, project manager at Vision Labs, said the company works with law enforcement in Russia as well as commercial clients.

"We can deploy this anywhere through the cloud," he said.

Customers of Vision Labs include banks seeking to prevent fraud, which can use face recognition to determine if someone is using a false identity, Kilimnichenko said.
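Identity checks of this kind typically rest on comparing numeric face "embeddings": a recognition model (not shown here) turns each face image into a vector, and two vectors that are sufficiently similar are treated as the same person. The sketch below is a minimal, hypothetical illustration; the vectors and threshold are made up, and real systems use high-dimensional embeddings from a trained network.

```python
# Hypothetical sketch of embedding-based identity verification.
# A real face-recognition model would produce the vectors; here they
# are small hand-written examples.

import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_person(embedding, enrolled, threshold=0.9):
    """True if the live face plausibly matches the enrolled identity."""
    return cosine_similarity(embedding, enrolled) >= threshold

enrolled = [0.1, 0.8, 0.3]        # vector on file for the claimed identity
live_match = [0.12, 0.79, 0.31]   # near-identical vector: likely a match
live_fraud = [0.9, 0.1, 0.05]     # very different vector: flag for review
```

With these example vectors, `same_person(live_match, enrolled)` is true and `same_person(live_fraud, enrolled)` is false; a bank would treat the second case as a possible false identity and escalate to a human check.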

For Marc Rotenberg, president of the Electronic Privacy Information Center, the rapid growth in these technologies raises privacy risks and calls for regulatory scrutiny over how data is stored and applied.

"Some of these techniques can be helpful but there are huge privacy issues when systems are designed to capture identity and make a determination based on personal data," Rotenberg said.

"That's where issues of secret profiling, bias and accuracy enter the picture."

Rotenberg said the use of AI systems in criminal justice calls for scrutiny to ensure legal safeguards, transparency and procedural rights.

In a blog post earlier this year, Shelly Kramer of Futurum Research argued that AI holds great promise for law enforcement, be it for surveillance, scanning social media for threats, or using "bots" as lie detectors.

"With that encouraging promise, though, comes a host of risks and responsibilities."



Comments

flashgordon, Nov 12, 2017
DeepMind A.I. . . . Quantum computers (hundreds of qubits by the end of 2018 alone) . . . I don't even have to say anything.

Boy, there's going to be some big surprises!

TheGhostofOtto1923, Nov 13, 2017
"Some of these techniques can be helpful but there are huge privacy issues when systems are designed to capture identity and make a determination based on personal data," Rotenberg said.

-No there aren't. Replacing humans with AI increases privacy and reduces the opportunity to exploit. Exposing criminals in the act increases personal security and mutual trust in society, leading to more privacy, not less.

The only people to object to this are those who want to retain their ability to cheat and victimize ie psychopaths.
KBK, Nov 14, 2017
I believe you are correct Otto. No real news there, just observation of a coming domino.

The next domino is: Abuse of the public via said mechanism and lever.

Where a fascism/totalitarianism has a will - fascism/totalitarianism..will find a way.

When developing cutting edge technology, I find it interesting or try to openly note, loudly..in a public place, so people can understand...that the psychopaths, the war mongers.... are almost always the first to throw money at new technology, in order to add it to their war machine..in order to gain advantage. I've seen it first hand and anyone with an ounce of mental capacity and a minimal eye for observation can also note it.

Say it out loud, put it on the table.

Politics and corporate power (as bookends) is an involved blood brother to the war machine, hence, technology and abuse of technology in the service of 'barely cloaked totalitarianism', is always just around the corner - in the world of technological developments.
TheGhostofOtto1923, Nov 14, 2017
Mr black hoodie sloganeer:
"are almost always the first to throw money at new technology, in order to add it to their war machine"
You do love your slogans don't you? More fashion, less brain.

I think you misunderstand the purpose of military r&d. It is conducted with the understanding that if we don't exploit weapons tech first, someone else definitely will and they will use it against us.
"Say it out loud, put it on the table"
We have enemies in this world, mostly religionists whose growth rate makes their aggression in search of resources inevitable. And so we need to protect ourselves against them.

The question is, are you friend or foe?
https://youtu.be/btm46UnGFic

- We already know your weakness.
