How algorithms (secretly) run the world

February 11, 2017

When you browse online for a new pair of shoes, pick a movie to stream on Netflix or apply for a car loan, an algorithm likely has a say in the outcome.

These complex mathematical formulas are playing a growing role in all walks of life: from detecting skin cancers to suggesting new Facebook friends, deciding who gets a job, how police resources are deployed, who gets insurance at what cost, and who lands on a "no fly" list.


Algorithms are being used—experimentally—to write news articles from raw data, while Donald Trump's presidential campaign was helped by behavioral marketers who used an algorithm to locate the highest concentrations of "persuadable voters."

But while such automated tools can inject a measure of objectivity into otherwise subjective decisions, fears are rising over the lack of transparency they can entail, and pressure is growing to apply standards of ethics or "accountability."

Data scientist Cathy O'Neil cautions against "blindly trusting" formulas to determine a fair outcome.

"Algorithms are not inherently fair, because the person who builds the model defines success," she said.

Amplifying disadvantages

O'Neil argues that while some algorithms may be helpful, others can be nefarious. In her 2016 book, "Weapons of Math Destruction," she cites some troubling examples in the United States:

- Public schools in Washington DC in 2010 fired more than 200 teachers—including several well-respected instructors—based on scores from an algorithmic formula that evaluated their performance.

- A man diagnosed with bipolar disorder was rejected for employment at seven major retailers after a third-party "personality" test deemed him a high risk based on its algorithmic classification.

- Many jurisdictions are using "predictive policing" to shift resources to likely "hot spots." O'Neil says that depending on how data is fed into the system, this could lead to the discovery of more minor crimes and a "feedback loop" that stigmatizes poor communities (a toy simulation of this loop follows the list).

- Some courts rely on computerized risk formulas to determine jail sentences and parole, which may discriminate against minorities by taking into account "risk" factors such as their neighborhoods and their friends' or family's links to crime.

- In the world of finance, brokers "scrape" data from online activity and other sources in new ways to make decisions on credit or insurance. This too often amplifies prejudice against the disadvantaged, O'Neil argues.
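
To make the "feedback loop" in the predictive-policing example concrete, here is a minimal, hypothetical simulation (an illustration for this article, not O'Neil's actual model). Two areas have identical true offense rates, but patrols are allocated in proportion to previously recorded crimes, and more patrols mean more offenses get recorded:

```python
# Toy, hypothetical feedback-loop simulation (illustrative only).
# Both areas have the SAME true offense rate; only the initial
# recorded counts differ.

TRUE_OFFENSES = 100            # actual offenses per period in each area
recorded = {"A": 60, "B": 40}  # area A starts with more recorded incidents
TOTAL_PATROLS = 50

for period in range(1, 11):
    total_recorded = sum(recorded.values())
    for area in recorded:
        share = recorded[area] / total_recorded  # patrol allocation rule
        patrols = TOTAL_PATROLS * share
        detection = min(1.0, 0.01 * patrols)     # more patrols, more records
        recorded[area] += int(TRUE_OFFENSES * detection)
    print(period, recorded)
```

Even though both areas offend at the same underlying rate, the area that started with more recorded incidents keeps drawing more patrols, and the recorded gap between them only widens—the data appears to confirm the original disparity.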

Her findings were echoed in a White House report last year warning that algorithmic systems "are not infallible—they rely on the imperfect inputs, logic, probability, and people who design them."

The report noted that data systems can ideally help weed out human bias but warned against algorithms "systematically disadvantaging certain groups."

Digital crumbs

Zeynep Tufekci, a University of North Carolina professor who studies technology and society, said automated decisions are often based on data collected about people, sometimes without their knowledge.

"These computational systems can infer all sorts of things about you from your digital crumbs," Tufekci said in a recent TED lecture.

"They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy."

Such insights may be useful in certain contexts—such as helping medical professionals diagnose postpartum depression—but unfair in others, she said.
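
As a rough illustration of how such inference can work in principle, the sketch below trains a standard logistic regression to predict a trait from binary "page like" features. Everything here is synthetic and assumed for illustration; it is not Tufekci's experiment or any real platform's model:

```python
# Hypothetical sketch: trait inference from "digital crumbs" via
# logistic regression over binary page-like features. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 20

likes = rng.integers(0, 2, size=(n_users, n_pages))  # who liked which page
hidden_assoc = rng.normal(size=n_pages)              # weak per-page signals
noise = rng.normal(scale=0.5, size=n_users)
trait = ((likes @ hidden_assoc + noise) > 0).astype(int)

model = LogisticRegression().fit(likes[:800], trait[:800])
print("held-out accuracy:", model.score(likes[800:], trait[800:]))
# Many individually weak signals combine into an accurate prediction,
# which is why innocuous-looking data can reveal sensitive traits.
```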

Part of the problem, she said, stems from asking computers to answer questions that have no single right answer.

"They are subjective, open-ended and value-laden questions, asking who should the company hire, which update from which friend should you be shown, which convict is more likely to reoffend."

The EU model?

Frank Pasquale, a University of Maryland law professor and author of "The Black Box Society: The Secret Algorithms That Control Money and Information," shares the same concerns.

He suggests one way to remedy unfair effects may be to enforce existing laws on consumer protection or deceptive practices.

Pasquale points to the European Union's data protection law, which from next year will create a "right to explanation" when consumers are affected by an algorithmic decision, as a model that could be expanded.

This would "either force transparency or it will stop algorithms from being used in certain contexts," he said.

Alethea Lange, a policy analyst at the Center for Democracy and Technology, said the EU plan "sounds good" but "is really burdensome" and risked proving unworkable in practice.

She believes education and discussion may be more important than enforcement in developing fairer algorithms.

Lange said her organization worked with Facebook, for example, to modify a much-criticized formula that allowed advertisers to use "ethnic affinity" in their targeting.

Scapegoat

Others meanwhile caution that algorithms should not be made a scapegoat for societal ills.

"People get angry and they are looking for something to blame," said Daniel Castro, vice president at the Information Technology and Innovation Foundation.

"We are concerned about bias, accountability and ethical decisions but those exist whether you are using algorithms or not."


8 comments


betterexists
1 / 5 (1) Feb 11, 2017
Rather, after trying "10 Unique Things You Can Do With Augmented Reality" and "Portal in Augmented Reality" with HoloLens, I realized that for at least 2 to 3 years it should be used for pets when owners are not at home. Of course, they should NOT touch smartphones, but tap on the floor or wall (a wall is nothing to them).
truth4life
not rated yet Feb 12, 2017
The last comment says it all: a puppeteer-and-puppet approach to acquire an advantage over another human being. All for what? To make a profit.
jdman
not rated yet Feb 13, 2017
Sure wish they would come up with an algorithm that could and would pick the winners in a dog race. That would make my retirement just dandy.
434a
not rated yet Feb 13, 2017
"Sure wish they would come up with an algorithm that could and would pick the winners in a dog race. That would make my retirement just dandy." ~ There is; unfortunately, the bookies have it!
hmoulding
not rated yet Feb 13, 2017
Geez, Center for Democracy and Technology and Information Technology and Innovation Foundation sure sound like industry mouthpieces. Why do they even get mentioned in an article like this???
hmoulding
not rated yet Feb 13, 2017
"unfortunately the bookies have it" ~ No, actually, the bookies assume you have it, and adjust the odds based on the actual bets coming in.
434a
not rated yet Feb 14, 2017
"unfortunately the bookies have it" ~ No, actually, the bookies assume you have it, and adjust the odds based on the actual bets coming in.


My comment was in part tongue in cheek, one gambler to another if you like.
I agree with you that the odds will change as punters bet however the book needs to be made prior to any betting taking place. That job as far as I am aware is done by an odds setter/compiler. However, there has been a lot of work done to build algorithms that can do the job more efficiently and cheaply.
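
To illustrate the mechanism being discussed in this exchange, here is a toy sketch with hypothetical numbers (not any real bookmaker's pricing method): a compiler opens a two-dog book from estimated win probabilities plus a margin, then re-prices as money arrives on each runner:

```python
# Toy book-making sketch (hypothetical numbers, illustrative only):
# price from estimated probabilities plus a margin, then re-price
# from the implied probabilities of the actual stakes.

MARGIN = 1.10  # ~10% "overround" added on top of fair probabilities

def decimal_odds(probabilities):
    # Decimal odds are the payout multiplier per unit staked.
    return {dog: round(1 / (p * MARGIN), 2) for dog, p in probabilities.items()}

opening_view = {"dog_1": 0.60, "dog_2": 0.40}  # the odds compiler's estimate
print("opening odds:", decimal_odds(opening_view))

stakes = {"dog_1": 800.0, "dog_2": 200.0}      # punters pile onto dog_1
total_staked = sum(stakes.values())
implied = {dog: s / total_staked for dog, s in stakes.items()}
print("re-quoted odds:", decimal_odds(implied))  # dog_1 shortens, dog_2 drifts
```

In this sketch the heavily backed dog_1 shortens and dog_2 drifts, which is the balancing behavior described above.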
lizmcgehee
not rated yet Feb 17, 2017
Follow-up question: my understanding from the article is that the EU data protection law comes into play next year. Does this not mean that companies that sweep up data will need to create systems that comply with those laws if they want to operate in that market? Why is it more burdensome to create & maintain two separate systems, one for Europe & one for the US, vs just maintaining the one system that complies with EU law? I'm curious why the reporter didn't ask a follow-up question to Ms. Alethea Lange's comment that the EU law "sounds good" but "is really burdensome." I don't see technology companies pulling out of Europe, so they must be able to comply, right?
