Fake news: algorithms in the dock

Algorithms run our Google searches, our Facebook newsfeed, recommend articles or videos to us and sometimes censor questionable content

At the heart of the spread of fake news are the algorithms used by search engines, websites and social media which are often accused of pushing false or manipulated information regardless of the consequences.

What are algorithms?

They are the invisible but essential computer programmes and formulas that increasingly run modern life, designed to repeatedly solve recurrent problems or to make decisions on their own.

Their ability to filter and seek out links in gigantic databases means it would be impossible to run global markets without them, but they can also be refined down to produce personalised quotes on everything from mortgages to plane tickets.

They also run our Google searches, our Facebook newsfeed, recommend articles or videos to us and sometimes censor questionable content because it may contain violence, pornography or racist language.

Other algorithms charged with the most complex and sensitive tasks can be opaque "black boxes" which develop their own artificial intelligence based on our data.

A skewed view of the world?

"Algorithms can help us find our way through the huge amount of information on the internet," said Margrethe Vestager, the European commissioner for competition.

"But the problem is that we only see what these algorithms—and the companies that use them—choose to show us," she added.

In organising your online content, algorithms also tend to create "filter bubbles", insulating us from opposing points of view.

During the US presidential election in 2016, Facebook was accused of helping Donald Trump by allowing often false information about his rival Hillary Clinton to circulate online, closing people into a news bubble.

Algorithms also tend to make extreme opinions "and fringe views more visible than ever", according to Berlin-based Lorena Jaume-Palasi, founder of the Algorithm Watch group.

However, their effects can be difficult to measure, she warned, saying that algorithms alone are not to blame for the rise in nationalism in Europe.

Spreading fake news?

Social media algorithms tend to push the most viewed content without checking whether it is true, which is why they magnify the impact of fake news.
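The mechanism described above can be sketched as a toy example. This is not any platform's actual code; the data, the `verified` flag and the scoring rule are invented purely to illustrate the point that ranking by engagement alone never consults truthfulness:

```python
# Toy illustration of engagement-based ranking: items are ordered purely
# by view count, so popularity decides visibility and accuracy never
# enters the calculation. All data here is made up.

posts = [
    {"title": "Well-sourced report", "views": 1_200, "verified": True},
    {"title": "Viral conspiracy clip", "views": 98_000, "verified": False},
    {"title": "Fact-checked analysis", "views": 3_400, "verified": True},
]

def rank_feed(posts):
    # Sort by views only; note the "verified" field is ignored entirely.
    return sorted(posts, key=lambda p: p["views"], reverse=True)

for post in rank_feed(posts):
    print(post["views"], post["title"])
```

In this sketch the unverified viral clip tops the feed simply because it has the most views, which is the dynamic critics say rewards sensational content.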

On YouTube in particular, conspiracy theory videos get a great deal more traffic than accurate and properly sourced ones, said Guillaume Chaslot, one of the Google-owned platform's former engineers.

These videos, which may claim that the moon landings or climate change are lies, get far more views and comments, keeping users on the platform longer and undermining credible, traditional media, Chaslot insisted.

More ethical algorithms?

Some observers believe that algorithms could be programmed "to serve human freedom", with many non-governmental groups demanding far more transparency.

"Coca-Cola doesn't reveal its formula but its products are tested for their effect on our health," Jaume-Palasi argued, insisting on the need for clear regulation.

The French privacy protection body, the CNIL, last year recommended state oversight of algorithms and that there should be a real push to educate people "so they understand the cogs of the (information technology) machine".

New European data protection rules also allow people to contest the decision of an algorithm and "demand a human intervention" in case of conflict.

Some internet giants have themselves begun to act to some degree: Facebook has started an effort to automatically label suspicious posts, while YouTube is reinforcing its "human controls" on videos aimed at children.

However, former Silicon Valley insiders who make up the Center for Humane Technology, which was set up to combat tech's excesses, have warned that "we can't expect attention-extraction companies like YouTube, Facebook, Snapchat, or Twitter to change, because it's against their business model."


© 2018 AFP

Citation: Fake news: algorithms in the dock (2018, July 14) retrieved 23 October 2019 from https://phys.org/news/2018-07-fake-news-algorithms-dock.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.


User comments

Jul 14, 2018
By definition, "fake news" is lies. To control "fake news", then, is to control lies. But there are many types of lies. Among other things, though, suggestively, almost all who claim they are interested in controlling "fake news" refuse to acknowledge a number of these types.
One type: deceit by omission. It is reported that a big guy is beating up a little guy. Perfectly true. But it's left out that the little guy is one of a gang who tried to abduct the big guy's wife, using guns. The story has a completely different sense. The "press" uses this constantly, accusing Trump of making comments contemptuous of women, but not mentioning that all politicians made the same comments in "locker room talk".
The Catholic Church says telling an ugly truth about a rich person keeping them from scamming the people is a "lie" called "detraction". The "press" use that now, saying Trump shouldn't tell ugly facts about "allies" in NATO.

Jul 14, 2018
"These videos, which may claim that the moon landings or climate change are lies, ..."

The climate change believers just can't resist injecting their religion into everything.

Here moon landings and climate change are equated as undeniable hard facts. That is patently false. The moon landings happened in the past. There is evidence of them. Climate change is speculation about the future. There is zero evidence from the future. It is open to debate.
