US lawmakers: social media answers on extremist content too vague

US lawmakers said Google and Twitter failed to provide details and Facebook did not respond to a request on efforts by social media to block extremist content

Two US lawmakers berated social media firms Thursday for failing to provide specific information on their efforts to root out extremist content on their platforms.

The congressmen said Twitter and Google-owned YouTube provided incomplete responses and Facebook did not respond to the requests made by the House Committee on Homeland Security after mosque attacks in New Zealand that were livestreamed online.

Representative Bennie Thompson, who chairs the committee, and Max Rose, head of the subcommittee on intelligence and counterterrorism, said none of the companies was able to describe the resources dedicated to countering terrorism and extremism on their platforms.

"The fact that some of the largest corporations in the world are unable to tell us what they are specifically doing to stop terrorist and extremist content is not acceptable," they said in a joint statement.

"Domestic terrorism is on the rise both here and abroad, and terrorists and extremists of all forms are increasingly turning to these platforms to proliferate their message and spread their violent, hateful content. As we saw in New Zealand, Facebook failed and admitted as much."

A letter from YouTube said the video-sharing platform spends hundreds of millions of dollars and employs some 10,000 people to block or remove content which violates its policies on terrorist content or incitement of violence.

But YouTube maintained that it would be "difficult and possibly misleading" to separate its counterterrorism efforts from overall expenditure to protect the site.

The company had manually reviewed a million videos suspected of violating policy on terrorist content in the first three months of 2019, it said, and removed fewer than 10 percent.

It added that its automated systems often remove videos before they are viewed.

Twitter said it had suspended more than 1.4 million accounts for violations related to the promotion of terrorism and that it enforces its ban on specific threats of violence against others.

Thompson and Rose said the letters lacked specifics on how these firms tackle extremism.

"Broad platitudes and vague explanations of safety procedures aren't enough," they wrote. "We need a full accounting of what is being done."

The House panel asked in March for a briefing from Google, Facebook, Twitter and Microsoft after the attacks that killed 50 worshipers at mosques in Christchurch which the assailant streamed on Facebook Live, and which was then copied and reposted elsewhere.



© 2019 AFP

Citation: US lawmakers: social media answers on extremist content too vague (2019, May 2) retrieved 19 July 2019 from https://phys.org/news/2019-05-lawmakers-social-media-extremist-content.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

