US lawmakers said Google and Twitter failed to provide details and Facebook did not respond to a request on efforts by social media firms to block extremist content

Two US lawmakers berated social media firms Thursday for failing to provide specific information on their efforts to root out extremist content on their platforms.

The congressmen said Twitter and Google-owned YouTube provided incomplete responses and Facebook did not respond to the requests made by the House Committee on Homeland Security after mosque attacks in New Zealand that were livestreamed online.

Representative Bennie Thompson, who chairs the committee, and Max Rose, head of the subcommittee on intelligence and counterterrorism, said none of the companies was able to describe the resources dedicated to countering terrorism and extremism on their platforms.

"The fact that some of the largest corporations in the world are unable to tell us what they are specifically doing to stop terrorist and extremist content is not acceptable," they said in a joint statement.

"Domestic terrorism is on the rise both here and abroad, and of all forms of terrorism and extremism are increasingly turning to these to proliferate their message and spread their violent, hateful content. As we saw in New Zealand, Facebook failed and admitted as much."

A letter from YouTube said the video-sharing platform spends hundreds of millions of dollars and employs some 10,000 people to block or remove content which violates its policies on terrorism or incitement of violence.

But YouTube maintained that it would be "difficult and possibly misleading" to separate its counterterrorism efforts from overall expenditure to protect the site.

The company had manually reviewed a million videos suspected of violating policy on terrorist content in the first three months of 2019, it said, and removed fewer than 10 percent.

It added that its automated systems often remove videos before they are viewed.

Twitter said it had suspended more than 1.4 million accounts for violations related to the promotion of terrorism and that it enforces its ban on specific threats of violence or harm to others.

Thompson and Rose said the letters lacked specifics on how these firms tackle extremism.

"Broad platitudes and vague explanations of safety procedures aren't enough," they wrote. "We need a full accounting of what is being done."

The House panel asked in March for a briefing from Google, Facebook, Twitter and Microsoft after the attacks that killed 50 worshipers at mosques in Christchurch, which the assailant streamed on Facebook Live and which was then copied and reposted elsewhere.