(AP)—Google is refusing a White House request to take down an anti-Muslim clip on YouTube, but is restricting access to it in certain countries.
YouTube said in a statement Friday that the video is widely available on the Web and is "clearly within our guidelines and so will stay on YouTube."
The short film "Innocence of Muslims" denigrates Islam and the Prophet Muhammad. It played a role in igniting mob violence against U.S. embassies across the Middle East, and it has been blamed for contributing to violence in Libya, where the U.S. ambassador and three others were killed, though the exact cause of the attacks is under investigation.
U.S. and Libyan officials are investigating whether the protests in Libya were a cover for militants, possibly al-Qaida sympathizers, to carry out a coordinated attack on the U.S. Consulate in Benghazi and kill Americans. Washington has deployed FBI investigators to try to track down the militants behind the attack.
As the protests intensified over the video, YouTube blocked access to the clip in Libya and Egypt, citing "the very sensitive situations" in those two countries. YouTube later also blocked access to the video in India and Indonesia after their governments told the company the video broke their laws.
The controversy underscores how some Internet firms have been thrust into debates over the limits of free speech.
In its Friday statement, YouTube said that outside of Libya, Egypt, India and Indonesia, the video will remain on its website.
"We work hard to create a community everyone can enjoy and which also enables people to express different opinions," the YouTube statement said. "This can be a challenge because what's OK in one country can be offensive elsewhere. This video—which is widely available on the Web—is clearly within our guidelines and so will stay on YouTube. However, we've restricted access to it in countries where it is illegal such as India and Indonesia as well as in Libya and Egypt, given the very sensitive situations in these two countries. This approach is entirely consistent with principles we first laid out in 2007."
YouTube's community guidelines say the company encourages free speech and defends everyone's right to express unpopular points of view. But YouTube says it does not permit hate speech.
"'Hate speech' refers to content that promotes hatred against members of a protected group," the guidelines say. "Sometimes there is a fine line between what is and what is not considered hate speech. For instance, it is generally okay to criticize a nation, but not okay to make insulting generalizations about people of a particular nationality."