Published: September 13, 2012
SAN FRANCISCO — As violence spread in the Arab world over a video on YouTube ridiculing the Prophet Muhammad, Google, the owner of YouTube, blocked access to it in two of the countries in turmoil, Egypt and Libya, but did not remove the video from its Web site.
Google said it decided to block the video in response to violence that killed four American diplomatic personnel in Libya. The company said its decision was unusual, made because of the exceptional circumstances. Its policy is to remove content only if it is hate speech that violates its terms of service, or in response to valid court orders or government requests. And it said it had determined that under its own guidelines, the video was not hate speech.
Millions of people across the Muslim world, though, viewed the video as one of the most inflammatory pieces of content to circulate on the Internet. From Afghanistan to Libya, the authorities have been scrambling to contain an outpouring of popular outrage over the video and calling on the United States to take measures against its producers.
Google’s action raises fundamental questions about the control that Internet companies have over online expression. Should the companies themselves decide what standards govern what is seen on the Internet? How consistently should these policies be applied?
“Google is the world’s gatekeeper for information so if Google wants to define the First Amendment to exclude this sort of material then there’s not a lot the rest of the world can do about it,” said Peter Spiro, a constitutional and international law professor at Temple University in Philadelphia. “It makes this episode an even more significant one if Google broadens the block.”
He added, though, that “provisionally,” he thought Google made the right call. “Anything that helps calm the situation, I think is for the better.”
Under YouTube’s terms of service, hate speech is speech against individuals, not against groups. Because the video mocks Islam but not Muslim people, it has been allowed to stay on the site in most of the world, the company said Thursday.
“This video — which is widely available on the Web — is clearly within our guidelines and so will stay on YouTube,” it said. “However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries.”
Though the video is still visible in other Arab countries where violence has flared, YouTube is closely monitoring the situation, according to a person briefed on YouTube’s decision-making who was not authorized to speak publicly. The Afghan government has asked YouTube to remove the video, and some Google services were blocked there Thursday.
Google is walking a precarious line, said Kevin Bankston, director of the free expression project at the Center for Democracy and Technology, a nonprofit in Washington that advocates for digital civil liberties.
On the one hand, he said, blocking the video “sends the message that if you violently object to speech you disagree with, you can get it censored.” At the same time, he said, “the decision to block in those two countries specifically is kind of hard to second guess, considering the severity of the violence in those two areas.”
“It seems they’re trying to balance the concern about censorship with the threat of actual violence in Egypt and Libya,” he added. “It’s a difficult calculation to make and highlights the difficult positions that content platforms are sometimes put in.”
All Web companies that allow people to post content online — Facebook and Twitter as well as Google — have grappled with issues involving content. The questions are complicated by the fact that the Internet has no geographical boundaries, so companies must navigate a morass of laws and cultural mores. Web companies receive dozens of requests a month to remove content. Google alone received more than 1,965 requests from government agencies last year to remove at least 20,311 pieces of content, it said.
These included a request from a Canadian government office to remove a video of a Canadian citizen urinating on his passport and flushing it down the toilet, and a request from a Pakistan government office to remove six videos satirizing Pakistani officials. In both cases, Google refused to remove the videos.
But it did block access in Turkey to videos that exposed private details about public officials because, in response to Turkish government and court requests, it determined that they violated local laws.
Similarly, in India it blocked local access to some videos of protests and those that used offensive language against religious leaders because it determined that they violated local laws prohibiting speech that could incite enmity between communities.
Requests for content removal from United States governments and courts doubled over the course of last year to 279 requests to remove 6,949 items, according to Google. Members of Congress have publicly requested that YouTube take down jihadist videos they say incite terrorism, and in some cases YouTube has agreed.
Google has consistently relied on its guidelines, removing only content that breaks laws or violates its terms of service, and only at the request of users, governments or courts — which is why blocking the anti-Islam video on its own initiative was exceptional.
Some wonder what precedent this might set, especially for government authorities keen to stanch expression they think will inflame their populace.
“It depends on whether this is the beginning of a trend or an extremely exceptional response to an extremely exceptional situation,” said Rebecca MacKinnon, co-founder of Global Voices, a network of bloggers worldwide, and author of “Consent of the Networked,” a book that addresses free speech in the digital age.
Somini Sengupta contributed reporting.