Free expression and controversial content on the web
We also know that letting people express their views freely has real practical benefits. Allowing individuals to voice unpopular, inconvenient or controversial opinions is important. Not only might they be right (think Galileo), but debating difficult issues in the open often helps people come to better decisions.
While most people agree in principle with the right to free expression, the challenge comes in putting theory into practice. And that's certainly the case on the web, where blogs, social networks and video sharing sites allow people to express themselves -- to speak and be heard -- as never before.
At Google we have a bias in favor of people's right to free expression in everything we do. We are driven by a belief that more information generally means more choice, more freedom and ultimately more power for the individual. But we also recognize that freedom of expression can't be -- and shouldn't be -- without some limits. The difficulty is in deciding where those boundaries are drawn. For a company like Google with services in more than 100 countries -- all with different national laws and cultural norms -- it's a challenge we face many times every day.
In a few cases it's straightforward. For example, we have a global all-product ban against child pornography, which is illegal in virtually every country. But when it comes to political extremism it's not as simple. Different countries have come to different conclusions about how to deal with this issue. In Germany there's a ban on the promotion of Nazism -- so we remove Nazi content from products on Google.de (our domain for German users). Other countries' histories make commentary or criticism on certain topics especially sensitive. And still other countries believe that the best way to discredit extremists is to allow their arguments to be publicly exposed.
All this raises important questions for Internet companies like Google. Our products are, after all, specifically designed to help people create and communicate, to find and share information and opinions across the world. So how do we approach these challenges?
It should come as no surprise that people have different views about what should appear on our sites. How and where to draw the boundaries is the subject of lively debate even within Google. We think that's healthy. And partly because of this, we realize that creating a flawless set of policies on which everyone can agree is an impossible task.
Google is not, and should not become, the arbiter of what does and does not appear on the web. That's for the courts and those elected to government to decide. Faced with day-to-day choices, however, we look at our products in three broad categories: search, advertising and services that host other people's content.
Search is the least restricted category. We remove results from our index only when required by law (for example, when they link to content that infringes copyright) and in a small number of other instances, such as spam or results that include unauthorized credit card and social security numbers. Where feasible, we tell our users when we remove results.
At the other, most restrictive, end of the spectrum, we have what might be called commerce products -- the text of the advertisements we carry, which is subject to clear ad content policies.
The most challenging areas are where we host other people’s content -- offerings like Blogger, Groups, orkut and video. On the one hand, we're not generating the content and we aim to offer a platform for free expression. On the other hand, we host the content on our servers and want to be socially responsible. So we have terms that we ask our users to follow. (See Blogger and orkut for examples.)
So the question becomes: how do we enforce those terms? In general, Google does not want to be a gatekeeper. We don't, and can't, check content before it goes live, any more than your phone company would screen your phone calls or your ISP would edit your emails. Technology can sometimes help here, but it's rarely a full answer. We also have millions of active users who are vocal when it comes to alerting us to content they find unacceptable or believe may breach our policies. When they do, we review it and remove it where appropriate. These are always subjective judgments, and some people will inevitably disagree -- because what's acceptable to one person may be offensive to another.
We also face the added complication that laws governing content apply differently in the different parts of the world in which we operate. As we all know, some governments are more liberal about freedom of expression than others. These legal differences create real technical challenges -- for example, how to restrict one type of content in one country but not another. And, in extreme cases, we face questions about whether a country's laws and lack of democratic processes are so antithetical to our principles that we simply can't comply, or can't operate there in a way that benefits users.
But it's not only legal considerations that drive our policies. One type of content, while legal everywhere, may be almost universally unacceptable in one region yet viewed as perfectly fine in another. We are passionate about our users, so we try to take into account local cultures and needs -- which vary dramatically around the world -- when developing and implementing our global product policies.
Dealing with controversial content is one of the biggest challenges we face as a company. We don’t pretend to have all the right answers or necessarily to get every judgment right. But we do try hard to think things through from first principles, to be as transparent as possible about how we make decisions, and to keep reviewing and debating our policies. After all, the right to disagree is a sign of a healthy society.
googleblog.blogspot.com
Published September 1, 2008