Google In Trouble Again Over Racist Search Suggestions In UK


Is Google Suggest (autocomplete) simply a mirror held up to society, or is it a product that needs to be censored and regulated to protect us from potentially offensive and objectionable content — even defamation? In the latest row over Google Suggest, the company has removed racist and other offensive autocomplete entries in the UK.

According to the Daily Mail Online, “Google has taken action after its search engine was found to be suggesting vile racist terms when users searched for a number of UK cities including Bradford, Leicester and Birmingham.” Users were typing in city names and then getting offensive suggestions:

The search engine was found to be making crude and offensive suggestions when users typed in ‘Why is’ followed by the name of a city. For example, if a user typed in the phrase ‘Why is Bradford…’ the site automatically suggested the search ‘Why is Bradford so full of P****.’

The article doesn’t disclose the specific terms deemed offensive. I tried to recreate one of these autocomplete suggestions but got there too late. Emulating the pattern above, I typed “Why is Leicester so …” and got “Why is Leicester so multicultural.”

Autocomplete racism

Google Suggest has been at the center of controversy many times in the past for different reasons. For example, the Google autocomplete function has been used to discover and highlight discriminatory attitudes toward women.

Less than a year ago, a court in Germany required Google to block defamatory language next to individual names after being alerted to its presence. There have been similar lawsuits in France (in January 2010 and September 2010). Google also lost a case in Italy and settled a case with an Irish hotel regarding autocomplete.

In another French case in 2012, anti-racism groups sued Google over anti-Semitic search suggestions. And there are several other examples.

Google uses a mixture of historical search query data and personalized information to deliver autocomplete suggestions. Generally, it’s a very helpful and useful tool. However, because of Google’s visibility and ubiquity, it also has the capacity to do damage — even if it’s just a mirror of our own attitudes and biases.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About the author

Greg Sterling
Contributor
Greg Sterling is a Contributing Editor to Search Engine Land, a member of the programming team for SMX events and the VP, Market Insights at Uberall.
