The “Right To Be Forgotten” – EU Court Gives People Ability To Delete Their Google Search Results



The “right to be forgotten”: a triumph of individual privacy rights or censorship? It’s going to depend on how it plays out in specific circumstances.

The Luxembourg-based European Union Court of Justice, Europe’s “top court,” ruled earlier today that Google can be compelled to remove information about individuals from search results as part of a new, EU-specific “right to be forgotten.” The ruling will have no impact on US search results or those in other markets.

However, the ruling is significant in that it creates another chasm between the way Google operates in North America and the way it operates in Europe. Google’s antitrust settlement in Europe is similar in that it creates an EU-specific SERP.

The case arose when a Spanish citizen wanted information about real estate debts (the auction of his house) removed from Google’s Spanish search results. The information had been published in a Spanish newspaper. The individual, Mr. Costeja González, asked the newspaper to remove it because the debt had been resolved and the matter was 16 years old. The paper refused.

Ultimately, the dispute resulted in a legal case in Spain against Google and wound up before the National High Court of Spain. The Spanish court referred a number of questions about the case to the EU Court of Justice.

The EU Court of Justice effectively ruled that individual privacy rights trump almost all other considerations when it comes to personal data (reputation). This amounts to the formal establishment of a “right to be forgotten” in Europe. It applies to search engines even where the underlying site or data source (a newspaper site in this case) is not required to remove the content and continues to publish it online. This part of the ruling makes almost no sense.

I’m quite sympathetic to the idea behind this ruling: that after some (unspecified) period of time, some types of information become “outdated” or “irrelevant.” In California, there’s a new law (called the “eraser law”) that allows individuals to have their “youthful indiscretions” (occurring before age 18) deleted or removed from the internet.

The idea is that some unfortunate act or incident that happens when someone is a minor (e.g., “sexting”) shouldn’t haunt people for the rest of their adult lives. That’s fair. However, the distinction here is that the EU ruling applies to adults. There’s also considerable ambiguity surrounding the ruling.

There appear to be numerous practical problems with implementing the right to be forgotten. How much time must pass before lawful information becomes “outdated”? Will the rules be applied evenhandedly across separate jurisdictions in Europe? And what about professional negligence, or low-level public figures trying to control public perceptions? What information does the public have a right to know?

Here’s what the EU court says about the public interest and the right to be forgotten:

However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, the Court holds that a fair balance should be sought in particular between that interest and the data subject’s fundamental rights, in particular the right to privacy and the right to protection of personal data. The Court observes in this regard that, whilst it is true that the data subject’s rights also override, as a general rule, that interest of internet users […]

There’s no guidance on what sorts of circumstances might weigh in favor of a continuing public “right to know.” Again, the ruling seems to apply only to search engines and not to the originator or original publisher of the content.

It appears that the procedure for removal of content would require the subject to make a request to the search engine first. If that’s refused, he or she has recourse to his or her country’s data protection authority, which would then have (apparently ultimate) discretion over the determination.

I’m not clear on whether there’s any appeal process for the search engine (I assume not). If not, it would essentially give data protection authorities significant power and control over what appears about individuals in search indexes. (Think about the potential for bribery or corruption in cases involving politicians, business executives and so on.)

While the privacy idea behind the “right to be forgotten” is valid, this decision is highly problematic for several reasons, not least of which is that it singles out search engines. It remains to be seen whether there will now be a flood of requests by individuals to remove content about them from Google and Bing search results.

I wouldn’t go so far as to say this sanctions “censorship,” but I would argue that in some cases the practical effect of the rule may amount to the same thing.

Below is a summary of the EU court’s decision and the rationale behind it.




About the author

Greg Sterling
Contributor
Greg Sterling is a Contributing Editor to Search Engine Land, a member of the programming team for SMX events and the VP, Market Insights at Uberall.
