Two major changes potentially coming to EU’s Right to Be Forgotten with global implications
Authorities in France considering automatic delisting for 'sensitive personal data' and forcing global index removal.
There are some potentially major changes coming to Europe’s Right to Be Forgotten (RTBF). Google’s Senior Privacy Counsel Peter Fleischer discussed two pending issues, soon to be decided, in a blog post earlier today:
- Whether RTBF should be absolute — requiring automatic delisting — when the underlying content in dispute involves “sensitive personal data”
- Whether RTBF should be enforced globally beyond the boundaries of Europe
Fleischer provides the procedural background on the first question:
The background to this CJEU case is that in 2016, four individuals were unhappy with our decision not to remove certain links to webpages about them. They appealed to the French data protection regulator, the CNIL, asking them to review our decisions, challenging the underlying principle that a public interest test should apply.
In its review, the CNIL agreed with our decisions. The individuals subsequently took their case to the French Supreme Administrative Court (the Conseil d’Etat). This court heard their arguments in February of this year, and referred the case to the Court of Justice of the European Union (case number C-136/17).
Examples of “sensitive personal data” could be political affiliations or a criminal record. To date, RTBF has involved a case-by-case assessment of the individual’s right to privacy vs. the public’s interest in having access to information via search engines. The underlying data sources (e.g., newspapers) are not required to purge the information.
Explanatory FAQs (.pdf) provided by the EU say the following about how search engines must assess each request:
[RTBF] criteria relate to the accuracy, adequacy, relevance — including time passed — and proportionality of the links, in relation to the purposes of the data processing (paragraph 93 of the Court’s ruling). The request may for example be turned down where the search engine operator concludes that for particular reasons, such as for example the public role played by John Smith, the interest of the general public to have access to the information in question justifies showing the links in Google search results. In such cases, John Smith still has the option to complain to national data protection supervisory authorities or to national courts. Public authorities will be the ultimate arbiters of the application of the Right to be Forgotten.
Following a series of appeals in the pending case, the CJEU must now decide whether certain categories of personal data require automatic removal from search indexes. Because the CJEU is Europe’s highest court (and the court that established RTBF in the first place), a decision favoring the litigants could fundamentally alter how RTBF is implemented, rather than simply vindicating the rights of these particular individuals.
Google’s Fleischer argues a ruling in favor of the litigants would elevate the rights of individuals over the broader public interest:
Requiring automatic delisting from search engines, without any public interest balancing test, risks creating a dangerous loophole. Such a loophole would enable anyone to demand removal of links that should remain up in the public interest, simply by claiming they contain some element of sensitive personal data.
It would be a serious mistake for the CJEU to rule that some categories of information require automatic removal, without any sort of test involving considerations of the public interest. That could allow unscrupulous or controversial figures to erase their histories of bad, even criminal behavior.
The second issue before the French Conseil d’Etat (a top legal-administrative body) is perhaps even more troubling. For at least three years, France has been arguing that RTBF must apply to Google’s global index and not just to Europe. Its courts and privacy authority contend that global removal is necessary to prevent people from simply accessing the same information via Google.com, even though the company has already made it harder to reach Google.com from within Europe.
Fleischer argues that allowing global removal would set a “grave precedent.” It could embolden authoritarian figures and governments, such as those in China, Russia, Pakistan, and now Turkey, to try to censor the internet globally. Chinese state authorities could adopt the French position and argue, under some pretext, that all references to Tiananmen Square or the Dalai Lama must be removed from Google’s index.
While that may sound unlikely, the leap is not that great. Numerous countries would love to be able to control the content of Google’s search index. Imagine de facto Philippine dictator Rodrigo Duterte, for example, arguing that content about his policy of state drug killings was “sensitive” and must be removed globally from Google’s index. Again, it’s not that far-fetched a scenario.
Taken together, the two changes would be a kind of one-two punch for censorship and could have serious global consequences for the free flow of information and the public’s right to know.