In May 2014, the Court of Justice of the European Union (CJEU) ruled that search engine operators in the EU are responsible for handling individuals’ requests to remove links to personal information that appear in search results. The Google Spain case—somewhat misnamed as the “Right to Be Forgotten” case—arose when a Spanish citizen filed a complaint alleging that searches for his name on Google returned links to a 12-year-old newspaper article announcing a foreclosure auction on his home—a fact he believed was irrelevant now that his finances were in good order. The CJEU agreed that European law requires Google and other search engines to evaluate requests to delist truthful information if it is no longer relevant.
Whatever one thinks of the merits of the CJEU decision, its societal and practical implications require deeper analysis. Beyond the implications for privacy and freedom of expression, the holding places a considerable burden on search engine companies, most of which are located in the United States. Intel Corporation has a strong interest in promoting a public policy environment that both encourages innovative internet companies to thrive and promotes trust so individuals will have confidence in their use of digital devices. Establishing a neutral, international body to take the burden of the Google Spain decision off search engines is a notion that furthers those goals and warrants consideration.
Google Spain was met with controversy both in the EU and internationally. I have recently co-written a law review article that concludes the CJEU opinion is well-founded under European law. As part of the analysis in the article, my coauthors and I address the misconception that the ruling requires a “forgetting” of information. Google Spain is often referred to as establishing a “right to be forgotten” on the internet; yet the decision does not call for the removal of data. Rather, it requires that companies obscure information from searches that are based solely on an individual’s name if those search results are irrelevant or excessive. The actual data remain; they are simply obscured.
However, even this limited ability to delist information requires search engines to invest significant time and resources to comply with the decision. It also requires them to play the role of front-line adjudicators for the right to delisting. These companies lack clear criteria on which to base decisions on the complicated questions that arise in this regard. An individual who seeks delisting of certain links from multiple search engines must petition each one separately. Each company—Google, Yahoo, Bing, and others—must review the facts of the request and determine whether they meet the vague standard established by the court. If the company approves the request (and companies may want to err on the side of over-compliance to avoid regulatory enforcement), it must then take down the links. This process puts an unfair burden on internet companies, while at the same time delivering unpredictable results for individuals: the information may be delisted by one search engine and not another.
To reduce those burdens, I have proposed the creation of a global Internet Obscurity Center that would shift the responsibility and decision-making authority away from private companies. In addition to lifting an unreasonable burden from companies, it would also relieve individuals of the task of petitioning each search engine or other information intermediary (the decision could easily apply to social media websites and data brokers). The Center would serve as a single centralized body that would make delisting recommendations based on an established set of criteria. Those recommendations could then be issued to all service providers. Regulators from individual countries could provide assurance that they will start enforcement actions only against companies that do not comply with the Obscurity Center’s recommendations.
Many internet websites and services operate globally. Search engines, for example, both access information from around the world and, for the most part, can be queried by citizens of any country (limited to some degree by a few countries blocking traffic to certain websites). While search engines can restrict access by attempting to identify individuals’ locations from their IP addresses or operating system settings, there are easy methods to frustrate those controls. While it is understandable that regulators may desire country-specific obscurity, the best solution for companies and individuals is to obscure information globally.
The Obscurity Center would reduce the resources companies need to invest to comply with Google Spain. It would also eliminate the duplication of effort when several search engines are petitioned to obscure the same information. Instead of each company having to hire staff dedicated specifically to making decisions about removing internet links, and having to establish its own criteria for doing so, businesses could invest in a center that would evaluate requests against a single set of criteria and make decisions on behalf of all of the companies. This approach would also reduce an individual company’s exposure to liability for arguably incorrect decisions.
A global Internet Obscurity Center would also introduce needed predictability and consistency into the process mandated by Google Spain, relieving companies of the need to make decisions based on the very limited guidance provided by the court. The Center would maintain independence by employing its own leadership and staff—trained in EU privacy law—who would make the determinations. The best structure would give the Center organizational independence from both companies and regulators, with oversight by a respected board of academics and experts. Regulators could provide additional oversight of the Obscurity Center to guard against undue influence from the companies.
Regulators around the world are attempting to find solutions to deal with concerns that individuals need some obscurity on the internet. A patchwork of regulations and guidance will be confusing for individuals and needlessly burdensome on companies. US companies that process personal data of individuals from outside the US will struggle to navigate this regulatory swamp without a centralized entity that regulators are willing to support.
The Google Spain case undoubtedly will continue to be the subject of debate and analysis, both in the US and in Europe. But to assist businesses that must carry out the requirements of the decision, and to promote predictability and consistency in determinations about when information should be obscured, I believe that an Obscurity Center can serve as a starting point for discussion on both sides of the Atlantic.
* * *
Editor’s note and disclosure: Intel is a generous financial supporter of Lawfare. This article, as with all articles, underwent Lawfare’s normal editorial process and review.