Summary: The EU General Data Protection Regulation
After four years of negotiation, the European Parliament approved the General Data Protection Regulation (GDPR) on April 14, 2016. Enforcement is scheduled to begin May 25, 2018. This post provides a high-level summary of what the GDPR requires, how it differs from past EU data regulations, and what it means for how data is handled outside the EU.
What the GDPR Does
The GDPR creates an EU-wide set of standards for the protection of digital personal data relating to the online or real-world behavior of EU internet users. Importantly, these standards apply to the personal data of EU internet users regardless of the location of the entity holding their data. In this sense, the standards have significant extraterritorial reach. This regulation replaces Directive 95/46/EC, commonly referred to as the Data Protection Directive, which established data protection goals that each EU country was expected to meet. Individual member states separately enacted national legislation implementing the directive’s goals, creating an unwieldy regulatory patchwork. The GDPR was intended to harmonize those standards but allows individual member states discretion on a number of provisions. Member states retain flexibility, for example, over the means by which entities can demonstrate GDPR compliance, over transfers of data outside the EU, and over freedom of expression in the media.
The GDPR defines personal data as “information relating to an identified or identifiable natural person.” This understanding of personal data includes IP addresses, device IDs and customer reference numbers. Importantly, these protections apply to all corporate entities that process the personal data of EU citizens, even if the processing of relevant data does not take place within the EU. The new regulation also imposes restrictions on transferring personal data outside of the EU. Personal data may be transferred outside the EU only if the European Commission determines that the receiving jurisdiction “ensures an adequate level of protection” consistent with the GDPR; the processing entity has provided “appropriate safeguards”; or the individual has provided specific consent for the transfer. Furthermore, the GDPR guarantees a number of privacy rights to EU internet users, including mandatory, prompt notification of data breaches likely to “result in a risk for the rights and freedoms of individuals,” access to one’s personal data, the ability to instruct an entity to erase one’s personal data (consistent with the “right to be forgotten”), and the ability to move one’s personal data from one processing entity to another. Together, these rights are at the heart of the regulation’s purpose—“to give citizens back control over their personal data.”
These objectives are advanced through several mechanisms. First, organizations that breach their obligations can be fined as much as 4 percent of their annual global turnover or 20 million euros, whichever is greater. The highest fines apply primarily to breaches of the GDPR’s consent requirements, which relates to the second mechanism: Under the GDPR, consent must always be unambiguous. For special categories of personal data (e.g., race or ethnicity, political opinions, genetic data, union membership), affirmative, explicit consent is required. Third, the GDPR requires that entities monitoring data subjects “on a large scale” or, again, processing special categories of personal data appoint a data protection officer. Such officers advise their organizations on GDPR compliance, serve as a point of contact for subjects inquiring into their data, and liaise with EU supervisory authorities. Fourth, the GDPR encourages the creation of data protection certification mechanisms, so that entities can clearly demonstrate compliance with the regulation. National supervisory authorities in individual EU member states, coordinated at the EU level, are empowered to enforce these provisions.
A Brief History of EU Data Privacy
These substantial protections build on a history of European concern for data privacy dating to 1980, when the Organization for Economic Cooperation and Development (OECD) issued its Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. The guidelines, which the United States also signed, defined personal data as “any information relating to an identified or identifiable individual.” These guidelines were more explicit than the GDPR about the effect of national privacy laws on cross-border data flows. The preface, for example, recommends that “Member countries endeavour to remove or avoid creating, in the name of privacy protection, unjustified obstacles to transborder flows of personal data.”
One year later, the Council of Europe negotiated the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, which codified many of the OECD’s recommendations. Even then, however, there were signs that Europe was moving toward stronger privacy protections. For example, the convention provided that member states not impede the flow of personal data among other member states. This change in emphasis was made clearer in 1995 with the Data Protection Directive, which expanded the rights of EU citizens to control their data. Article 25 of the directive, for instance, provided that member states guarantee that transferred personal data will enjoy an “adequate level of protection” outside the EU. In making this adequacy determination, member states had to consider, inter alia, “the rules of law, both general and sectoral, in force in the third country in question and the professional rules and security measures which are complied with in that country.” This adequacy framework is a far cry from the OECD’s 1980 trade-oriented support for the free movement of data. It also shows the extent to which the GDPR’s data privacy rules are historically unsurprising.
Importantly, the GDPR is only one part of the EU’s data privacy regime. Article 8 of the European Convention on Human Rights provides that “Everyone has the right to respect for his private and family life, his home and his correspondence.” The European Court of Human Rights has used Article 8 to establish a variety of digital privacy protections, such as those governing the use of GPS data and medical information. Directive 2016/680 instituted separate provisions regarding the processing of personal data related to criminal investigations and prosecutions. Other cases before the European Court of Justice, as well as a variety of other EU directives, guidelines and thematic papers, round out Europe’s data privacy regime.
What Effect Will This Have on the United States?
Already, the GDPR’s extraterritorial application has had a significant effect on data privacy practices outside the EU.
Google, for example, has said that it is “working hard to prepare” for the GDPR and that, as a data processor, it plans to “update our agreements to reflect the obligations of controllers and processors and offer data-processing agreement where required in time for May 2018.” Google also cited its membership in the EU-U.S. Privacy Shield as a sign of its adherence to GDPR rules on the cross-border transfer of personal data. This framework is a set of privacy standards and protocols, negotiated and implemented by the U.S. Department of Commerce and the European Commission, “to provide companies on both sides of the Atlantic with a mechanism to comply with data protection requirements.” The European Commission annually reviews the program, and U.S. implementation of it, to assess whether it continues to adequately protect EU users’ privacy.
The commission renewed the Privacy Shield’s mandate in its first review in October 2017 but also made several notable recommendations. Strikingly, the commission stated that it would welcome it if Congress “would consider favourably enshrining in the Foreign Intelligence Surveillance Act the protections for non-Americans offered by Presidential Policy Directive 28.” The commission also urged the Commerce Department to undertake regular compliance checks and to actively search for companies falsely claiming to participate in the privacy framework. These recommendations carry real weight: As of this first review, 2,400 U.S. companies had signed up for the program, including some of the largest U.S. tech firms (Google, Facebook and Microsoft). In 2015, the United States had to scramble when the European Court of Justice found that Safe Harbor, a less restrictive cross-border data pact between the EU and U.S., was inadequately protective of privacy. The economic fallout from a non-compliance determination gives the EU impressive leverage to influence data privacy practices in the United States.
Notwithstanding this leverage, it is important to acknowledge that the GDPR’s more privacy-oriented safeguards may extend only to the data of EU users. Google, for example, notes that only EU users are asked “for permission to use data to personalize ads.” Given the specificity with which entities can differentiate between users in different national jurisdictions, there is no guarantee that GDPR-compliant practices will be applied to users outside the EU. This is especially true given the degree to which some organizations, particularly those that depend on analysis of large data sets, have pushed back against the GDPR’s requirements.