Today is Data Privacy Day, an annual event in which I am — rather proudly — personally invested. Data Privacy Day began with a conversation at my dinner table eight years ago, when Leonardo Cervera Navas (then with the European Commission and now with the European Data Protection Supervisor’s office) and Jolynn Dellinger (then with Intel and now with Minding Privacy) joined my family for dinner. At the start of dinner my boys, who were very young at the time, brought out a sheet of pictures of flags and asked the adults at the table to point out the flags of the countries where they had lived. That discussion of the flags prompted a stark comparison between my US suburban neighborhood and the places we had lived in the UK and Germany, and areas in Belgium, Ireland, Spain and Italy that our guests had called home. While the focus at the beginning was on cultural differences, the conversation soon turned to the values shared across those cultures.
Leonardo was the first to suggest that there should be a day when people could recognize those shared values and promote transatlantic cooperation. Europe had already established Data Protection Day, held on January 28th, the anniversary of the signing of the Council of Europe’s Convention 108, which first recognized privacy as a fundamental human right. I vividly recall Jolynn then saying, “We shouldn’t just talk about it, we should do it.” By the end of dessert, Data Privacy Day was born.
Since that time, Data Privacy Day has evolved into a time to identify and discuss mechanisms to promote the ethical and innovative use of data. Individuals around the globe are asking for better ways to protect privacy while making it possible to derive value from data that pertains to them. Data Privacy Day has become an opportunity to focus on delivering on that goal.
On January 14th the Pew Research Center released a report on its most recent privacy survey research, “Privacy and Information Sharing.” Pew’s lead observation was that individuals will provide personal data if they perceive the value to be worth it and the risks to be manageable. For example, a majority of survey respondents were comfortable with allowing doctors to use electronic health records to provide medical services. For technology that would provide insight into activities in the home, however, such as smart thermostats, individuals were far more wary (only 27 percent favorable). Some respondents seemed resigned to the fact that their data would be collected. One noted, “The data isn't really the problem. It’s who gets to see and use that data that creates problems. It’s too late to put that genie back in the bottle.” These findings demonstrate that while individuals see value in the innovative use of personal data, they want assurances that it will be used appropriately.
As the recent electric grid attack in Ukraine demonstrates, cybersecurity attacks continue to pose significant risks not just to devices but to cities, societies and individuals. As the many data breaches experienced by governments and companies have shown, defending against these attacks is critical for privacy, and doing so requires innovative security protections.
My own company, Intel, occupies a unique position in these discussions. Intel’s business model is not predicated on using personal data in ways that would surprise individuals or be at odds with their interests. We primarily focus on delivering hardware and software solutions used across the spectrum of computing, from the Internet of Things to smartphones to the data center. We are also a security company that has invested billions in security technologies, a fundamental component of which involves the need to scan, predict and prevent malicious traffic on the internet. To put the matter simply, protecting against cyberattacks necessarily requires the processing of personal data.
Security is often described as at odds with privacy, but the reality is that security and privacy can and should reinforce each other. We find ourselves at a time when protecting privacy often requires the processing of personal data. If you want to be protected against cyberattacks, you need a trusted entity to scan internet traffic and prevent malicious code from running on your machine. If you want to be confident that the links you click on aren’t malicious, you need software that knows both where you are on the web and what’s waiting for you there. The same is true for preventing terrorism and providing effective law enforcement.
The point is that it takes data to protect data. It takes data to provide security. It takes data to protect people. Artificial binary framings that cast privacy and security as a zero-sum tradeoff are inaccurate and unhelpful. In recognition of Data Privacy Day’s mission of promoting transatlantic cooperation and raising privacy awareness, we need to focus on specific solutions that provide the right oversight and controls, giving citizens in all countries confidence that data relating to them will not be used inappropriately but will be used reasonably to protect them.
I have an idea of where we should start.
Intel has been working for some time to define how to promote privacy while also empowering organizations to pursue the innovative use of data. We call this effort “Rethink Privacy” and ground our recommendations in the Fair Information Practice Principles as articulated in the OECD Privacy Guidelines. Intel’s Paula Bruening has written an excellent blog post describing how the OECD FIPPs are “the common language of privacy.” The OECD FIPPs are foundational and do not need to be changed. They do, however, need to be implemented in new ways to adjust to an environment of the Internet of Things, cloud computing and advanced data analytics.
As EU and US negotiators focus on a new agreement to provide a lawful basis for the transfer of personal data, I offer the following recommendations based on our Rethink Privacy initiative.
Collection Limitation — Government agencies should not hold the entire “haystack” to find needles. The private sector keeps data for its business purposes and should not be obligated to keep it beyond that point. Instead of demanding all of that data, government could provide the private sector with algorithms to isolate the specific information to which it would like access. The supply of those algorithms, and any requirement to hand over the data they identify, should be subject to court oversight with reasonable due process. Both the USA Freedom Act and the recently passed French intelligence law approach collection limitation in this way.
Use Limitation — If data is provided to governments to protect against terrorism or cyberattacks, it should not be used by agencies for other purposes. Use of personal data to prevent terrorism or cyberattacks has been the subject of considerable discussion around cybersecurity information sharing legislation. To drive consensus on this issue, it is important to limit law enforcement and surveillance agency use of personal data to only the most important concerns. There will be a natural tendency for governments to use the data for other legitimate purposes, but scope creep will erode support, particularly in other countries.
Accountability — All companies and government organizations should have an adequately resourced privacy officer. Whether privacy is protected for individuals depends on the government and private sector putting in place the resources, policies and processes to ensure personal data is managed responsibly. The Information Accountability Foundation has done considerable work to articulate the elements of accountability with respect to privacy. One simple test is to ask an organization who its Privacy Officer is and to whom that person reports. If an agency or company cannot identify one person who is in charge of privacy, or if that person is buried deep in the organization, it is a sure sign privacy is not a priority.
Security Safeguards — Companies and governments need to invest more in cybersecurity. Chris Young, Senior Vice President and General Manager of Intel Security, has repeatedly argued that the world has not invested sufficiently in cybersecurity. This lack of investment has created a “cyber debt” that must be paid down. The unfortunate steady stream of high profile data breaches shows us how much work we have yet to do to properly protect data. Cybersecurity tools and services require constant innovation to stay ahead of evolving threats. In other words, that investment is an ongoing project — a process — not a simple capital outlay.
Privacy and security are not mutually exclusive; they are essentially linked. We can have both. Data Privacy Day is an excellent opportunity both to think about and to “rethink” privacy. We have pressing needs in law enforcement, education, healthcare and urban planning to use data to provide value for society. We also have pressing needs in the provision of individual security and privacy. We can accomplish those goals while still protecting privacy. Indeed, on this Data Privacy Day, I would like to leave you with this thought: protecting data requires data. The real question is not just whether data is collected and processed, but whether it is protected from those who would abuse it, and whether those we trust to handle it are doing so appropriately, with oversight, and in our interests.
Editor’s note and disclosure: Intel is a generous financial supporter of Lawfare. This article, as with all articles, underwent Lawfare’s normal editorial process and review.