This year is off to a quick start in shaping the future of cross-border data flows on the Internet. Earlier this month, Sen. Orrin Hatch introduced the “CLOUD Act,” which drew bipartisan backing in Congress, won the approval of tech industry leaders, and gained the support of government officials on both sides of the Atlantic. On Monday, the second annual Global Internet and Jurisdiction Conference convened in Ottawa, where leaders from government, industry, and civil society discussed the growing challenge of conflicting national laws affecting the Internet. Also on Monday, it was reported that the European Commission was preparing a draft directive aimed at giving European law enforcement officials access to data stored overseas. And on Tuesday, the U.S. Supreme Court heard oral argument in United States v. Microsoft Corp., which centers on whether U.S. law enforcement may use a Stored Communications Act warrant to reach data stored abroad.
Each of these efforts grapples with the same underlying challenge: how to balance the free flow of information on the Internet, a principle that has allowed the global digital economy to flourish over the past three decades, against competing public policy demands (such as data privacy and intellectual property protection) that justify constraining that very flow.
For most of the millennia during which humans have maintained written records, those records have occupied a known physical space—generally near where they were created and used. Today, however, data is regularly stored on computers far from where it originated or is used. Data is often copied and then stored or cached on multiple servers in disparate locations in several countries simultaneously. Increasingly, data is being sliced into small portions and distributed across the Internet, shifting from one geographic location to another (often on an automated basis) as more efficient storage becomes available elsewhere on the globe.
The questions that follow have quickly become some of the most vexing and persistent issues of law and policy in the Internet age. Who “owns” what data? Which sovereign’s laws should apply to data stored overseas? When, and how, should governments access data stored abroad? Governments, tech companies, and legal experts across the world are struggling to reconcile the Internet’s globalization of data flows with the often-competing demands of privacy, national sovereignty, and global trade, along with longstanding notions of international comity and cooperation.
What is emerging is a set of highly complex and intertwined international disputes that threaten the continued growth of the Internet and the broader information economy, while simultaneously undermining international cooperation on other pressing global issues. These disputes have become more frequent and more contentious over time, and various governments’ responses to them have become increasingly assertive. Vint Cerf, one of the Internet’s original creators, has described the growing set of conflicts over jurisdiction on the Internet as “one of the biggest policy challenges of our time and more complex even than building the [I]nternet.”
In response to these challenges, governments are advancing new legal and policy regimes that address aspects of the problem; some of these regimes, such as data localization policies, are highly problematic. And many (if not most) of these regimes are fundamentally premised on the geographic location of data as the basis for asserting legal jurisdiction, which only exacerbates the underlying problem. It is becoming clear that the international community’s current piecemeal response needs to be radically rethought.
In our latest paper, published by New America as part of its Cybersecurity Initiative, we describe these various challenges in detail and introduce a framework for data flow controls that avoids any direct reference to the physical location of data—an approach that, in our view, allows for the harmonization of various national approaches. Additionally, we describe opportunities for advancing international cooperation on several of the more troublesome international data flow conflicts, consistent with our proposed framework.
Our paper is not an endorsement of any new omnibus international legal instrument, such as a global treaty on data flows or cyber crime. On the contrary, given the complexity of the problems and the diverse set of interests and stakeholders involved, we suspect that any solution will require multiple arrangements—some legal, some normative, and some procedural. Nevertheless, we maintain that absent a harmonized approach to data flow controls, the escalating fragmentation of the Internet will further split it into separate, semi-sovereign networks. This fragmentation, we argue, would fundamentally degrade the Internet’s future potential for innovation, economic growth, and other social, scientific, and democratic advancements we have come to expect from today’s global network.
We hope our analysis encourages adoption of data flow controls that recognize the reality of today’s globalized data environment, protecting the future of the digital economy and the freedom to send and receive information across frontiers.
The views expressed here are the authors’ own and should not be interpreted as official U.S. government policy.