Ending The Endless Crypto Debate: Three Things We Should Be Arguing About Instead of Encryption Backdoors
Recently I participated in a fascinating conference at Georgia Tech entitled “Surveillance, Privacy, and Data Across Borders: Trans-Atlantic Perspectives.” A range of experts grappled with the international aspects of an increasingly pressing question: how can we ensure that law enforcement is able to obtain enough information to do its job in the twenty-first century, while also ensuring that digital security and human rights are protected? How can or should law and policy adapt to a world of digital evidence, much of which is easily obtainable—but much of which is not?
The primary focus of that conference was on how best to regulate the sharing of needed user data between internet companies in one country and law enforcement in another country. However, in this post—part of an online symposium at Lawfare following up on that conference—I’ll be mostly focusing on another, particularly controversial part of the broader conversation regarding modern policing: the debate over encryption, and how law enforcement should respond to it.
First, to very briefly summarize a long-running debate: until he was dismissed in May, FBI Director Comey had been arguing since 2014 that the growing prevalence of encryption—in particular, default encryption on some smartphones, and end-to-end encrypted messages that neither messaging service providers nor the government can decode—is depriving government investigators of needed evidence. This is what the FBI calls the “Going Dark” problem. For the past several years, and most recently in two speeches in March and in testimony to Congress in early May, Comey called for a solution to that problem. Privacy and security advocates fear that the FBI’s preferred solution may end up being a wrong-headed legislative mandate requiring providers to ensure some sort of exceptional technical access to encrypted data for government—what opponents (like me) would call a “backdoor”—or otherwise ensure that they do not deploy any encryption that they themselves cannot decrypt. I won’t bother repeating here the many arguments why such a mandate would be bad for America’s cybersecurity and economic security, as well as the civil and human rights of people around the world, nor why it would be mostly useless at preventing bad guys from using encryption if they want to; see here for arguments that I and my organization Open Technology Institute have previously made.
Recently in Slate I responded to Comey’s repeated calls for an “adult conversation” on this seemingly endless debate. I argued that an “adult conversation” means moving past any discussion of discouraging or undermining the deployment of unbreakable encryption, in light of the broad consensus outside of the FBI that such a move would be dangerously bad policy. Rather than continuing to argue about whether or how we might force encryption technology to adapt to law enforcement’s needs, our time would be better spent focusing on how we can help law enforcement adapt to the technology. This was essentially the same conclusion reached by the House of Representatives working group that examined the encryption issue last year. In its report at the end of 2016, members of that working group—including the chair and ranking members of both the House Judiciary and Commerce committees—concluded that undermining encryption was bad policy, and that we should instead focus on other pathways by which the government can get the information it needs for its investigations.
I wholeheartedly agree. There is no single answer to law enforcement’s Going Dark concerns, but here I outline three potential courses of action that should form the focus of the debate. I expect that a broad selection of privacy advocates, internet companies, security experts, and a range of other pro-encryption stakeholders inside and outside of government are willing to discuss and debate in good faith these three ideas about how law enforcement can adapt to the spread of encryption, if we can stop wasting time arguing about the wisdom of having unbreakable encryption at all. Of course, there are aspects of these three conversations that will be difficult or unpleasant for all of these stakeholders, as well as the government. But I take that as a sign that I’m on the right track, as per the old saying: a good compromise leaves all parties equally dissatisfied.
1. Establish Rules of the Road for Government Hacking and Vulnerability Disclosure
When the government can’t unlock an encrypted phone or intercept an encrypted text message, hacking into the phone is one way to get around the encryption problem (as the San Bernardino iPhone case demonstrated). However, there are still no clear rules for the government to follow when it wants to use a zero-day vulnerability to hack into a phone it has seized, or when it wants to secretly hack into a phone remotely to read messages in the phone’s storage that it couldn’t read in transit due to end-to-end encryption. Congress can and should change that.
As the House report indicated, Congress should consider creating a legal framework to answer those questions. In particular, as suggested by privacy-minded groups like the Electronic Frontier Foundation, Congress could craft a statute that governs lawful hacking much as the federal wiretap statute, or “Title III,” regulates electronic eavesdropping. Congress could also consider whether and how to codify the current White House-led “Vulnerabilities Equities Process” that dictates when the government can hold on to previously unknown software vulnerabilities for its own use versus disclosing those vulnerabilities so they can be fixed. The Open Technology Institute has held events and written papers that discuss both of these options, but there’s only so far we can take those ideas when we know so little about the government’s actual activities. Congress should hold hearings to collect basic knowledge about what the government’s current practices and capabilities look like before making any decisions.
As the House Report pointed out and FBI Director Comey repeated in his May testimony, law enforcement doesn’t consider lawful hacking alone to be an adequate solution to the encryption challenge, since it can be a time- and resource-intensive tactic that won’t always scale to meet investigative needs, especially for state and local law enforcement. This is certainly true to some extent—lawful hacking will always be more difficult and resource-intensive for law enforcement than bluntly demanding backdoors for all messaging systems. That’s actually one of the reasons why some privacy advocates have argued expressly in favor of lawful hacking, as a targeted alternative to such broad capability mandates.
But the objection that hacking will never give law enforcement as much access as would a backdoor into or a ban on user-controlled encryption misunderstands the problem we’re trying to solve. The societal goal here is not to ensure that law enforcement can access every piece of data it might ever seek, but that it can get enough information to do its job, and hacking is certainly a part of the solution. How big a part it might play is not a question that we should just take the FBI’s word on. Rather, focused dialogue in Congress—through public hearings—is the best way to obtain the information necessary to evaluate how well hacking can address the Going Dark concern.
Law enforcement is not likely to welcome with open arms any brand-new legislative constraints on its ability to secretly hack into suspects’ computers or stockpile vulnerabilities for such hacking, when it has been conducting these activities with little guidance or restriction from Congress for nearly twenty years. However, clear statutory rules guided by the same Fourth Amendment principles that informed the wiretap statute would provide the police officer and the prosecutor with something they lack right now, which is some level of legal certainty—and clear endorsement of the practice, when paired with appropriate checks and balances—directly from Congress.
The idea of such an endorsement makes some privacy advocates nervous, both because this kind of hacking relies at least sometimes on zero-day (previously unknown) software vulnerabilities that they would prefer to see disclosed and patched, and because it raises a wide variety of novel privacy and security risks. But again, the government has already been conducting remote computer intrusions at least since the turn of the century and Congress has so far shown no inclination to prohibit it; indeed, Congress recently helped clear the path for more hacking by allowing a procedural rule change that lets police remotely hack computers without knowing their location. That said, if anyone seriously expects that Congress might consider banning government hacking outright, they should also share the initial goal of getting Congress to hold hearings so as to better understand how the government is using hacking now.
In the meantime, as unbreakable encryption proliferates, government hacking will necessarily increase, whether advocates like it or not. Therefore there’s little to lose and much to gain if Congress affirmatively regulates when and how the government gets to hack, along with when and how it gets to keep software vulnerabilities secret instead of disclosing them. Such a statute would help constrain and minimize the privacy and security harms that stem from the government’s hacking activities without foreclosing constitutional arguments that litigators might raise against the practice. Indeed, it may be even easier to raise such challenges once the government’s practices have been exposed to greater scrutiny and codified.
2. Up the Tech Game of Government Investigators at All Levels
More government hacking—whether it’s secret remote hacking of suspects’ computers or forensic hacking of devices in law enforcement’s possession—will require that more law enforcement resources be directed toward developing and maintaining the expertise and tools necessary to conduct such searches, or toward purchasing that expertise and those tools from outside contractors. The latter option is, to my mind, the less attractive one. Like many other advocates, I worry about how the growing market for software vulnerabilities helps criminal networks and oppressive governments get their hands on sophisticated hacking and surveillance tools (see the vulns market primer that I co-wrote here). Having our government focus on developing its own hacking tools rather than buying them may be one of the best ways to shrink that market, since right now the US is its biggest buyer. Some other advocates might argue that the government having its own hacking research teams is dangerous, but if the government’s going to hack—and trust me, the government’s going to hack—I’d rather it be done by sworn law enforcement officers than by shady contractors who are also selling spy tools that could ultimately lead to dissidents in other countries being tortured and killed.
Another key aspect of upping government investigators’ tech game is making sure that they all know exactly what data they can lawfully obtain from internet companies today, without any new technical mandates. That means we need companies to step up and educate law enforcement and everyone else about the data they have—all of it, and not just the data they typically offer as a matter of course in response to legal process. Some companies and privacy advocates would object to this idea, saying that it’s dangerous to give law enforcement such a large menu from which to choose. I’m sympathetic to that argument, but three counterarguments have pushed me the other way. First, in the context of the encryption debate, it is disingenuous to argue that the government doesn’t need backdoors because we are living in a “golden age of surveillance” that allows lawful access to unprecedentedly vast amounts of data, and then to hide the ball on what data is actually available. Second, any information security expert will tell you that “security by obscurity”—hoping that a piece of information will remain secure because it is not widely known—is a losing strategy anyway. Law enforcement will ultimately figure out what’s really available one way or another, especially considering the revolving door of personnel moving between law enforcement and internet companies’ compliance departments; it’s better to be upfront about it now and in the process help head off a bad result like backdoors. Third and finally, comprehensive transparency to law enforcement and the public about what data the companies are creating and storing is the only way that policymakers or the market will ever be able to enforce any real accountability over those practices, and I think (or at least hope) that the overall benefit to consumers’ privacy from that accountability will balance out the harmful effects of giving the government a full menu of data.
Of course, educating law enforcement about the range of available data and enhancing its capabilities around hacking will cost money—money that isn’t being spent now. For example, based on figures that came out during a National Academies of Sciences workshop on the encryption issue, just 39 FBI staff (only 11 of whom are agents) are working the “Going Dark” issue, with a budget of $31 million (peanuts in the law enforcement and intelligence realm). Congress should fix that—not only by upping that budget and staff, but by exploring how it can support and expand other offices of government that might serve as clearinghouses for federal, state, and local police looking for up-to-date and centralized information about how to conduct criminal investigations in the twenty-first century. One attractive candidate highlighted by the House working group report is the National Domestic Communications Assistance Center (NDCAC), a relatively new office of the Department of Justice that serves as a hub for the sharing of technical knowledge and expertise between federal, state, and local law enforcement and the communications industry. Another possible avenue for greater education is the Cyber Center at the Bureau of Justice Assistance, created as a partnership between DOJ and the International Association of Chiefs of Police and similarly tasked with sharing information and expertise between all levels of law enforcement on issues like computer forensics and cybercrime.
Regardless of which part of the government ultimately does the work, it’s in the interest of both law enforcement and privacy advocates that we carefully survey and build on existing sources of technical expertise and knowledge-sharing inside of government, as an alternative to knee-jerk anti-encryption policies. Such an investment will pay real dividends in terms of law enforcement effectiveness, while avoiding the potentially billions of dollars of harm that a legislative anti-encryption mandate could cause to the U.S. economy by eroding the tech industry’s international competitiveness and consumer trust.
3. Solve the Problem of Access to Data in Cross-Border Investigations
Finally, we arrive at the difficult challenge at the heart of the Georgia Tech conference: how do we ensure that foreign investigators can get data about users of U.S. internet companies—and that U.S. investigators can get data about users of foreign internet companies—while ensuring due process of law and the protection of civil and human rights? This is one of the hardest problems in tech policy right now, one that has been written about extensively here at Lawfare and elsewhere. Suffice it to say that just as it is disingenuous to wag a finger and say “but it’s the golden age of surveillance!” to a frustrated American police officer who doesn’t know about all the data that is available beyond what’s on that locked iPhone at the scene of a crime, it’s also unfair to say it to a (let’s say) British officer who’s working a local homicide with a local victim and a local suspect who just happens to store his email with a U.S. provider.
There is of course a Mutual Legal Assistance Treaty (MLAT) in place between the U.S. and the U.K., so that the British cop could get that data…if s/he’s willing to wait 6-12 months and can meet U.S. legal standards. Predictably, though, with the number of foreign cases requiring data from the U.S. increasing at internet speed, the U.K. and other countries with whom we have MLATs are increasingly dissatisfied with that status quo—both with the delay and with the need to jump through U.S. legal hoops even for purely domestic cases—and are demanding new, easier pathways to obtain the data they need. As the Open Technology Institute has described elsewhere, we are not fans of the current U.S./U.K. proposal that seeks to address this problem by changing the law to allow U.S. companies to respond directly to legal process from the U.K. (and, eventually, from other countries as well, under rather vague conditions intended to ensure that they are “good” countries that respect human rights).
I don’t like the current proposal on the table—but I recognize the need to come up with a solution to this problem, and soon, whether by reforming the current MLAT system or building a new process. Failure to do so could lead other governments to unilaterally take other steps to get the information they need—whether by demanding that companies store their data in-country, making extraterritorial demands without regard for MLAT and threatening in-country company employees with jail or other sanctions if their demands aren’t met, or (most relevant to this discussion) by passing anti-encryption mandates so they can intercept the data while it is transmitted within the country rather than having to reach outside the country to where it is stored. Any of those alternatives would be much worse for privacy, security, and human rights than a well-negotiated solution to the cross-border problem.
However, any new deal to ease cross-border data requests must explicitly preclude that parade of horribles, since that’s the whole point. More specifically, any U.S./U.K. deal must specifically prohibit the U.K. from mandating data localization by U.S. companies, enforcing extraterritorial demands against U.S. companies outside the scope of the agreement, or demanding new decryption capabilities that aren’t already required by U.S. law. Failure on any of these points makes for an affirmatively bad deal that will likely end with the worst of both worlds: a weakened MLAT process and the parade of horribles. Especially considering that the President prides himself on being a strong dealmaker who puts America first, we should be demanding real concessions from the U.K. to protect U.S. companies—and the continued openness of the internet!—before agreeing to any deal.
Unfortunately, circumstances seem to be trending in the wrong direction. The Trump Administration is reportedly pushing hard for a deal with the U.K. on this issue, and Congress is holding hearings on how to address it (including another one tomorrow). Yet at the very same time, the U.K. is publicly excoriating U.S. companies and rattling its sabers about the possibility of new anti-encryption mandates against those companies. The cost of moving forward on a cross-border deal must include the U.K. forswearing such politically opportunistic scapegoating and working in good faith with the U.S. and its companies to find a non-backdoors solution.
As a society, we have our work cut out for us on all three of the above issues—each is a complex and difficult task, but likely all must be pursued if we are going to arrive at a mode of twenty-first century law enforcement that doesn’t resort to short-sighted anti-encryption proposals that threaten our cybersecurity and economic security. Put another way, we need a sustained conversation about how to develop the full range of possible “encryption workarounds” that can help law enforcement do its job despite encryption and without technical mandates that dangerously seek to undermine encryption. Yet the unceasing argument over the possibility of such mandates is sucking the air out of the room, robbing us of the time and space to actually have these hard but constructive conversations.
It’s time to end the endless crypto debate and move on.