As many Lawfare readers know, several years ago Steve Bellovin, Matt Blaze, Sandy Clark, and I presented the idea of using vulnerabilities already present in devices as a way to facilitate court-authorized wiretaps. As we explained in a technical article, this would involve a two-step process: first using a wiretap order to remotely examine the device and determine what software was on it, and then using a second court order to actually install the wiretap by exploiting a vulnerability present on the device.
By showing how to wiretap even in the presence of end-to-end encryption, our Lawful Hacking paper shows how to secure society's communications while still enabling lawful interception. Because lawful hacking entails two court orders and a custom-built wiretap, this method is not cheap from either a legal or a technical standpoint. That is the right balance; our laws and national policy treat wiretaps as legal tools of essentially last resort.
Recently Herb Lin suggested that software updates could be used to deliver the wiretap. It's highly likely that security agencies have already conducted such man-in-the-middle attacks, in which they pose as the software vendor and deliver a wiretap via a software "update." But that's different from Herb's proposal, which is that instead of the government delivering the wiretap, Microsoft — or Google or Facebook or You-Name-the-Company — would do so.
On the surface, Herb's proposal looks good: it simplifies interception and saves the government money in installing wiretaps. But it comes at too high a societal cost to actually be viable. There are two costs, one to business, one to security. Putting Internet companies in charge of delivering wiretaps destroys the trust relationship between the customer and the vendor (something US companies already have enough trouble with due to the Snowden revelations). And such an approach has security costs, for it will hinder patching. Because patches often break running software, people already delay installing patches. The idea that patches might also be used to install wiretaps will create further hesitation and delays in patch installation. That will affect not only those who don't patch, but can even affect those who do patch (e.g., by slowing down network access). In a society that should be emphasizing good cybersecurity practices, creating impedance to patching is a really bad idea.
This brings me to Ben's recent post. He analyzes the legal situation regarding requiring companies to subvert their patching mechanisms in order to install a court-ordered wiretap, and concludes it might happen. Now, by training I'm a mathematician. One thing we learn early is that if you start with a false hypothesis, even if your reasoning is fine, your conclusion can be false. Ben's analysis of the legality of requiring companies to install wiretaps as part of a patching process is fine, but it starts with the false hypothesis that Herb's approach of using software updates for lawful hacking is plausible.
That takes me to Ben's final question, "All of this is intended merely to raise a simple question: Is a regime in which companies may have to do these things better or worse from a civil liberties perspective than a regime under which they have to help with decryption?"
Ben misses the point here. "Helping with decryption" is possible only if the company supplies what my colleague Matt Blaze has labeled breakable encryption — encryption with backdoors, front doors, or exceptional access. If breakable encryption is the only permitted encryption solution, it will not only be the US government that reads the communications of US companies and others, but also the Chinese, the Russians, the Iranians, the French, and many others. And they'll do so with or without court orders.
So Ben is mistaken in two places. Herb's suggestion doesn't pass the plausibility test, and therefore is not the one with which to test our proposal. And our national-security concerns include protecting the communications and data of essentially everyone, which is the reason that unbreakable encryption — encryption without backdoors, front doors, or exceptional access — is our only reasonable security solution.