We expected that our piece this morning on liability standards with respect to end-to-end encryption would provoke strong reactions.
We confess we have been surprised by the ferocity of those reactions, their bizarrely ad hominem character, their frequent misstatements of fact, and their near total absence of legal argument in response to what is, at the end of the day, a legal analysis. We may be wrong in what we’ve written; it’s a first cut analysis of a very complicated issue. But in the weird spasm of anger directed our way over the past several hours, nobody has identified what is wrong with our argument.
Let us be clear about a few things up front. The piece is not an argument that Apple should be liable if bad things happen as a result of terrorists using its encryption services. It is also not an argument that companies should be forced to grant extraordinary access to law enforcement or intelligence agencies in response to warrants.
Neither of us takes either of these positions. As Ben has made clear before, he is something of an agnostic on the question of whether tech companies should maintain the capacity to provide decrypted signal to government. While he’s conceptually sympathetic to the idea, he has grave reservations about its technical plausibility without compromising other cybersecurity goods. Zoe, for her part, is deeply skeptical of the idea that a backdoor wouldn’t cause many more problems than it would solve.
Rather, the article was a (partial) response to a question that Senator Sheldon Whitehouse posed to Deputy Attorney General Sally Yates in a recent hearing: Do tech companies have potential exposure flowing from the use of their products by terrorists or criminals? Yates did not know the answer to that question, which seemed to us worth exploring. Exploring such a legal question should not be a controversial idea, in our view—even had it not been posed by a sitting senator to a senior Justice Department official. Whether you’re attracted to the idea of forcing tech companies to have back doors or protective of their desire to provide end-to-end encryption, the liability environment they face for different courses of action objectively matters a great deal. So we thought we would look into it and give as straightforward an account as we could of where the companies’ defenses are strong and where they are weaker.
You wouldn’t know that’s what we were doing by the response. Within 12 minutes of the article’s going live on the site, Glenn Greenwald tweeted:
National Security State mouthpieces now expressly threatening Apple w/terrorism prosecution for providing encryption https://t.co/kEVPopZeZW
— Glenn Greenwald (@ggreenwald) July 30, 2015
It’s amazing how many factual errors Greenwald can pack into 140 characters. The only words in this tweet that are even arguably true are the description of us as “national security state mouthpieces.” That, after all, is a matter of opinion. In Ben’s case, it may even be a fair opinion. In Zoe’s case, it’s not: Zoe is actually a named plaintiff in a current ACLU case against the national security establishment. On purely factual questions, the tweet does not contain a single true statement. Our article does not, in fact, deal with criminal prosecution at all, but with civil liability. We don’t threaten anyone, let alone do so expressly. And we specifically say that Apple could not plausibly be held liable for providing encryption. The liability could potentially arise only for continuing to provide encryption after the receipt of a warrant.
But never mind.
A few minutes later, Chris Soghoian of the ACLU declared:
— Christopher Soghoian (@csoghoian) July 30, 2015
Braindead jihad against encryption? Really? The storied history of the ACLU, founded by (among others) Felix Frankfurter, has come down to this kind of name-calling? Who knew?
Other respectable civil liberties groups joined in too. Center for Democracy and Technology fellow Jake Laperruque tweeted:
— Jake Laperruque (@JakeLaperruque) July 30, 2015
Leave aside the demeaning feature of many of these tweets: they write Zoe out of the article entirely. The technical term for this sort of statement is a “bald-faced lie.” Here’s what we (both Ben and Zoe) actually wrote on the question of whether Apple’s selling of a secure phone could be considered material support:
the sale of an encrypted phone by a major company to the general public cannot plausibly constitute material support for terrorism. Unlike a charity donating to Hamas, Apple in this situation has no intention of supporting or in any way contributing to violent activity. It does not know—and likely could not possibly know—that the person buying the phone is intending to use it for violence or coercion. And the product in question has a million legitimate uses. This is not, in Posner’s words, putting a loaded gun in the hands of a child. Rather, it is putting a safety device in the hands of anyone who can buy one, with some knowledge that some small fraction of those people will misuse the product. Holding Apple liable here would make no more sense than holding a car maker liable if one of its vehicles ended up being used in a car bombing (emphasis added).
To reiterate, the only situation we identified as plausibly giving rise to a serious question of liability involved Apple’s continuing to provide encryption services after a warrant had been delivered. We specifically distinguished that from the selling of the phone.
By this afternoon, The Intercept (“Fearless, Adversarial Journalism”) was on the case with an article headlined, “Obama Administration’s War Against Apple and Google Just Got Uglier.”
Neither of us has ever served in the Obama Administration.
But never mind that. The Intercept is fearless and adversarial; its Twitter feed doesn’t boast that it’s truthful. And the article in question certainly was not truthful. In fact, the original version of it ended by misquoting a tweet Ben had sent so as to make it mean the precise opposite of what he said. Here’s what the tweet said:
@csoghoian I am not sure at all that Apple is not doing the right thing by encrypting end to end.
— Benjamin Wittes (@benjaminwittes) July 30, 2015
Here is how the article characterized it:
Wittes, while couching his post as a hypothetical, left little doubt about his personal sentiment. “All that said,” he and his coauthor wrote, “it’s a bit of a puzzle how a company that knowingly provides encrypted communications services to a specific person identified to it as engaged in terrorist activity escapes liability if and when that person then kills an American in a terrorist incident that relies on that encryption.”
The authors didn’t say what exactly they wanted Apple to do instead. Wittes tweeted after publishing the post that he is “not sure at all that Apple is doing the right thing by encrypting end to end” (emphasis added).
The Intercept later corrected the quotation and added the following notation: “Correction: An earlier version of this article misquoted Wittes’ tweet, mischaracterizing its meaning.” The funny thing is that because the article now quotes Ben correctly, it also now shows that there is, in fact, significant doubt about his “personal sentiment” and that the previous paragraph is dead wrong. Author Jenna McLaughlin, however, did not bother adjusting her statement that there was “little doubt” about what Ben thought.
In fact, the only real sentiment the article expresses is that the liability environment here is something of a “puzzle” and that it’s not wholly clear that Apple is free of exposure given the right (or wrong) constellation of facts that could arise if FBI Director Comey is right that ISIS is using end-to-end encrypted chat systems to plot attacks. That is our honest evaluation of the state of the law—an evaluation that either describes the law cogently or does not.
And interestingly, in the stream of bile directed our way today, there has been next to zero discussion of whether, in fact, it does describe the law cogently or not.
We say “next to zero” because Patrick Toomey of the ACLU did raise a very interesting point that bears flagging and further study. Responding to the post, he tweeted:
.@benjaminwittes Breathless italics aside, FISA requires carrier to continue providing services with "minimum of interference" 1805(c)(2)(b)
— Patrick Toomey (@PatrickCToomey) July 30, 2015
Leaving aside the question of whether italics can really be breathless, Toomey here has put his finger on an issue we had not considered. The statute to which he is referring, 50 USC § 1805(c)(2)(B), is a part of FISA and reads as follows:
An order approving an electronic surveillance under this section shall direct—
(B) that, upon the request of the applicant, a specified communication or other common carrier, landlord, custodian, or other specified person, or in circumstances where the Court finds, based upon specific facts provided in the application, that the actions of the target of the application may have the effect of thwarting the identification of a specified person, such other persons, furnish the applicant forthwith all information, facilities, or technical assistance necessary to accomplish the electronic surveillance in such a manner as will protect its secrecy and produce a minimum of interference with the services that such carrier, landlord, custodian, or other person is providing that target of electronic surveillance.
It also has an analogous provision with respect to Title III wiretaps, 18 USC § 2518(4):
An order authorizing the interception of a wire, oral, or electronic communication under this chapter shall, upon request of the applicant, direct that a provider of wire or electronic communication service, landlord, custodian or other person shall furnish the applicant forthwith all information, facilities, and technical assistance necessary to accomplish the interception unobtrusively and with a minimum of interference with the services that such service provider, landlord, custodian, or person is according the person whose communications are to be intercepted.
So maybe Apple gets around liability in the situation we envision by arguing that it was compelled by the court order not to interfere with service in the course of facilitating the surveillance.
The trouble with this argument is that in the factual scenario we envisioned, Apple is not able to comply with the order at all. And neither of these statutes envisions a situation where the service provider does not comply—and thus neither obviously regulates it. It’s clear that if Apple has the ability to decrypt the communications and provide them, it is obligated by orders under these laws to do so with a minimum of service interference. But what if Apple’s response is that it cannot technically comply at all? It is then left with the question of whether to continue providing service to someone it knows to be under active investigation.
Toomey’s point, however, has two additional wrinkles. First, because Apple presumably is able to provide the metadata (though not the contents) associated with the communications in question, these provisions might still apply—though that may depend on the authority under which the government has sought the metadata. Apple may well also argue that it has complied by providing the encrypted communications, even if they’re not in any way useful—thus triggering the obligation to do so with minimal interference.
Perhaps more importantly, these provisions may well provide Apple and the government a basis on which to agree in a given investigative context that Apple should not terminate service to a targeted user. If both parties agree, after all, that Apple is obliged to continue service as a result of these provisions—even if that’s not necessarily the best reading of the statute in question—that could be a powerful argument in subsequent civil litigation that Apple had no choice but to act as it did.
As we say, Toomey’s is an interesting and valuable point. That such a point was a singular event today raises the question of why the civil liberties community has such an attack mentality toward analysis that does not reflect its particular passions on the subject of encryption. We’re more than happy to be proven wrong in this particular analysis. And we appreciate that at least one person—if only one—took the time to engage with it, rather than simply hurling invective or just plain lying about what we said.