Don’t look now, but a new front is about to open up in the Second Crypto War. The Hill reports that the encryption legislation being drafted by Senate Intelligence Committee leaders Richard Burr and Dianne Feinstein could come as soon as this week:
“I’m hopeful,” Sen. Richard Burr (R-N.C.) told The Hill before a Wednesday vote.
The long-awaited bill—in the works since last fall’s terror attacks in Paris and San Bernardino, Calif.—is expected to force companies to comply with court orders seeking locked communications.
The FBI and law enforcement have long warned that encryption is making it more difficult to uncover criminal and terrorist plots.
Burr, who chairs the Senate Intelligence Committee, has been drafting legislation to address the issue with Sen. Dianne Feinstein (D-Calif.), the committee’s ranking member.
Feinstein told The Hill she passed the text along earlier this week to White House chief of staff Denis McDonough.
“My hope is since I was the one that gave it to Denis McDonough, they will take a look at it and let us know what they think,” she said.
The Obama administration’s response will determine the bill’s timing, Burr added.
The legislative front opens even as the judicial fronts are multiplying like rabbits. The New York Times reports that in addition to the Apple litigations over locked iPhones, the Justice Department may soon be facing a showdown with WhatsApp (owned by Facebook) over wiretaps:
As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption.
The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
To understand the battle lines, consider this imperfect analogy from the predigital world: If the Apple dispute is akin to whether the F.B.I. can unlock your front door and search your house, the issue with WhatsApp is whether it can listen to your phone calls. In the era of encryption, neither question has a clear answer.
Some investigators view the WhatsApp issue as even more significant than the one over locked phones because it goes to the heart of the future of wiretapping. They say the Justice Department should ask a judge to force WhatsApp to help the government get information that has been encrypted. Others are reluctant to escalate the dispute, particularly with senators saying they will soon introduce legislation to help the government get data in a format it can read.
The Feinstein-Burr bill is a significant event for a number of reasons. First off, the Intelligence Committee’s interest, and the apparent bipartisan unity of its leadership, will tend to pull the center of gravity of the discussion away from the tech-sector friendly judiciary committees and into an arena traditionally more sensitive to the government’s concerns. Moreover, the legislative arena potentially offers more comprehensive and creative options than does litigation. After all, a court is only empowered to address the issue before it, which means it can only contemplate pieces of the Going Dark problem at any one time. Congress, on the other hand, can think holistically about it, tackling—for example—the relationship between devices and apps and the relationship between foreign companies and domestic ones.
In that comprehensiveness, there is opportunity: Whatever one’s view of how to define the problem, Congress can address the whole thing. But there is also danger: Congress can make a mess of the whole thing too.
Importantly, the fact that Congress is about to get involved does not answer the question of the nature of its involvement. While the Intelligence Committee leadership might be expected to act with the government’s concerns in mind, Congress could ultimately try to resolve the dispute in favor of tech companies, in favor of the FBI, or in favor of one in certain areas and the other in certain other areas. It could also try to split the baby.
What follows are five general approaches that Congress might take towards intervention in this space. They appear in ascending order of regulatory intrusiveness—that is, from the most friendly to Apple and the other tech companies to the most solicitous of law enforcement’s concerns.
They could, of course, be used in combination and interaction with one another. A reasonable legislature might, for example, choose to take a different approach to wiretapping than it does towards data at rest. At least in this initial stab, I’m approaching the subject from a high altitude. I may well return to each of these in greater depth in the coming weeks.
Approach #1: Protect the Tech
Call this one the Tim Cook’s Dream option. Currently, the tech companies have no CALEA-like obligations to build their systems to facilitate wiretapping, and CALEA in any event exempts telecommunications carriers from any obligation to help with decryption. So the current set of issues arises under the All Writs Act for data at rest on devices, and under Title III and FISA for data in motion between devices. (Title III and FISA have substantially similar technical assistance provisions.) If Congress wants to resolve this issue decisively in Silicon Valley’s favor, it need only pass a brief statute clarifying that neither the All Writs Act nor the technical assistance provisions of FISA and Title III may be used to require information service providers to undermine or circumvent their own security systems. This would be a very easy law to write for a Congress that had the votes to pass it.
Approach #2: Options and Notice
Here’s an approach that I haven’t seen proposed anywhere, one based on consumer options and transparency: What if Congress required that encrypted services (storage or communications services or both) have what one might call an “Emergency Access Mode” as an option available to consumers?
On initiation of new service—starting the app for the first time or opening the device—the customer would be given a choice to use this mode, rather than simply defaulting to end-to-end encryption. A device in Emergency Access Mode would have the strongest security the company could provide subject to one condition: The company would retain the ability to unlock the device—or decrypt the communications—on the death or incapacity of the user at the request of next of kin or at the request of law enforcement with appropriate legal process. Jim Comey could go on a speaking tour around the country encouraging people to avail themselves of this mode, raising awareness of cases like that of Brittany Mills—the pregnant Louisiana murder victim whose locked iPhone contains her diary and perhaps the only clues to her murder.
Consumers would retain the option of picking the no-exception, full-security, warrant-proof option, and providers would be free to offer that option too. But Congress might also require that they then disclose, say, to the Department of Homeland Security which consumers are using it.
The combination of these two approaches could, I suspect, be powerful. A great many people use the highest security available simply because it is the default; warned of the possible risks and offered a meaningful alternative of strong security with an emergency exception, many would choose that alternative. Similarly, a data stream identifying users of the strongest security might not be useful in and of itself. Cross-reference it against other data streams (think of the no-fly list or the sex offender registries), however, and it could be a useful tool—particularly if a lot of normal people activated the Emergency Access Mode.
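To make the Emergency Access Mode idea concrete, here is a minimal sketch of the underlying key-escrow pattern: each message is encrypted under a fresh data key, and only when the user has opted in does the provider retain a wrapped copy of that key to surrender under valid legal process. All names are hypothetical, and the stream cipher here is a toy illustration, not real cryptography; an actual system would use vetted primitives.

```python
# Toy sketch of an opt-in "Emergency Access Mode" via key escrow.
# ASSUMPTIONS: all function names are illustrative; the XOR/SHA-256
# keystream below is a toy construction, not production crypto.
import hashlib
import secrets


def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]


def xor(data: bytes, key: bytes) -> bytes:
    """XOR data against a keystream derived from key (encrypt/decrypt)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


def encrypt_message(plaintext: bytes, escrow_key=None):
    """Encrypt under a fresh data key.

    If the user opted into Emergency Access Mode, escrow_key is the
    provider's key, and a wrapped copy of the data key is retained.
    Otherwise no one but the user can decrypt.
    """
    data_key = secrets.token_bytes(32)
    ciphertext = xor(plaintext, data_key)
    wrapped = xor(data_key, escrow_key) if escrow_key is not None else None
    return ciphertext, data_key, wrapped


# Default mode: no escrow copy exists, so a warrant served on the
# provider yields nothing usable.
ct, user_key, wrapped = encrypt_message(b"diary entry", escrow_key=None)
assert wrapped is None

# Emergency Access Mode: the provider recovers the data key from its
# escrow store and decrypts in response to valid legal process.
escrow = secrets.token_bytes(32)
ct, user_key, wrapped = encrypt_message(b"diary entry", escrow_key=escrow)
recovered_key = xor(wrapped, escrow)
assert xor(ct, recovered_key) == b"diary entry"
```

The design choice the sketch highlights is that the escrow copy exists only for users who opted in; for everyone else, the provider holds nothing it could be compelled to produce.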
Approach #3: Define Technical Assistance
A middle ground possibility would be to precisely define in statute what kinds of technical assistance different types of providers are required to give to authorities under different circumstances.
As noted above, the obligation of private entities to provide varying degrees of technical assistance to law enforcement is currently codified in Title III, CALEA, and FISA. While the precise scope of that assistance is flexible under the law, none of these provisions extends to the type of assistance the FBI is requesting of Apple and similarly situated companies. That statutory gap is the core of the current controversy, as Apple objects to the application of the gap-filling All Writs Act.
But Congress always retains the power to fill the gaps itself, answering the questions of who has to do what and when. And it could do this with varying levels of aggressiveness.
This option is essentially the same as Approach #1, except that Congress would not exempt the companies from any and all obligations. The idea, rather, is that companies would have no prospective obligation to design their systems to be capable of serving warrants, as the telecommunications companies do under CALEA. A company like Apple could design a system specifically aimed at preventing all exceptional access, including by law enforcement. But when presented with a valid warrant, it would have some degree of obligation to assist law enforcement in circumventing its security systems, where possible, in order to effectuate that warrant.
Of course, the nature of the obligation would be where the rubber hits the road in this model. But note that in pursuing this route, Congress is empowered to pick which options to put on or take off the table. While legislative language requires flexibility in order to be functional over time, Congress could easily draft language providing that the provision shall not be read to require X, Y, or Z. Should a company ever be required to sign government code, send a “malicious” software update, or turn over source code? If these are questions for Congress, then a carefully scoped technical assistance provision is likely the most tailored vehicle for addressing them.
Approach #4: Set the Trial Lawyers on Them
A few weeks ago, I floated what I termed an “out of the box” approach to the Going Dark problem:
Civil immunity under CDA § 230 is a big deal for internet companies. The provision explicitly eliminates carrier liability for content generated by third parties. The language of § 230(c)(1) is broad and has been interpreted more broadly still: “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker” of content created by a third party. Courts have interpreted § 230(c)(1) as a substantive provision immunizing providers and users of online services from civil suits based on third-party-generated content, whether or not the website otherwise tried to remove or police that content.
So here's my thought: Why should Congress not condition § 230 immunity on whether companies maintain the technical capacity to deliver interpretable signal in response to lawful wiretap orders? Congress would not need to specify how the companies comply, much less direct that they weaken encryption. If companies want instead to facilitate device hacking—the preferred means of attack for many cryptographers—that’s fine. If they prefer to come up with some other method, that’s fine too.
The basic policy point would be for Congress to say to the internet companies: Don’t expect us to protect you from liability for third-party conduct if you actively design your systems to frustrate government efforts to monitor that third-party conduct.
The government has a lot of other carrots and sticks, short of regulatory mandates, it could apply to this situation. Government purchasing and contracting can be a powerful lever, for example.
Congress could simply make the judgment that the many goodies the federal government throws the way of Silicon Valley (protection from the trial lawyers chief among them) are not unconditional gratuities and can be linked to cooperation in the investigative space. The idea is not without controversy.
Approach #5: The Full Comey
Finally, Congress could simply legislate an outcome, a performance standard—requiring companies, by whatever means, to retain the capacity to deliver decrypted signal (or decrypted stored data). Technology companies and civil liberties groups talk about this as unthinkable. But it’s really not unthinkable. The telecommunications companies have lived with CALEA for more than 20 years. And there are lots of industries that face regulatory mandates designed to facilitate surveillance of their customers. Banks have obligations to report “suspicious activity” and thus must maintain the capacity to notice it. The internet companies themselves have obligations under the FISA Amendments Act.
This kind of performance standard appears to be the direction the French government is going. And while there may be lots of reasons to oppose this sort of technical mandate, it is far less of a novelty in U.S. law than people seem to think.
There are, undoubtedly, other legislative approaches available to a legislature that wants to be imaginative. The key thing, in my view, is for the different parties to this conversation to lay out clearly what their legislative objectives really are and what strategies they would like Congress to adopt to achieve those objectives.