An Out of the Box Approach to the Going Dark Problem

By Benjamin Wittes
Tuesday, February 2, 2016, 11:49 AM

I’ve been thinking about out-of-the-box ways to address the “Going Dark” problem. Specifically, I am looking for approaches that potentially thread the needle between an industry allergic to any technological mandates, civil libertarians and cryptographers deeply committed to strong encryption without back doors, and a government alarmed by the rise of encrypted communications it cannot capture.

Here is one that just might thread that needle.

The following idea grows out of two lines of thought I've been working on in previous Lawfare posts. One line deals with how the “technical assistance” provisions of FISA and Title III might assist the government in addressing the going dark problem, an issue I discuss here. The other line involves the application of § 230 of the Communications Decency Act (47 U.S.C. § 230) to civil suits against the companies that provide service to ISIS and other bad guys, an issue Zoe Bedell and I discuss here. Those posts may be useful and important background to what follows.

As Zoe and I explained, civil immunity under CDA § 230 is a big deal for internet companies. The provision explicitly eliminates carrier liability for content generated by third parties. The language of § 230(c)(1) is broad and has been interpreted more broadly still: “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker” of content created by a third party. Courts have interpreted § 230(c)(1) as a substantive provision immunizing providers and users of online services from civil suits based on third-party-generated content, whether or not the website otherwise tried to remove or police that content.

How important is CDA § 230 to the internet companies? Here’s the Electronic Frontier Foundation’s explanation:

[O]nline intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. The protected intermediaries include not only regular Internet Service Providers (ISPs), but also a range of "interactive computer service providers," including basically any online service that publishes third-party content. Though there are important exceptions for certain criminal and intellectual property-based claims, CDA 230 creates a broad protection that has allowed innovation and free speech online to flourish.

This legal and policy framework has allowed for YouTube and Vimeo users to upload their own videos, Amazon and Yelp to offer countless user reviews, craigslist to host classified ads, and Facebook and Twitter to offer social networking to hundreds of millions of Internet users. Given the sheer size of user-generated websites (for example, Facebook alone has more than 1 billion users, and YouTube users upload 100 hours of video every minute), it would be infeasible for online intermediaries to prevent objectionable content from cropping up on their site. Rather than face potential liability for their users' actions, most would likely not host any user content at all or would need to protect themselves by being actively engaged in censoring what we say, what we see, and what we do online. In short, CDA 230 is perhaps the most influential law to protect the kind of innovation that has allowed the Internet to thrive since 1996.

So here's my thought: Why should Congress not condition § 230 immunity on whether companies maintain the technical capacity to deliver interpretable signal in response to lawful wiretap orders? Congress would not need to specify how the companies comply, much less direct that they weaken encryption. If companies want instead to facilitate device hacking—the preferred means of attack for many cryptographers—that’s fine. If they prefer to come up with some other method, that’s fine too.

The basic policy point would be for Congress to say to the internet companies: Don’t expect us to protect you from liability for third-party conduct if you actively design your systems to frustrate government efforts to monitor that third-party conduct.

This all requires only modest changes in the actual law. Congress would, first, need to amend the technical assistance language of both FISA and Title III—the language that requires companies to give reasonable help to authorities trying to effectuate a wiretap—to clarify that any company benefiting from CDA § 230 immunity is also obliged by the technical assistance requirement. That could be accomplished by adding the text in bold, using the FISA technical assistance language as a model:

upon the request of the applicant, a specified communication or other common carrier, an interactive computer service as that term is defined in 47 U.S.C. § 230(f)(2), landlord, custodian, or other specified person, or in circumstances where the Court finds, based upon specific facts provided in the application, that the actions of the target of the application may have the effect of thwarting the identification of a specified person, such other persons, furnish the applicant forthwith all information, facilities, or technical assistance necessary to accomplish the electronic surveillance in such a manner as will protect its secrecy and produce a minimum of interference with the services that such carrier, landlord, custodian, or other person is providing that target of electronic surveillance.

Second, Congress would have to amend 47 U.S.C. § 230(c)(1) to add something along the lines of the text in bold:

No provider or user of an interactive computer service that maintains the technical capacity to deliver interpretable signal in compliance with orders for electronic surveillance under [FISA] and [Title III] shall be treated as the publisher or speaker of any information provided by another information content provider.

Note that this approach does not make a company liable for the activity of third parties; it does not create a cause of action of any kind. It merely deprives companies of the nuclear response they currently deploy against any suggestion they are liable for the online activities of their users. A company like Apple may well decide that—given that it doesn’t host a giant user-contributed web site—its brand reputation still requires that it provide end-to-end encryption for iMessage and FaceTime and that it can rely on defenses outside of § 230 against suits over what users do with its services. Note as well that the proposal would have almost no implications for companies like Twitter, where virtually all user activities are conducted in public—or semi-public—and are therefore easily subject to collection under court order.

I can also imagine a somewhat narrower version of the same approach, one focused on depriving a company of CDA § 230 immunity only in the context of specific instances in which that company proved incapable of complying with a lawful wiretap order. Something like this:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider except in circumstances in which the provider fails to deliver interpretable signal in response to a valid wiretap order or order for the production of communications content under [FISA] or [Title III].

Either way, the idea is simply for Congress to condition its great gift to the providers on their cooperation on going dark. This approach doesn't require that anyone escrow keys or create back doors. It allows providers to use whatever encryption they deem appropriate. And it doesn't even require that service providers maintain the ability to comply with wiretaps at all; they are free, if they choose, to take their chances in court against the plaintiffs’ bar—just like a conventional publisher.

It merely makes the service provider immunity on which the modern internet is built contingent on the companies’ cooperation in maintaining governability. Through it, Congress can emphasize that CDA § 230 was not a gratuity but a policy calculated to advance the development of the internet. That policy decision is subject to change to the extent necessary to prevent the internet from becoming ungovernable.