On Feb. 25, a three-judge panel of the U.S. Court of Appeals for the Second Circuit heard oral argument in Force v. Facebook, a case about whether Facebook can be held liable for the use of its platform to coordinate and encourage violent attacks by users linked to Hamas. Here at Lawfare, Benjamin Wittes and Zoe Bedell discussed Force when the lawsuit was first filed in 2016 and explained how it might succeed even when others like it had failed. In May 2017, the U.S. District Court for the Eastern District of New York dismissed Force and a companion case, Cohen v. Facebook, holding that Facebook was protected from liability under Section 230 of the Communications Decency Act (CDA). The plaintiffs appealed.
The plaintiffs in the case include the families of Yaakov Naftali Fraenkel, a 16-year-old who was kidnapped by Hamas operatives with his friends at a bus stop, shot at point-blank range in the back of the car and then dumped in a field in Hebron for the police to find nearly three weeks later; Taylor Force, a 29-year-old American MBA student and Army veteran stabbed along a boardwalk in Jaffa; Chaya Braun, a three-month-old thrown from her stroller and slammed into the pavement when a Hamas agent drove his car into a light rail station in Jerusalem; and Richard Lakin, a 76-year-old who was shot and stabbed while riding a public bus in Jerusalem. Menachem Mendel Rivkin, who was stabbed at a gas station north of Jerusalem, and his wife, Bracha Rivkin, are also parties to the suit. All of the victims were U.S. citizens. The plaintiffs allege that Facebook knowingly provided a platform for Hamas to organize, recruit, communicate and operate; that Facebook’s algorithms played a vital role in spreading incitements to violence; and that the use of Facebook by Hamas and individual terrorists was directly tied to the attacks and resulting deaths and injuries of the plaintiffs’ family members.
District Judge Nicholas Garaufis dismissed the lawsuit without prejudice in May 2017. He found that personal jurisdiction over Facebook was properly based on the Force complaint’s Anti-Terrorism Act (ATA)-based claims and pendent jurisdiction, but determined that the plaintiffs’ claim must be dismissed because Section 230(c)(1) of the CDA provides Facebook an affirmative defense to liability for harm caused by speech it hosts on its platform but does not itself produce. To reach that conclusion, Judge Garaufis analyzed both the text of the statute and whether the relevant conduct occurred within the United States or extraterritorially. Courts by default assume that statutes are intended to cover conduct only within the United States and require that Congress clearly demonstrate its intent to extend a law’s coverage extraterritorially. Judge Garaufis determined that Section 230(c)(1) was not intended to apply extraterritorially, but concluded that this was of no consequence because the relevant location for purposes of liability immunity is not where the harmful conduct took place but where the litigation in the matter proceeds, that is, within the United States. “As the situs of the litigation is New York, the relevant ‘territorial events or relationships’ occur domestically.” Accordingly, Facebook was entitled to the immunity provided for in Section 230(c)(1).
On appeal to the Second Circuit, the plaintiffs asserted three things: (1) that the CDA was “never intended to confer immunity for supporting terrorism” and should not shield Facebook from liability “under the ATA for supporting, aiding and abetting, and conspiring with Hamas”; (2) that the “CDA cannot be applied extraterritorially to ATA claims arising overseas that have no U.S. contacts” and “does not apply here given that Facebook’s duties arise not as a publisher of Hamas’s substantive communications” but rather as a provider of material support in the form of communication services to Hamas; and (3) that “[e]ven if CDA immunity were a valid defense, Facebook asserts it prematurely” and must instead present the affirmative defense in its answer to the complaint.
The defendants responded that Section 230 clearly bars the cause of action in this case because Facebook is a provider of an “interactive computer service,” there are “no plausible allegations showing that Facebook itself participated in any way in creating or developing the terrorist content at issue, and Facebook’s use of algorithms that allow users to access content and make connections does not render it a content ‘developer’ for purposes of the CDA.” Additionally, echoing Judge Garaufis, the defendants asserted that the “case does not implicate the presumption against extraterritoriality because the purpose of § 230(c)(1) is to bar certain lawsuits, and the application of that bar in this case occurs domestically. Moreover, there is nothing in law or logic that exempts civil ATA or foreign-law claims from CDA immunity.” The defendants also argued that further deficiencies in the amended complaint, which the district court did not address, provided alternative grounds for dismissal:
All four of Appellants’ claims under the ATA fail as a matter of law because the Amended Complaint does not plausibly allege that Appellants’ injuries occurred “by reason of” Facebook’s activities. 18 U.S.C. § 2333(a). Appellants’ claim for direct ATA liability is also legally deficient because Facebook did not knowingly commit an “act of international terrorism.” Id. Finally, Appellants’ claim for secondary liability under the ATA fails because the Amended Complaint does not plausibly allege that Facebook either aided and abetted terrorist organizations or in any way conspired with them. See 18 U.S.C. § 2333(d) (2016).
The Electronic Frontier Foundation filed an amicus brief in support of Facebook, arguing (1) that the First Amendment prevents the imposition of liability on Facebook for hosting content about terrorism; (2) that imposing liability on Facebook would violate users’ First Amendment rights to receive and gather information about terrorism; (3) that Section 230 of the CDA is essential to internet users’ freedom of expression online; and (4) that failure to apply Section 230 and the First Amendment would harm internet users’ free speech and platforms’ willingness to host that speech.
Chief Judge Robert Katzmann and Judges Christopher Droney and Richard Sullivan heard the case.
Arguing first for the plaintiffs-appellants, attorney Meir Katz explains that there are seven distinct issues Facebook has to win on: (1) that the CDA applies extraterritorially; (2) that Facebook has not raised the Section 230(c)(1) defense prematurely; (3) that Facebook’s active role in facilitating connections between individuals and groups does not amount to development of Hamas content; (4) that the plaintiffs’ ATA claims somehow relate to Facebook’s duties as publisher; (5) that the ATA doesn’t civilly enforce criminal counterterrorism statutes; (6) that the affirmative defense in common law does not yield to statute; and (7) that the later-enacted ATA does not control when it conflicts with the CDA.
First, Katz says, Congress does not legislate extraterritorially unless it clearly says so. Nothing in the CDA evidences Congress’s intent to apply it extraterritorially, and all conduct relevant to the cause of action occurred overseas (i.e., Hamas-connected violence). The judges push back and ask why the liability immunity provision Section 230(c)(1) should not be understood to regulate the wholly domestic conduct of filing a lawsuit, given that the purpose of the exception for interactive computer service providers like Facebook is to ensure they are not subject to domestic litigation. Since the provision is concerned with the plaintiffs’ attempt to hold Facebook liable in U.S. court rather than with the overseas conduct that resulted from Hamas’s use of Facebook, why would there be an extraterritoriality problem? Katz responds that the CDA defense to the cause of action should not be assessed independently of the cause of action. What matters is where the harm giving rise to the cause of action occurred, not where the lawsuit was filed. Section 230(c)(1) is a defense on the merits to the assertion of liability, more closely akin to some sort of qualified immunity than a statute of limitations.
The court asks how Katz would respond to the view that the plaintiffs’ analysis on extraterritoriality creates a gaping loophole in the regulatory structure that Congress set up in the CDA, such that providers of interactive computer services are liable for content posted by third parties so long as the content is posted and displayed outside the U.S. Katz notes that the purpose of the CDA was to protect people, particularly children, inside the United States, and to incentivize platforms to clean up the internet themselves. It was not intended to protect people all over the world from objectionable content, he argues. The analysis, he says, would be different for a common law cause of action, because the presumption against extraterritoriality applies only to congressional enactments.
Next, Katz argues that Facebook improperly raised the Section 230 affirmative defense in a motion to dismiss for failure to state a claim rather than in its answer to the complaint. Only defenses that are inevitably based on the complaint can be raised in a Rule 12(b)(6) motion, he says, and there are no clear allegations in this case that make the defense inevitable. The plaintiffs are not attributing third-party content to Facebook as a speaker, but rather saying that Facebook took an active role in facilitating Hamas’s communication and organizing for violence, so it is not clear that Section 230(c)(1) applies. It is Facebook’s burden, according to Katz, to allege sufficient facts to make the defense appropriate, and it has not done so: Facebook made only legal arguments about its liability under the CDA, not factual arguments about why it should not be deemed liable under any of the plaintiffs’ theories. The court asks what Facebook’s duty is besides that of a publisher or speaker, and Katz says that Facebook also has a duty not to support terrorist organizations, independent of anything they say or do not say. They are subject to primary liability under the ATA for contributing to the harm the plaintiffs suffered, Katz argues, by giving Hamas a platform and facilitating networking.
The court asks Katz for more information about what Facebook’s algorithms allegedly do and why the plaintiffs argue Facebook is responsible for Hamas’s use of the platform and ultimately for the violence suffered by the plaintiffs’ family members. Katz responds that Facebook actively facilitates communication between users of the platform by building algorithms that are designed to bring people with similar traits and interests into contact and by generating targeted content recommendations for each user. The company is not a passive actor, he says, but rather constantly seeks to collect ever more data on its users and harness that data to deliver content and create connections that users want. In the context of Hamas, according to Katz, Facebook’s algorithms allow Hamas and other terrorist groups to reach and influence people worldwide and enable individuals inclined toward violence to find community and receive direction. That sort of networking, where individuals are mostly passive and algorithms do all the work to connect them with others who are like-minded, is not possible except on internet platforms such as Facebook. Consequently, Katz argues, Facebook is responsible for the harm that resulted from Hamas’s use of the platform to incite violence.
Asked whether a prosecution against Facebook for providing material support to Hamas would be possible, Katz responds that the plaintiffs have not tried to persuade the Justice Department to bring such a prosecution and that the point of the ATA is to allow private parties who are directly affected by terrorist violence to bring suit themselves. The government could have brought an action against Facebook but did not. Turning to the plaintiffs’ aiding and abetting theory, the court asks how the plaintiffs would distinguish Facebook from a cell phone service provider whose service terrorists used to plan attacks, or an international media service like CNN that broadcasts segments about terrorist groups and recruitment. Wouldn’t holding Facebook liable also open up those sorts of companies to liability? Katz says Facebook is distinguishable because it knew Hamas was using its platform and actively facilitated Hamas’s outreach. It was not passively providing a service or even information, but rather actively fostering communications between people who then caused harm in the real world. The latter, not the former, rises to the level of material support, he says.
Responding for Facebook, attorney Craig Primis argues that the complaint walks right into the core protection of Section 230(c)(1) and that Facebook cannot be held liable for its decision not to remove Hamas content from the platform. The district court was correct to recognize Facebook’s immunity under the CDA and dismiss the case at the 12(b)(6) stage, he says, and that result was consistent with the decisions of courts across the country addressing analogous claims. Chief Judge Katzmann asks Primis about the implications of WesternGeco LLC v. ION Geophysical Corp., the Supreme Court’s most recent decision dealing with the presumption against extraterritoriality. The Supreme Court, Katzmann says, determined that the focus of the statute in WesternGeco—the Patent Act’s general damages provision, 35 U.S.C. § 284—was the conduct for which the statute sought to provide a remedy, that is, the infringement. Why, then, shouldn’t the court in this case view the focus of Section 230(c)(1) as the conduct for which the statute seeks to give immunity, that is, Facebook’s provision of information to terrorists overseas? Primis replies that this case does not fit neatly into the Supreme Court’s two-step approach to extraterritoriality, which asks: (1) whether the presumption against extraterritoriality has been rebutted; and (2) whether the case involves a domestic application of the statute. In WesternGeco, Primis argues, the Supreme Court said that courts can go to the second step without dwelling on the first. Immunity, such as that provided by Section 230(c)(1), is a domestic act, and lawsuits against interactive service providers, like the one brought by the plaintiffs, are the conduct being regulated. The findings and policy provisions of the CDA make clear that the purpose of the statute is to promote a vibrant, open and minimally regulated internet in the U.S. for the benefit of Americans, Primis argues. The intent of the provision is to protect companies that distribute content, not to regulate overseas conduct.
The court next asks where Facebook’s alleged material support to Hamas takes place, and Primis responds that Facebook believes the plaintiffs have not adequately pleaded material support allegations. In Linde v. Arab Bank, the Second Circuit stated that mere allegations of material support to terrorism are not sufficient to state a claim of direct liability under the ATA—additional facts showing a direct connection to violent acts and efforts to influence the government are necessary. The complaint, in Facebook’s view, does not allege sufficient facts about Facebook’s support for Hamas and is not clear about where the support takes place. Facebook makes extensive efforts to remove terrorist content from its platform but it can’t catch everything, Primis says, which is why Congress provided immunity in Section 230(c)(1).
The court also questions whether it is an issue that Facebook’s algorithms facilitate connections and create networks—in this case, violent ones—that likely would not exist but for the platform. Primis says that Facebook’s algorithms do make user content more accessible and useful to other people on the platform, but that that is no different from, for example, Google’s or LinkedIn’s algorithms. The sorting and dissemination of content is the sort of editorial decision-making by service providers that courts have consistently recognized as protected by Section 230. The plaintiffs’ claim is barred by the CDA, Primis argues, because it really is about the objectionable third-party content that Facebook hosts, not about anything Facebook itself did. Prioritization and presentation of third-party content does not equal content creation.
The court asks whether the liability calculus would be different if representatives of Facebook decided to affirmatively seek out terrorist groups like Hamas, tout the effectiveness of the platform and encourage them to use Facebook. Primis responds that the ATA claim might be stronger in that case, but regardless Facebook would still be immune under the CDA because it is not creating content, merely publishing it.
Primis then turns to the ATA claims and argues that the plaintiffs failed to allege sufficient facts to show that Facebook’s algorithms played any role in the chain of causation, that is, that Facebook was used to support, finance or otherwise facilitate the five terrorist attacks in which the plaintiffs’ family members were harmed. According to Primis, Facebook cannot be held liable for aiding and abetting terrorism because there is no indication that Facebook was used in connection with a particular attack.
The court returns briefly to the issue of extraterritoriality and the focus of Section 230(c)(1), and whether every liability-limiting statute should be exempt from the presumption against extraterritoriality. Primis says statutes have to be read individually, pursuant to the Supreme Court’s framework, and that finding that Section 230(c)(1)’s focus is domestic does not create a slippery slope whereby all liability provisions must be read the same way.
Finally, Katz provides a few rebuttal remarks. He notes that the complaint states clearly that Facebook’s services were provided in Israel and the West Bank, and that Facebook takes an active role in providing a service to its users. The relevant acts of terrorism for ATA purposes were the attacks by Hamas members against the plaintiffs’ family members, and those acts were facilitated by Facebook’s provision of networking services to Hamas. The plaintiffs’ complaint alleges sufficient facts to show that Facebook’s conduct satisfies the criteria for ATA liability, and the case should therefore be allowed to proceed.
The Second Circuit reviews the district court’s dismissal of the plaintiffs’ complaint de novo, meaning that no deference is owed to the district court’s ruling. If the appeals court disagrees with Judge Garaufis’s determination that Facebook is protected from liability by CDA Section 230(c)(1) and believes that the plaintiffs adequately stated a claim against Facebook under the ATA, it can reverse the dismissal and remand to the district court for further proceedings. If, however, the Second Circuit agrees with the dismissal, whether for the reasons the district court gave or for alternative reasons offered by the defense, it will affirm.