Force v. Facebook, Inc.

United States Court of Appeals for the Second Circuit
934 F.3d 53 (2019)
Rule of Law:

Section 230(c)(1) of the Communications Decency Act shields an interactive computer service from civil liability for third-party content, even when the service uses algorithms to match users with that content, because such algorithmic curation is a protected, traditional editorial function of a publisher.


Facts:

  • Between 2014 and 2016, Hamas, a U.S.-designated foreign terrorist organization, conducted several terrorist attacks in Israel.
  • These attacks resulted in the deaths of American citizens Yaakov Naftali Fraenkel, Chaya Zissel Braun, Richard Lakin, and Taylor Force, and the serious injury of Menachem Mendel Rivkin.
  • Facebook operates a social network where users post content and connect with others, and where algorithms curate a personalized 'newsfeed' and provide 'friend suggestions' based on user data and behavior.
  • Hamas and its members maintained pages on Facebook, which they used to post content encouraging and celebrating terrorist attacks, including specific methods like car-ramming and stabbings.
  • The perpetrators of the attacks on the victims allegedly viewed this Hamas-related content on Facebook prior to the attacks.
  • Plaintiffs allege that Facebook's algorithms facilitated the spread of this content by actively suggesting Hamas-related pages, groups, and friend connections to users who were most likely to be interested in them.
  • Despite its community standards prohibiting terrorist organizations, Facebook allegedly failed to remove openly maintained pages belonging to Hamas leaders and members.

Procedural Posture:

  • Plaintiffs, victims and relatives of victims of Hamas terrorist attacks, sued Facebook, Inc. in the U.S. District Court for the Southern District of New York.
  • On consent of the parties, the action was transferred to the U.S. District Court for the Eastern District of New York.
  • Facebook moved to dismiss the First Amended Complaint for failure to state a claim, arguing it was immune from liability under Section 230(c)(1) of the Communications Decency Act.
  • The district court granted Facebook's motion to dismiss, holding that Section 230(c)(1) barred plaintiffs' claims.
  • The district court then denied plaintiffs' subsequent motion for leave to file a second amended complaint, with prejudice, finding that amendment would be futile.
  • Plaintiffs, the appellants, appealed the district court's final judgment to the U.S. Court of Appeals for the Second Circuit, with Facebook as the appellee.

Issue:

Does Section 230(c)(1) of the Communications Decency Act immunize an interactive computer service provider from liability under federal anti-terrorism statutes for allegedly providing a platform used by a terrorist organization, where the provider's algorithms suggested the organization's content and connections to other users?


Opinions:

Majority - Droney, J.

Yes, Section 230(c)(1) immunizes Facebook from liability because the plaintiffs' claims impermissibly treat Facebook as the 'publisher' of information provided by Hamas. The court reasoned that 'publisher' functions, which are protected by Section 230, include traditional editorial decisions like selecting, arranging, and distributing content. The use of algorithms to personalize newsfeeds and suggest friends or content is a modern, automated form of this protected editorial function. Holding otherwise would eviscerate Section 230, as virtually all online platforms organize third-party content. Furthermore, Facebook does not qualify as an 'information content provider' of Hamas's content under the 'material contribution' test, because it did not contribute to the underlying unlawfulness of the content itself; its algorithms were content-neutral and did not alter the information provided by Hamas. The court also rejected arguments that other statutes, like the Anti-Terrorism Act, implicitly repealed Section 230 or that its application was barred by exceptions for criminal law enforcement or by the presumption against extraterritoriality.


Concurring-in-part-and-dissenting-in-part - Katzmann, C.J.

No, Section 230(c)(1) should not immunize Facebook for the use of its friend- and content-suggestion algorithms. Chief Judge Katzmann argued that when Facebook's algorithms proactively suggest connections and build social networks, Facebook goes beyond the traditional editorial functions of a publisher. This 'matchmaking' constitutes Facebook's own affirmative conduct and conveys its own message ('you will like this person/group'), rather than simply publishing third-party content. The legislative history of Section 230 shows it was narrowly intended to encourage the filtering of obscene content, and applying it to immunize platforms that algorithmically connect terrorists is a dangerous 'mission creep' unforeseen by Congress. Therefore, claims based not on the third-party content itself but on the platform's independent act of creating connections between users should not be barred by Section 230.



Analysis:

This decision solidifies and expands the formidable immunity granted to online platforms under Section 230, extending protection to cover sophisticated, algorithmic content curation. By defining algorithmic recommendations as a core 'publisher' function, the court makes it exceedingly difficult to hold platforms liable for harm caused by third-party content, even when their own technology actively promotes it. This precedent significantly raises the bar for plaintiffs seeking to overcome Section 230 immunity, requiring them to show the platform was responsible for the creation or development of the unlawful content itself, not merely its automated distribution. The ruling reinforces the broad interpretation of Section 230 and places the onus on Congress to amend the statute if it wishes to impose greater responsibility on tech companies for the content they amplify.
