Tawainna Anderson v. TikTok, Inc.
Precedential (2024)
Rule of Law:
The immunity provided by Section 230 of the Communications Decency Act (CDA) does not apply to claims predicated on an interactive computer service's own expressive activity, such as an algorithm that curates and recommends third-party content to users, because such algorithmic promotion constitutes the platform's 'first-party speech' rather than 'information provided by another.'
Facts:
- Nylah Anderson, a ten-year-old, was a user of the TikTok video-sharing social media platform.
- TikTok's algorithm curated and recommended a tailored compilation of videos for Nylah's "For You Page" (FYP) based on factors including her age, demographics, and online interactions.
- A video depicting the "Blackout Challenge," which encourages viewers to record themselves engaging in self-asphyxiation, was among the content recommended by TikTok's algorithm to Nylah's FYP.
- After watching the Blackout Challenge video, Nylah attempted to replicate the conduct depicted.
- Nylah unintentionally hanged herself while attempting the challenge and died of asphyxiation.
- Tawainna Anderson is Nylah's mother and the administratrix of her estate.
Procedural Posture:
- Tawainna Anderson, individually and as administratrix of Nylah Anderson’s estate, sued TikTok, Inc. and ByteDance, Inc. in the United States District Court for the Eastern District of Pennsylvania, asserting claims including strict products liability and negligence.
- The District Court dismissed Anderson's complaint, holding that TikTok was immune from liability under § 230 of the Communications Decency Act.
- Anderson appealed the District Court's order of dismissal to the United States Court of Appeals for the Third Circuit.
Issue:
Does Section 230 of the Communications Decency Act immunize an interactive computer service from liability when its algorithm curates and recommends third-party content, such as a harmful challenge video, to a user's 'For You Page,' leading to injury?
Opinions:
Majority - Shwartz
No, Section 230 does not immunize TikTok from liability for harms resulting from its algorithmic recommendations of third-party content. The court reasoned that CDA § 230(c)(1) immunizes interactive computer services only when they are sued for 'information provided by another,' meaning third-party speech. However, when a platform's algorithm curates and recommends third-party content, thereby making 'editorial judgments' about 'compiling the third-party speech it wants in the way it wants,' this activity constitutes the platform's own 'expressive product' or 'first-party speech,' as established by the Supreme Court in Moody v. NetChoice, LLC. Since TikTok's FYP algorithm recommended the Blackout Challenge to Nylah, this was TikTok's own expressive activity, forming the basis for Anderson's claims, and therefore Section 230 does not bar those claims. The court clarified that its conclusion specifically applies to algorithmic promotion not contingent upon specific user input, distinguishing it from merely providing a search function.
Concurring in the judgment in part and dissenting in part - Matey
Yes, Section 230 does immunize TikTok from liability for merely hosting Blackout Challenge videos, but no, it does not immunize TikTok's knowing distribution and targeted recommendation of such harmful videos. Judge Matey argued that the ordinary meaning of Section 230, informed by its historical context in common carrier law and the early internet defamation cases (contrasting Cubby, Inc. v. CompuServe Inc. with Stratton Oakmont, Inc. v. Prodigy Services Co.), was intended to immunize platforms from 'publisher liability,' that is, strict liability for merely hosting third-party content. Section 230 was not intended, however, to shield platforms from 'distributor liability,' which arises when a platform knows content is harmful yet continues to distribute or actively recommend it. Judge Matey criticized Zeran v. America Online, Inc. for wrongly conflating publisher and distributor liability, arguing that the statute permits suits based on a provider's 'own conduct' beyond mere hosting. Claims based on TikTok's knowing distribution and targeted recommendation of videos it knew could be harmful should therefore proceed, as they target TikTok's own conduct, not merely information 'provided by another.'
Analysis:
This decision significantly narrows the scope of Section 230 immunity for social media platforms, particularly concerning their algorithmic recommendation systems. By classifying algorithmic curation as a platform's 'first-party speech,' the Third Circuit creates an important distinction that could expose platforms to liability for harms resulting from their active promotion of content, even if that content was originally created by a third party. This ruling aligns with a growing judicial trend to re-evaluate the broad protections afforded by Section 230 and may compel social media companies to re-examine the design and responsibility of their recommendation algorithms, potentially leading to increased accountability for user safety.
