Dangerous Social Media Trends: Can Social Media Platforms be Held Liable? 


From the “subway surfing” trend to the “Tide Pod Challenge,” dangerous social media challenges have a way of attracting young audiences. According to the American Psychological Association, “adolescence is a period of heightened susceptibility to peer influence, impressionability, and sensitivity to social rejection,” making children a prime target for social media trends.[1] But what happens when a child is harmed – or even dies – from attempting one of these challenges? Who should be held liable? This article examines the changing landscape of Section 230 in light of Anderson v. TikTok and explores how social media platforms’ potential liability in these actions is shifting.

History of Section 230 

Section 230 of the Communications Decency Act[2] is often referred to as “the 26 words that created the internet.”[3] The law protects online service providers from liability for content posted by users of their services.[4] This means that, unlike traditional publishers, social media platforms are generally not liable for user-generated content. Before Section 230, courts treated platforms that hosted harmful third-party content as publishers of that content if they attempted to moderate and remove it.[5] In other words, publisher liability meant a social media platform was treated as the creator of harmful content, even though a user had created and posted it.[6] Platforms could also be held liable as distributors, meaning that if they knew or had reason to know that content posted on their platform was harmful, they bore responsibility for it.[7]

This presented a lose-lose situation for online platforms – if they chose to moderate harmful third-party content, they could be held liable as publishers, and if they knew or should have known about harmful content on their platform, they could be held liable as distributors of that content.[8] Imposing both publisher and distributor liability on online platforms placed an impossible burden on them to either over- or under-regulate third-party content.[9] Section 230 changed this by letting online platforms host and moderate content with far less risk of liability.

Anderson v. TikTok, Inc. 

Anderson v. TikTok illustrates the substantial harm that can result from participating in social media challenges.[10] In Anderson, ten-year-old Nylah Anderson accidentally asphyxiated herself while attempting the Blackout Challenge,[11] a challenge that encouraged users to choke themselves with belts, purse strings, or similar items until they passed out.[12] The Blackout Challenge was viral on TikTok at the time, and Nylah discovered it through her For You Page (“FYP”).[13] TikTok’s FYP uses information such as age, online interactions, and other metadata to tailor content to each user. Tawainna Anderson, Nylah’s mother, brought a claim against TikTok for causing the death of her daughter. She alleged that TikTok was aware of the Blackout Challenge; that TikTok allowed users to post videos of themselves participating in the Blackout Challenge; and that TikTok’s algorithm recommended and promoted Blackout Challenge videos to minors’ FYPs, including at least one such video to Nylah’s FYP, resulting in her death.[14] The United States District Court for the Eastern District of Pennsylvania held that Anderson’s claims were barred by Section 230 because the claims were “inextricably linked to the manner in which the defendants chose to publish third-party user content through screening, arrangement, promotion, and distribution of content.”[15]

The case was appealed, and the United States Court of Appeals for the Third Circuit held that TikTok’s FYP algorithm is TikTok’s own “expressive activity.”[16] Because the algorithm is the company’s own speech, Section 230 – which protects a passive publisher’s republication of third-party material – would not necessarily shield the platform from the plaintiff’s claims.[17] The Third Circuit relied on the Supreme Court’s decision in Moody v. NetChoice, LLC, which held that “a platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’”[18] and is therefore protected by the First Amendment.[19] The Third Circuit noted that “had Nylah viewed the Blackout Challenge video through TikTok’s search function,” in contrast to viewing it “through her automated FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.”[20]

What Does the Future of Social Media Look Like? 

The Third Circuit’s decision offers further evidence that the future of Section 230 is up in the air. As technology evolves and algorithms become more targeted, social media platforms need to regulate content, especially the promotion of harmful content. It is clearly not enough for online platforms to self-regulate through their own content moderation systems and terms of service: even with safeguards like age restrictions and content warnings, ten-year-old Nylah Anderson died attempting a TikTok challenge.

The 117th and 118th Congresses introduced legislation addressing content moderation practices, including bills that would amend or repeal Section 230 to address the publication, removal, or display order of content.[21] Other bills would impose content moderation requirements mandating that certain content be published or removed, which could raise First Amendment concerns.[22] Striking a balance between public safety and free speech has proven to be a challenging task for legislators, as the debate raises many questions about publisher liability and the viability of changing platform guidelines. Hopefully, new regulation in the near future will reduce users’ exposure to harmful content and better protect children like Nylah Anderson.

By: Kashish Shamsi, Class of 2026

[1] Am. Psychological Ass’n, Youth Social Media and Internet Use in 2024, https://www.apa.org/topics/social-media-internet/youth-social-media-2024 (last visited Nov. 3, 2024).

[2] 47 U.S.C. § 230 (2020).

[3] Ariel Silverbreit & Jonathan Mollod, A Final Bow for Section 230: Latest Plea for Reform Calls for Sunset of Immunity Law, https://newmedialaw.proskauer.com/2024/06/11/a-final-bow-for-section-230-latest-plea-for-reform-calls-for-sunset-of-immunity-law/ (June 11, 2024).

[4] 47 U.S.C. § 230 (2020).

[5] U.S. Dep’t of Justice, Section 230 Review, https://www.justice.gov/archives/ag/department-justice-s-review-section-230-communications-decency-act-1996 (Oct. 20, 2024).

[6] Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995).

[7] Cubby, Inc. v. CompuServe Inc., 776 F. Supp. 135 (S.D.N.Y. 1991).

[8] U.S. Dep’t of Justice, supra note 5.

[9] Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).

[10] Anderson v. TikTok, Inc., 116 F.4th 180, 181 (3d Cir. 2024).

[11] Id.

[12] Id. at 182.

[13] Id.

[14] Id.

[15] Anderson v. TikTok, Inc., 637 F. Supp. 3d 276, 281 (E.D. Pa. 2022).

[16] Anderson v. TikTok, Inc., 116 F.4th 180, 184 (3d Cir. 2024).

[17] Id.

[18] Moody v. NetChoice, LLC, 144 S. Ct. 2383, 2394 (2024).

[19] Id.

[20] Anderson v. TikTok, Inc., 116 F.4th 180, 194 (3d Cir. 2024).

[21] Clare Y. Cho et al., Defining and Regulating Online Platforms (Cong. Research Serv. Aug. 25, 2023).

[22] Id.
