Articles & Publications 10.29.24

TikTok, The “Blackout Challenge,” and the New Limits on Section 230 Immunity

In a recent landmark decision, the U.S. Court of Appeals for the Third Circuit ruled that TikTok could be held liable in a wrongful death suit stemming from the “Blackout Challenge” — a disturbing trend promoted through the platform’s algorithm. This decision in Anderson v. TikTok, Inc. and ByteDance, Inc. could redefine the boundaries of Section 230 of the Communications Decency Act (CDA), which has long shielded internet platforms from liability related to third-party content. Now, the Third Circuit’s reasoning may open the door for the U.S. Supreme Court to revisit Section 230, possibly ending the broad immunity social media platforms currently enjoy.

The facts behind Anderson are tragic. A ten-year-old girl died after attempting the “Blackout Challenge,” a dangerous trend that appeared on her “For You” page on TikTok. The challenge, which involved self-asphyxiation, encouraged participants to record and share their attempts online. TikTok’s algorithm specifically curated this content for her, displaying it on a page designed to recommend engaging videos. Her estate sued TikTok, arguing that the algorithm’s design and content recommendations contributed to her death. According to the plaintiffs, TikTok’s recommendations functioned as the platform’s own “speech,” making TikTok responsible for promoting the content, even if it didn’t create it.

Previously, the case was dismissed based on Section 230 immunity, a protection Congress introduced to shield interactive computer services from being held liable for third-party content. The goal was to avoid hindering the development of the then-nascent internet industry. However, the Third Circuit disagreed with the dismissal.

The court held that TikTok is not merely hosting third-party content but actively recommending specific content to users, transforming it from a passive platform to one engaging in “expressive conduct.” TikTok’s algorithm, according to the court, isn’t simply displaying content but making targeted recommendations that can influence user behavior — including a ten-year-old child’s decision to try the “Blackout Challenge.” This shift marks a critical distinction: while Section 230 protects platforms from liability for hosting third-party content, it may not extend to the algorithms they use to curate and promote that content.

In a strongly worded concurrence, one judge argued that platforms like TikTok cannot claim immunity while their algorithms push harmful content to vulnerable users, like children. According to the concurrence, the original purpose of Section 230 was never to permit “casual indifference” to severe harm resulting from these recommendations. The opinion goes on to criticize the misuse of Section 230, calling out the “cauldron” of unregulated content being promoted to at-risk audiences with “no oversight, no accountability, no remedy.”

For businesses, particularly those operating internet platforms or using recommendation algorithms, the Anderson decision underscores the importance of reviewing content moderation, algorithmic design, and terms of service. The decision suggests that platforms could now face liability not only for hosting harmful content but for the way their algorithms promote it. This shift is especially relevant for companies whose content recommendations could reach vulnerable audiences, such as children or those prone to influence. With Section 230 immunity potentially eroding, companies may need to adopt proactive measures — from content monitoring to refining algorithms — to limit exposure to liability.

The Anderson decision may signify a pivotal shift as courts begin re-evaluating Section 230. As the internet has evolved, so has the question of liability for content recommendations — especially when platforms play an active role in promoting specific content. For TikTok and similar platforms, Anderson signals a possible future in which algorithms are no longer insulated by Section 230, opening a new chapter in the ongoing debate over platform accountability.