In a tragic case, Nylah Anderson's family sued TikTok after the 10-year-old died in December 2021 while attempting the "blackout challenge," which the platform's algorithm had promoted to her. The lawsuit, which alleged negligence and product liability, challenged the protections of Section 230, the law that shields platforms like TikTok from liability for user-generated content. A Pennsylvania court initially ruled in TikTok's favor, but the U.S. Court of Appeals for the Third Circuit later disagreed, holding that TikTok's algorithmic recommendations amounted to "first-party speech" not shielded by Section 230 — a ruling that allowed the family's claims against the platform to proceed.
Carla Varriale-Barker notes how the case highlights the significance of testing the limits of Section 230 immunity: "It is unfortunate that it took a young person's tragic death to lead to a reexamination of the complexities of Section 230 and the immunities it bestows upon publishers of third-party content. The debate over the robustness of Section 230's applicability will, and should, continue — not only to prevent further needless losses of human life, but also to expand the conversation about the intersection of free speech and legality in America." Read more here (subscription required).