A U.S. appeals court has revived a lawsuit against TikTok brought by the mother of a 10-year-old girl who died after attempting a dangerous viral challenge known as the “blackout challenge.” The challenge, which involves participants choking themselves until they lose consciousness, was allegedly shown to the child by TikTok’s algorithm on her “For You” feed. The decision cuts against the broad legal protections internet companies typically enjoy under federal law.
The mother, Tawainna Anderson, claims that TikTok’s recommendation algorithm exposed her daughter, Nylah Anderson, to the harmful challenge in 2021.
According to the lawsuit, TikTok’s algorithm selected the video for Nylah based on her likely interests, despite reports of other children having died while attempting the same challenge. Nylah was found unresponsive in her home in Chester, Pennsylvania, after attempting the challenge using a purse strap. Despite her mother’s attempts to resuscitate her, Nylah died five days later.
Nylah, described by her family as a joyful child, left behind a mother determined to hold TikTok accountable for the content it promoted.
Court Challenges Section 230 Protections
A lower court originally dismissed the lawsuit, citing Section 230 of the Communications Decency Act of 1996, a law that typically protects internet platforms from being held responsible for content posted by users. However, the Philadelphia-based 3rd U.S. Circuit Court of Appeals ruled that this legal protection does not extend to TikTok’s algorithmic recommendations.
Judge Patty Shwartz, writing for the three-judge panel, stated that TikTok’s choices to promote certain content to specific users amount to the company’s own speech, not just third-party content. This decision indicates that Section 230 does not cover the actions of TikTok’s algorithm, which actively recommends content to its users.
Judicial Concerns Over Safety and Profit
Judge Paul Matey, in a partially concurring opinion, criticized TikTok for prioritizing profits over user safety, suggesting the company knowingly exposed children to potentially harmful content to maximize engagement.
Matey argued that while Nylah likely did not understand the life-threatening risk, TikTok’s algorithm specifically targeted content to her, knowing she would watch it. This action, he said, cannot be shielded by the immunity typically provided by Section 230.
The decision to revive the lawsuit sends the case back to the lower court for further proceedings. Jeffrey Goodman, the lawyer representing Nylah’s mother, argued that this ruling could prompt more rigorous examination of Section 230 protections, especially as technology and social media increasingly impact people’s lives. Goodman emphasized that the ruling highlights the limits of legal protections for social media companies, aiming to prevent other families from experiencing similar tragedies.
TikTok and its parent company, ByteDance, have not commented on the court’s decision. In related news, The Information reported that Nicole Iacopetti, TikTok’s head of content strategy and policy, will leave the company on September 6, 2024.