A coalition of parents and lawyers is preparing multiple lawsuits against Roblox, the gaming platform popular with children, following a lawsuit accusing the site of failing to protect children from sexual exploitation.
Last week, Louisiana Attorney General Liz Murrill filed a lawsuit alleging that Roblox “knowingly and intentionally” failed to implement proper safety measures to shield minors from predatory behavior and child sexual abuse material (CSAM). Roblox pushed back in an official statement, saying: “We dedicate vast resources to supporting a safe infrastructure including advanced technology and 24/7 human moderation, to detect and prevent inappropriate content and behavior — not only because it’s important to us but because it is such a critical issue and so important to our community.”
Parents and Law Firms Taking Action
Dolman Law Group is filing a series of lawsuits on behalf of parents and their underage children, and has already submitted five complaints. One of the cases, filed in the Northern District of California, accuses Roblox of enabling sexually exploitative content through its moderation choices, including offering suggestive avatar customizations and failing to block usernames containing hidden pedophilic phrases.
A representative from Dolman Law Group said they are investigating around 300 additional allegations of sexual exploitation on the platform. They also noted that the majority of complaints involve users under 16, with many cases involving young girls. At least seven other law firms are reportedly pursuing similar investigations, some of which also extend to platforms like Discord.
Spotlight on AI Moderation System
Critics have focused on Roblox’s new AI-powered moderation system, Sentinel, which was introduced to detect risks such as grooming and child endangerment in real time. Roblox reported that in the first half of 2025, Sentinel flagged around 1,200 potential cases of child exploitation, which were forwarded to the National Center for Missing and Exploited Children (NCMEC).
Despite these numbers, some legal representatives argue that the scale of cases slipping through shows Sentinel is far from adequate.
This is not the first time Roblox has faced scrutiny. In 2023, a group of parents filed a class action lawsuit accusing the company of “negligent misrepresentation and false advertising,” claiming the platform falsely promoted itself as safe for children despite poor moderation.
Other lawsuits have challenged Roblox’s in-game purchasing system, Robux, comparing it to “illegal child gambling.”
In response to mounting legal pressure, Roblox has rolled out stricter safety protocols in recent years, including parental monitoring tools, tighter chat restrictions, and age verification for teen users.
What The Author Thinks
While Roblox has made progress with AI moderation and stricter safeguards, the sheer number of cases coming forward shows the system is far from foolproof. A platform serving millions of children should set the gold standard for safety, not play catch-up after litigation. These cases may finally force Roblox to confront what parents have been saying for years — that a platform built for kids needs to prioritize safety over engagement and profits.
Featured image credit: Oberon Copeland via Unsplash