
Concerns over child safety on Roblox have intensified after an independent developer said existing safeguards, including new age verification checks, do not adequately protect young users, many of whom are under 13.
The platform remains widely used. In the UK, it ranks as the most popular gaming service among children aged eight to 12. Globally, it averaged more than 80 million daily players in 2024, with roughly 40% of users below the age of 13.
Developer Raises Concerns After Safety Interview
In an interview with BBC Radio 5 Live, an independent developer identified as “Sam” said parents should monitor children continuously while they use the platform. If constant supervision is not possible, he said children should not be allowed to play. Sam contacted the broadcaster after hearing an earlier segment featuring Matt Kaufman, who described the platform’s safety systems, including mandatory age verification checks introduced in the UK in January 2026.
Sam, who develops games on a contract basis for Roblox and also volunteers with an online safety non-profit, said his observations differ from the company’s public statements. He said he had seen cases where users were encouraged to interact inappropriately with strangers. He also described reports of individuals attempting to move conversations off-platform, which violates Roblox rules.
Platform Structure And Content Moderation Challenges
The platform allows users to create and share games in an open, interactive environment. Creators assign descriptions and content maturity labels, which determine age suitability. Revenue can come from advertising, paid access, or participation in Roblox’s creator programme, where developers receive payments based on engagement or new user activity.
Critics argue that these systems do not sufficiently limit exposure to harmful material. Allegations include the presence of inappropriate or violent user-generated content. Sam said he had encountered games depicting school shootings, including Sandy Hook and Columbine, as well as recreations of locations linked to criminal cases. He added that reports submitted through internal channels are not always acted upon, estimating that only around 30% are accepted.
Roblox Response And Safety Measures
Roblox responded in a statement, saying safety remains a priority. The company said it uses advanced safeguards and filters to reduce harmful content and communication. It added that its age verification system is certified by independent experts and restricts children to interacting with users of similar ages by default. The company also said it monitors behaviour continuously and prompts users to verify their age again if actions appear inconsistent with their stated age.
Ongoing Scrutiny And Policy Developments
The issue has drawn attention before. In March 2025, Roblox chief executive Dave Baszucki told BBC News that the company takes extensive steps to protect younger users. He also advised parents to rely on their own judgement when deciding whether their children should use the platform.
Additional safety changes have since been introduced, including expanded age verification globally and restrictions preventing children from chatting with adults. However, regulatory responses vary across countries. Russia and Turkey have banned the platform, citing child safety concerns. Indonesia has included Roblox in a ban for users under 16, set to take effect on 28 March.
In Australia, a broader social media ban for under-16s does not currently include Roblox, though some groups have called for its inclusion. In the UK, the government has begun consultations on potential restrictions covering children’s access to social media, along with measures such as time limits and curfews. Trials of these measures were announced on Wednesday.
