A new study claims that Instagram’s tools designed to protect teenagers from harmful content are failing to prevent them from seeing posts about suicide and self-harm. The research, conducted by a coalition of child safety groups and cyber researchers, also said the Meta-owned platform encouraged children “to post content that received highly sexualised comments from adults.” The study found that 30 of Instagram’s 47 safety tools for teens were “substantially ineffective or no longer exist.” The investigation was carried out by the U.S. research center Cybersecurity for Democracy, with experts including whistleblower Arturo Béjar, on behalf of child safety groups including the Molly Rose Foundation.
The Molly Rose Foundation was set up after the death of Molly Russell, who took her own life at the age of 14 in 2017; a coroner later concluded that she died while suffering from the “negative effects of online content.” The researchers said they found significant issues with the tools after setting up fake teen accounts for their investigation. Only eight of the 47 safety tools they analyzed were working effectively, meaning teens were still being shown content that violated Instagram’s own rules for young people, including posts describing “demeaning sexual acts” and autocompleted search terms that promoted suicide, self-harm, or eating disorders.
Disputed Findings and Meta’s Response
Meta has disputed the study’s findings. A company spokesperson told the BBC that the report “misrepresents” its safety tools and that the company’s protections have led to teens seeing less harmful content on Instagram. “Teen Accounts lead the industry because they provide automatic safety protections and straightforward parental controls,” the spokesperson said. Meta introduced teen accounts in 2024 with the promise of stronger protections and more parental oversight, and expanded the feature to Facebook and Messenger in 2025.
Andy Burrows, chief executive of the Molly Rose Foundation, said the findings suggested Meta’s teen accounts were “a PR-driven performative stunt rather than a clear and concerted attempt to fix long-running safety risks.” However, Meta told the BBC that the research fails to understand how its content settings for teens work. “The reality is teens who were placed into these protections saw less sensitive content, experienced less unwanted contact, and spent less time on Instagram at night,” a spokesperson said.
A Call for Stricter Regulation
A government spokesperson told the BBC that under the new Online Safety Act, platforms are now legally required to protect young people from damaging content, including material promoting self-harm or suicide, and they can “no longer look the other way.” Dr. Laura Edelson, co-director of Cybersecurity for Democracy, the research center behind the report, said that “these tools have a long way to go before they are fit for purpose.” The findings have renewed debate about the effectiveness of self-regulation by social media firms, with critics arguing that the industry’s focus on engagement and profit often comes at the expense of user safety.
Author’s Opinion
This report highlights a fundamental conflict between a platform’s profit-driven algorithm and its stated commitment to user safety. No matter how many safety tools are implemented, the core algorithm’s job is to maximize engagement, which often leads to the amplification of sensational or harmful content. This creates a no-win situation where regulators and child safety advocates will always be playing catch-up. While the Online Safety Act is a necessary step, its true test will be in its ability to force platforms to fundamentally change their underlying systems, not just add a layer of ineffective “safety” tools on top. The findings confirm that for companies, the financial incentive to collect data often outweighs the ethical obligation to protect the most vulnerable users, a tension that will continue to fuel regulatory battles for the foreseeable future.
Featured image credit: ROBIN WORRALL via Unsplash