A group of whistleblowers from Meta has accused the company of covering up internal research that showed harmful effects of its virtual reality products on children. At the center of the claims is the allegation that Meta's lawyers intervened to shape or suppress findings revealing risks such as grooming, harassment, and exposure to violence on its VR platforms.
Allegations of Manipulated Research
Jason Sattizahn, a researcher who worked on Meta’s VR safety studies, said the company was fully aware that underage users were active on its platforms. He alleged that Meta prioritized profits over safety, deliberately erasing or altering data that reflected poorly on the company.
Together with five other current or former employees, Sattizahn has provided documents to Congress. He and Cayce Savage, Meta’s lead researcher on youth VR user experiences, are scheduled to testify before the U.S. Senate Judiciary Subcommittee on Privacy, Technology, and the Law.
Meta spokesperson Dani Lever said the company has supported 190 research projects through Reality Labs since 2022, including those focused on youth well-being. Lever rejected the whistleblowers’ claims, saying the selected examples were being “used to fit a predetermined and false narrative.”
Deleted Evidence and Internal Warnings
The whistleblowers allege that Meta deleted records of harassment cases involving minors to protect its reputation. In one instance, Sattizahn said his supervisor ordered him not to collect data that could expose the company to regulatory scrutiny. In another, a teenager reported that his younger sibling had been repeatedly harassed through Meta's VR devices, yet the official report downplayed the incident by omitting any direct reference to sexual harassment.
Documents also reveal that employees repeatedly warned about younger children bypassing age restrictions, raising further concerns that Meta failed to address known risks.
Author’s Opinion
The allegations against Meta are troubling but not surprising. Large tech firms often treat safety research as a liability rather than a responsibility, especially when it threatens engagement and revenue. If these claims are accurate, Meta prioritized growth over safeguarding children, erasing data that should have driven stronger protections. This is another example of how corporate profit motives can clash with public safety — and why external oversight is essential.