
Meta said it has started using artificial intelligence systems to analyze photos and videos for visual indicators that could suggest a Facebook or Instagram user is under 13, as the company expands its enforcement efforts against underage accounts across its platforms.
The company said the system examines “general themes and visual cues,” including factors such as a person’s height or bone structure, to estimate a user’s age. Meta stated that the technology is not facial recognition and does not identify a specific individual in an image. Instead, the company said the AI combines visual analysis with text-based and behavioral signals gathered from profiles, posts, comments, captions, bios, and user interactions to determine whether an account may belong to someone below the minimum age requirement.
“We want to be clear: this is not facial recognition,” Meta said in a blog post announcing the changes. The company added that combining visual insights with text and interaction analysis allows it to “significantly increase” the number of underage accounts it can identify and remove.
Meta Plans Wider Rollout Across Platforms
Meta said the visual analysis system is currently active in select countries, with a wider rollout planned for a later date. The company also said it intends to expand the technology into additional areas of its services, including Instagram Live and Facebook Groups.
If Meta’s systems determine that a user could be underage, the company said it will deactivate the account. Users would then need to complete Meta’s age verification process to restore access and avoid permanent deletion.
Child Safety Scrutiny Continues
The announcement follows mounting scrutiny over child safety on Meta’s platforms. Weeks earlier, a jury in New Mexico ordered the company to pay $375 million in civil penalties after finding that Meta had misled consumers about platform safety and exposed children to risks. The ruling also required Meta to implement fundamental platform changes. Following the decision, Meta threatened to shut down its social media services in the state.
The New Mexico case is one of several legal actions facing Meta and other major technology companies over child protection measures and online safety practices involving minors.
Teen Account Restrictions Expand
Meta also announced new geographic expansions for its “Teen Accounts” system on Instagram. The company said it is extending the feature to 27 European Union countries and Brazil. Teen Accounts place younger users into stricter default settings designed to limit unwanted interactions and harmful content exposure.
Under these settings, teens can receive direct messages only from people they already follow or are connected with. Harmful comments are hidden automatically, while accounts are set to private by default.
The company added that it is now bringing the same protections to Facebook in the United States for the first time. Meta said the feature will expand to the United Kingdom and the European Union in June.
