
Meta has begun removing Australian users under the age of 16 from Instagram, Facebook, and Threads ahead of a nationwide social media ban that comes into force on 10 December, placing hundreds of thousands of teen accounts at risk of deactivation.
Account removals begin ahead of national ban
Meta said last month that it had started notifying users aged 13 to 15 that their accounts would begin shutting down from 4 December, about a week before Australia’s new law takes effect.
Authorities estimate that about 150,000 Facebook accounts and 350,000 Instagram accounts will be affected. Threads, which can only be accessed through an Instagram account, is also impacted.
From 10 December, social media companies face fines of up to A$49.5m (US$33m, £25m) if they fail to take what the government calls “reasonable steps” to prevent under-16s from holding accounts.
Meta outlines compliance position
A Meta spokesperson told the BBC on Thursday that “compliance with the law will be an ongoing and multi-layered process.”
The spokesperson added that Meta believes a more standardised and privacy-focused system is needed, arguing that app stores should be required to verify user age at download and seek parental approval for users under 16. This, Meta said, would remove the need for young users to verify their age separately across multiple services.
Meta also said that users identified as under 16 are allowed to download and save their posts, videos, and messages before their accounts are deactivated.
Teens who believe they were wrongly classified may request a review. They can submit a video selfie for age verification or provide a driver’s licence or other government-issued identification.
Platforms covered by the ban
Alongside Instagram, Facebook, and Threads, the ban applies to YouTube, X, TikTok, Snapchat, Reddit, Kick, and Twitch.
YouTube, which was initially excluded from the ban before being added, said the law was “rushed.” The company warned that banning under-16s from holding accounts, which currently come with parental controls, could make the platform “less safe.”
Government response and policy aims
The Australian government says the ban is intended to reduce children’s exposure to harmful content and online risks. Critics have argued the policy could isolate some young people who rely on social platforms for social connection and drive them toward less regulated parts of the internet.
Communications Minister Anika Wells said on Wednesday that early implementation issues were expected, but stressed that the measure was about protecting younger users.
She said the law was designed to protect “Generation Alpha,” referring to those under 15, and future generations. Wells described children as being drawn into a “dopamine drip” once they gain access to smartphones and social media, and said social media algorithms are designed to be habit-forming.
Research findings on youth social media use
The government commissioned a study earlier this year that found 96% of Australian children aged 10 to 15 use social media. Seven out of 10 reported being exposed to harmful content, including misogynistic material, violence, and content linked to eating disorders and suicide.
The study also found that one in seven children had experienced grooming-type behaviour from adults or older children. More than half said they had been victims of cyberbullying.
Australia’s ban, the first of its kind at a national level, is being closely observed by governments in other countries.
