
Apple has begun rolling out age verification checks for iPhone and iPad users in the UK, requiring users to confirm they are adults before accessing certain services, including apps restricted to those aged 18 and over.
After installing the iOS 26.4 update, users receive a prompt asking them to verify their age. According to an Apple support page, verification methods include submitting credit card details or scanning an official ID. Users who do not confirm their age, or who are identified as underage, will have web content filters enabled automatically.
Verification Process And Device Restrictions
Upon updating their devices, users encounter a message stating: “UK law requires you to confirm you are an adult to change content restrictions.” Apple can also rely on existing account information, such as stored payment methods or how long an account has been active, to help determine a user’s age. Children under 13 cannot create an account without a guardian’s approval.
The introduction of these checks extends existing safeguards, though UK regulations do not explicitly require device-level age verification. In 2025, new provisions under the Online Safety Act required platforms and websites to strengthen protections for children, particularly those hosting adult content.
Regulatory Support And Legal Context
Ofcom described the update as a “real win for children and families.” A spokesperson said the regulator had worked with Apple and other services to ensure safety rules can apply across different systems and environments.
Existing UK laws already require certain websites, including those hosting pornography, to implement age verification measures. These requirements have prompted debate over how user data is collected and protected.
Privacy Concerns And Criticism
Opposition to the rollout has emerged from privacy advocates. Silkie Carlo, director of campaign group Big Brother Watch, said the update restricts how people access online content. She described the system as placing a “chokehold” on internet use and compared the software changes to coercive practices, arguing that users are effectively forced to comply or face restricted device functionality.
Carlo said the measures require users to provide sensitive personal information, including identification documents or payment details, and warned that such data could be vulnerable to breaches. She added that while child safety is important, it should not rely on broad data collection requirements imposed by large technology companies.
Wider Industry Debate And Government Trials
The rollout comes amid ongoing discussions within the technology sector about limiting children’s exposure to harmful online material and managing social media use. The UK government is conducting a trial involving 300 teenagers, testing different restrictions on social media access. These include full app blocks, overnight restrictions, and daily usage limits of one hour, while a control group experiences no changes.
The trial runs alongside a public consultation examining whether the UK should introduce restrictions similar to those proposed in Australia, where policymakers have considered making it illegal for individuals under 16 to access certain social media platforms.
Featured image credits: Freerange Stock
