
The UK government said it will ban so-called nudification apps that use artificial intelligence to digitally remove clothing from images, as part of a broader strategy aimed at reducing violence against women and girls. The announcement was made Thursday alongside new legislative plans that would make it illegal to create or supply AI tools designed to generate non-consensual sexually explicit imagery.
New Offences Target AI Image Manipulation
Under the proposed laws, the creation and distribution of nudification or “de-clothing” apps would become a criminal offence. These tools use generative AI to produce realistic images or videos that appear to show individuals without clothing, even though no genuine image of that kind exists.
The government said the new offences would build on existing laws that already criminalize the creation of sexually explicit deepfake images without consent. Such acts are currently covered under the Online Safety Act.
Technology Secretary Liz Kendall said the measures are intended to prevent technology from being used to abuse, humiliate, or exploit women and girls. She said the new offence would ensure that those who profit from or enable nudification apps face legal consequences.
Concerns Over Harm And Child Abuse Risks
Experts and child protection groups have raised repeated concerns about the growth of nudification apps and their potential to cause serious harm. These tools have been used to generate fake nude images of both adults and children, including material that may constitute child sexual abuse imagery.
In April, the Children’s Commissioner for England, Dame Rachel de Souza, called for a full ban on nudification apps. In her report, she said that while creating such images is already illegal, the technology that enables their creation should also be prohibited.
Collaboration With Tech Firms
The government said it plans to work with technology companies to develop methods to combat intimate image abuse. This includes continued collaboration with UK-based safety technology firm SafeToNet.
SafeToNet has developed AI software it says can identify and block sexual content and prevent cameras from operating when sexual imagery is detected. The government said this technology builds on existing tools used by platforms such as Meta, which already deploy filters to detect and flag potential nudity, often to prevent children from sharing intimate images.
Industry And Charity Responses
Child protection charities have previously urged stronger action against nudification technology. The Internet Watch Foundation said 19 percent of under-18 users who contacted its Report Remove helpline confirmed that some or all of their explicit imagery had been manipulated.
Kerry Smith, chief executive of the IWF, welcomed the proposed ban, saying nudification apps have no legitimate purpose and increase the risk of harm to children. She said the imagery produced by such tools is often circulated in illicit online spaces.
The NSPCC also supported the move but said it was disappointed that the proposals did not include mandatory device-level protections. The charity has called for stronger requirements on technology companies to detect and prevent the spread of child sexual abuse material, including in private messaging services.
Further Measures On Child Protection
The government said it also plans to make it impossible for children to take, share, or view nude images on their phones. In addition, it is seeking to outlaw AI tools specifically designed to create or distribute child sexual abuse material.
The new measures are part of a wider government strategy aimed at addressing online abuse and reducing violence against women and girls.
