DMR News

Advancing Digital Conversations

Google Partners with UK Nonprofit to Detect and Remove Nonconsensual Images

By Dayne Lee

Sep 22, 2025


Google is partnering with the U.K. nonprofit StopNCII to bolster its efforts to combat the spread of nonconsensual intimate images, also known as revenge porn. The search giant will begin using StopNCII's hashes, which are unique digital fingerprints of images and videos, to proactively identify and remove this content from Search.

StopNCII helps adults prevent their private images from being shared online by creating a unique identifier, or hash, for each image directly on the user's device. These hashes are then provided to partner platforms such as Facebook, allowing them to automatically identify and remove matching content. The intimate imagery itself never leaves the user's device; only the hash is uploaded to StopNCII's system. In a blog post, Google said it had heard from survivors and advocates that more needs to be done to reduce the burden on those affected by this content.
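The hash-and-match workflow described above can be sketched in a few lines of Python. This is a deliberate simplification: StopNCII and its partners use perceptual hashing, which is designed so that resized or re-encoded copies of an image still match, whereas the sketch below uses a plain cryptographic hash purely to illustrate the flow. The function names and example byte strings are invented for illustration and do not reflect any real API.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    """Compute a digest locally on the user's device.

    Only this hex string would ever be uploaded; the image
    bytes themselves never leave the device.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Platform side: a set of hashes shared by the clearinghouse.
# (Hypothetical example bytes standing in for a private image.)
blocked_hashes = {hash_image(b"example-private-image-bytes")}

def should_remove(candidate_bytes: bytes) -> bool:
    """Check uploaded or indexed content against the shared hash list."""
    return hash_image(candidate_bytes) in blocked_hashes
```

A cryptographic hash like SHA-256 only matches byte-identical files, which is why real deployments rely on perceptual hashes instead; the division of labor, however, is the same: hashing happens client-side, and platforms compare hashes rather than images.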

Joining a Growing List of Partners

Google's partnership with StopNCII comes about a year after Microsoft integrated the tool into its Bing search engine, making Google a later adopter than its rival. Other companies that have partnered with StopNCII to combat this type of content include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, and X.

The new partnership with the nonprofit marks Google’s latest move in its efforts to combat nonconsensual intimate images. Last year, the company made it easier for people to request the removal of deepfake nonconsensual intimate images from Search and also implemented changes to make them harder to find. The company has a long-standing commitment to improving search rankings to reduce the visibility of this type of content as well.

What The Author Thinks

This partnership, while a positive and necessary step, highlights the reactive nature of content moderation in the tech industry. Google’s slow adoption of this technology, a full year after a major competitor like Microsoft, shows that these large platforms often wait for public pressure or for a competitor to move first before implementing critical safety features. As AI makes it easier to create and spread nonconsensual intimate images and videos, a proactive system like StopNCII is no longer a luxury but an essential tool for platforms to demonstrate genuine responsibility. This partnership is a necessary move towards a more responsible digital ecosystem, where the burden of preventing this content is shifted away from the victim and onto the platforms themselves.


Featured image credit: ray_explores via Flickr


Dayne Lee

With a foundation in financial day trading, I transitioned to my current role as an editor, where I prioritize accuracy and reader engagement in our content. I excel in collaborating with writers to ensure top-quality news coverage. This shift from finance to journalism has been both challenging and rewarding, driving my commitment to editorial excellence.
