Google partners with UK nonprofit to detect and remove nonconsensual intimate images from Search

Google is partnering with the U.K. nonprofit StopNCII to bolster its efforts to combat the spread of nonconsensual intimate images, also known as revenge porn. The search giant will begin using StopNCII’s hashes, which are digital fingerprints of images and videos, to proactively identify and remove nonconsensual intimate imagery on Search.

StopNCII helps adults prevent their private images from being shared online by creating a unique identifier, or hash, that represents their intimate imagery. These hashes are then shared with partner platforms such as Facebook, which can automatically identify and remove matching content from their services. Notably, the private imagery itself never leaves the person’s device; only the hash is uploaded to StopNCII’s system.
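As a rough illustration of how this kind of hash-based matching can work, here is a minimal Python sketch. It is not StopNCII’s actual implementation: real systems of this kind typically use perceptual hashing so that resized or re-encoded copies still match, and the file names and hash list below are purely hypothetical. The privacy property it demonstrates is the one described above: only the hash leaves the device, never the image.

```python
import hashlib

def hash_image_locally(image_path: str) -> str:
    """Compute a fingerprint of an image on the user's own device.

    Simplified stand-in: a SHA-256 digest of the file bytes. Production
    systems generally use perceptual hashes so near-duplicates still match,
    but either way only the hash is uploaded, never the image itself.
    """
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical list of hashes a platform might receive from the nonprofit.
blocked_hashes = {
    hash_image_locally("my_private_photo.jpg"),
}

def should_remove(uploaded_image_path: str) -> bool:
    """Platform-side check: hash newly seen content and compare it
    against the shared hash list, flagging matches for removal."""
    return hash_image_locally(uploaded_image_path) in blocked_hashes
```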

Google stated that its existing tools allow people to request the removal of nonconsensual intimate images from Search, and that it has continued to launch ranking improvements to reduce the visibility of this type of content. The company also noted that it has heard from survivors and advocates that, given the scale of the open web, more must be done to reduce the burden on those affected.

Google has been slow to adopt StopNCII’s system, as its partnership with the nonprofit comes a year after Microsoft integrated the tool into Bing. Other companies that have already partnered with StopNCII include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, and X.

The search giant’s partnership with the nonprofit marks its latest move to combat nonconsensual intimate images. Last year, Google made it easier to remove deepfake nonconsensual intimate images from Search and made them harder to find.