The Fight Against Synthetic Nude Images: Microsoft Partners with StopNCII to Combat Revenge Porn
With the rise of generative AI tools, a troubling issue has emerged on the internet: the proliferation of synthetic nude images that closely resemble real individuals. This poses a significant threat to victims of revenge porn, who may find their privacy violated and their images circulating online without their consent. In response to this growing problem, Microsoft has taken a significant step to empower revenge porn victims and prevent the spread of these explicit images through its Bing search engine.
Microsoft recently announced a partnership with StopNCII, an organization that helps victims of revenge porn create digital fingerprints of explicit images, whether real or fake, directly on their own devices. These fingerprints, technically known as “hashes,” are then shared with StopNCII’s partners, who use them to identify and remove matching images from their platforms; the images themselves never leave the victim’s device. By joining forces with StopNCII, Microsoft’s Bing search engine now stands alongside other major platforms such as Facebook, Instagram, TikTok, and Reddit in leveraging these hashes to combat revenge porn.
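The hash-based workflow described above can be sketched in a few lines of Python. This is a simplified illustration only: production systems like StopNCII use perceptual hashing (which tolerates resizing and re-encoding), whereas the SHA-256 stand-in below only matches byte-identical files, and the `report_image`/`should_remove` helpers are hypothetical names, not part of any real API.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint (hash) of an image's raw bytes.

    Real systems use perceptual hashes that survive resizing and
    re-encoding; SHA-256 here is a simplified stand-in.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A platform keeps a blocklist of hashes submitted by victims.
# Only the hashes are shared -- never the images themselves.
blocklist: set[str] = set()

def report_image(image_bytes: bytes) -> None:
    """Victim-side step: hash the image locally, share only the hash."""
    blocklist.add(fingerprint(image_bytes))

def should_remove(image_bytes: bytes) -> bool:
    """Platform-side step: check uploaded content against the blocklist."""
    return fingerprint(image_bytes) in blocklist

# Example: a victim reports an image; the platform later matches it.
report_image(b"reported-image-bytes")
print(should_remove(b"reported-image-bytes"))  # True
print(should_remove(b"unrelated-image-bytes"))  # False
```

The key privacy property is that matching happens entirely on hashes: a platform can detect and remove a known image without ever receiving a copy of it from the victim.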
In a blog post, Microsoft revealed that it had already taken action on over 268,000 explicit images identified through Bing’s image search during a pilot program with StopNCII’s database. While Microsoft had previously offered a direct reporting tool for such content, the company acknowledged that this approach was insufficient in addressing the scale of the issue.
Explaining the shift, Microsoft wrote: “We have heard concerns from victims, experts, and other stakeholders that user reporting alone may not scale effectively for impact or adequately address the risk that imagery can be accessed via search.” This proactive stance highlights the company’s commitment to combating the spread of revenge porn and protecting the privacy and dignity of individuals online.
While Microsoft and other platforms are taking steps to address the issue, concerns remain about the prevalence of AI-generated deepfake nude images. Sites that specialize in creating synthetic nude images are already causing problems for individuals, including high school students. Despite the lack of a comprehensive AI deepfake porn law in the United States, efforts are being made at the state and local levels to address the issue.
San Francisco prosecutors recently filed a lawsuit to shut down 16 of the most notorious “undressing” sites, demonstrating a proactive approach to combating nonconsensual deepfakes. According to a tracker created by Wired, 23 U.S. states have enacted laws to address nonconsensual deepfakes, while nine states have rejected proposed legislation on the matter.