2024-09-12 10:05:57

Microsoft has introduced a new policy to combat intimate image abuse.

The tech giant has “heard concerns from victims, experts, and other stakeholders” over the years about intimate imagery being shared online without people’s consent, and has teamed up with StopNCII.org to combat the issue.

In a blog post, Microsoft said: “On July 30, Microsoft released a policy whitepaper, outlining a set of suggestions for policymakers to help protect Americans from abusive AI deepfakes, with a focus on protecting women and children from online exploitation. Advocating for modernized laws to protect victims is one element of our comprehensive approach to address abusive AI-generated content – today we also provide an update on Microsoft’s global efforts to safeguard its services and individuals from non-consensual intimate imagery (NCII).

“We have heard concerns from victims, experts, and other stakeholders that user reporting alone may not scale effectively for impact or adequately address the risk that imagery can be accessed via search. As a result, today we are announcing that we are partnering with StopNCII to pilot a victim-centered approach to detection in Bing, our search engine.”

The post explained that the collaborative platform enables adults around the world to protect themselves, and encouraged anyone concerned about their images to report to StopNCII.

“StopNCII is a platform run by SWGfL that enables adults from around the world to protect themselves from having their intimate images shared online without their consent. StopNCII enables victims to create a ‘hash’ or digital fingerprint of their images, without those images ever leaving their device (including synthetic imagery). Those hashes can then be used by a wide range of industry partners to detect that imagery on their services and take action in line with their policies. In March 2024, Microsoft donated a new PhotoDNA capability to support StopNCII’s efforts. We have been piloting use of the StopNCII database to prevent this content from being returned in image search results in Bing. We have taken action on 268,899 images up to the end of August. We will continue to evaluate efforts to expand this partnership. We encourage adults concerned about the release – or potential release – of their images to report to StopNCII.”
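
The workflow described in the quote can be pictured as simple hash-based matching: a fingerprint is computed on the victim’s own device, only that fingerprint is shared, and partner services compare fingerprints of imagery they encounter against the shared database. The sketch below is purely illustrative and is not Microsoft’s or StopNCII’s implementation: PhotoDNA relies on proprietary perceptual hashing that tolerates resizing and re-encoding, whereas this example stands in a plain SHA-256 hash, and the function names fingerprint and is_flagged are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Compute a digital fingerprint locally. Illustrative stand-in only:
    # real systems use perceptual hashes (e.g. PhotoDNA), not SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(candidate_hash: str, hash_database: set[str]) -> bool:
    # Partner-side lookup: only hashes are compared, never the images.
    return candidate_hash in hash_database

if __name__ == "__main__":
    private_image = b"...example image bytes..."   # never leaves the victim's device
    submitted_hash = fingerprint(private_image)    # only this hash is submitted

    hash_database = {submitted_hash}               # partners hold hashes, not images

    # A partner later hashes imagery found on its service and checks for a match.
    print(is_flagged(fingerprint(private_image), hash_database))  # True -> act per policy
```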
