Microsoft has announced a new proactive program to remove non-consensual intimate images of people from its Bing search engine results. The company is launching the program in partnership with an organization called StopNCII.
In a blog post, Microsoft says StopNCII allows victims whose intimate images have been posted online without their consent to create a digital fingerprint, or "hash," of those photos directly on their own device. The original images, which can include ones made by generative AI, never leave that device; only the hashes are shared, and participating online services can then use them to detect matching images for removal.
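Neither StopNCII's exact matching system nor PhotoDNA is public, but the general hash-and-match flow can be illustrated with a much simpler perceptual hash. The Python sketch below uses a basic 8x8 "average hash" as a stand-in, with hypothetical file names and a hypothetical match threshold; it is only a minimal illustration of the idea that the hash is computed locally and only the hash is compared by a service.

```python
# Minimal sketch of on-device perceptual hashing, assuming a simple 8x8
# "average hash" (aHash). StopNCII and PhotoDNA use proprietary, far more
# robust algorithms; this only illustrates the hash-and-match flow.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Compute a 64-bit perceptual hash of an image, entirely locally."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        # Each pixel contributes one bit: brighter than average or not.
        bits = (bits << 1) | (1 if px > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# The victim submits only the hash; the image itself never leaves the device.
submitted_hash = average_hash("my_photo.jpg")  # hypothetical file

# A service such as Bing can hash the images it indexes and compare those
# hashes against the submitted database, flagging near-matches for review.
indexed_hash = average_hash("crawled_image.jpg")  # hypothetical file
if hamming_distance(submitted_hash, indexed_hash) <= 5:  # threshold is an assumption
    print("Possible match: flag for takedown review")
```

Because a perceptual hash reflects the image's visual content rather than its exact bytes, a near-match can survive resizing or recompression, which is why services compare hash distance rather than requiring an exact digest match.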
Microsoft donated its own PhotoDNA technology, which has long been used to detect child sexual abuse imagery online, to StopNCII. It added:
We have been piloting use of the StopNCII database to prevent this content from being returned in image search results in Bing. We have taken action on 268,899 images up to the end of August. We will continue to evaluate efforts to expand this partnership. We encourage adults concerned about the release – or potential release – of their images to report to StopNCII.
In addition to taking down such images, Microsoft says that it prohibits the use of its online services to threaten to share them. It adds:
This includes asking for or threatening a person to get money, images, or other valuable things in exchange for not making the NCII public. In addition to this comprehensive policy, we have tailored prohibitions in place where relevant, such as for the Microsoft Store. The Code of Conduct for Microsoft Generative AI Services also prohibits the creation of sexually explicit content.
Microsoft also has its own reporting portal site, where people can submit the URL of the web page where a non-consensual intimate image is located. Microsoft will take action on those links where appropriate.