Microsoft has blocked more text prompts that had created violent or sexual artwork when used in its Designer AI image creator. This comes a few days after a current Microsoft employee continued his quest to alert the company and US government officials to what he says are the dangers of using Designer (formerly Bing Image Creator).
A new report from CNBC says text prompts like "pro-choice," "four twenty," and "pro-life" now generate a message when typed into Designer saying they "may conflict with our content policy". The message also warns that multiple text prompts that violate Microsoft's policies could "lead to automatic suspension of your access."
The changes came a few days after a Microsoft employee named Shane Jones showed CNBC images created via Designer that depicted violent or sexual content, along with art that might violate copyrights. Jones has been on a campaign for the past few months, arguing that Designer is not yet ready for use by the general public.
He has raised his concerns with Microsoft and has also sent letters to US lawmakers and Washington State's attorney general. This week, he sent a letter to Lina Khan, the Chairperson of the US Federal Trade Commission.
In a statement sent to CNBC on Friday, a Microsoft spokesperson said:
We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system.
While some text prompts may no longer work in Designer, CNBC reports that other prompts can still create violent artwork. For example, typing in "car accident" can generate images of bodies with "mutated faces", according to CNBC.
Designer can also still create images of copyrighted characters. CNBC said it generated an image of Elsa, from Disney's Frozen films, in front of wrecked buildings.