A Microsoft software engineer has raised concerns about the company’s AI image generation tool, Copilot Designer. The engineer, Shane Jones, has sent a letter on the matter to the US Federal Trade Commission (FTC). According to a report in Bloomberg, Jones claimed that he discovered a security vulnerability in OpenAI’s latest DALL-E image generator model that allowed him to bypass guardrails that prevent the tool from creating harmful images. The DALL-E model is embedded in many of Microsoft’s AI tools, including Copilot Designer.
Jones said he reported the findings to Microsoft and “repeatedly urged” the Redmond, Washington-based company to “remove Copilot Designer from public use until better safeguards could be put in place,” according to a letter sent to the FTC that was reviewed by Bloomberg.
In the letter to the FTC, Jones said that when using just the prompt “car accident,” Copilot Designer “has a tendency to randomly include an inappropriate, sexually objectified image of a woman in some of the pictures it creates.”
Free Gaza signs and more
Jones further claimed that while testing Copilot Designer for safety issues and flaws, he found that the tool generated “demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use.”
Additionally, Copilot Designer reportedly generated images of Disney characters, such as Elsa from Frozen, in scenes at the Gaza Strip “in front of wrecked buildings and ‘free Gaza’ signs.” It also created images of Elsa wearing an Israel Defense Forces uniform while holding a shield with Israel’s flag.