Microsoft employee warns FTC about Microsoft’s AI image generator


Microsoft AI engineer Shane Jones warns that the company’s AI image generator, Copilot Designer, produces sexual and violent content and disregards copyrights.

Jones, who is not involved in the development of the image generator, volunteered to red-team the product for vulnerabilities in his spare time.

He found that Copilot Designer could generate violent and sexual images, including violent scenes related to abortion rights and depictions of underage drinking and drug use.

Last December, he shared his findings internally with Microsoft and asked the company to withdraw the product. Microsoft did not comply.



Jones stresses that he contacted Microsoft’s Office of Responsible AI and spoke with Copilot Designer’s senior management, but received no satisfactory response.

In January, Jones wrote a letter to U.S. senators and met with members of the Senate Committee on Commerce, Science, and Transportation.

Now he is escalating: in a letter to the chairwoman of the U.S. Federal Trade Commission (FTC), Lina Khan, and to Microsoft’s board of directors, he demands better safeguards, more transparency, and an adult content rating for the Android app.

He also called for an independent review of Microsoft’s AI incident reporting process, claiming that problems with the image generator were known to OpenAI and Microsoft before it was released last fall.

Jones has worked at Microsoft for about six years and currently holds the title of principal software engineering manager.


Microsoft’s Copilot chatbot has also recently drawn attention for weird, egocentric responses. According to CNBC, the image prompts Jones used continue to work despite his numerous warnings. Microsoft deflects critical questions by saying it is working to improve its AI technology.

OpenAI at least has a better handle on text and image moderation, especially with DALL-E 3, thanks to its ChatGPT integration.

But even Google, which has moved more slowly and perhaps more cautiously than Microsoft and OpenAI, has had problems with its image generator producing historically inaccurate images, such as Asian-looking people in Nazi uniforms when asked for a soldier in a World War II uniform.

These examples show how difficult it is for companies to control generative AI. Unlike Microsoft, however, Google has taken its image generator offline.
