Elon Musk’s X has implemented new rules that prohibit editing or generating images showing real people in bikinis and other “revealing clothing.” The policy change came on Wednesday evening, following global outrage over Grok being used to generate thousands of harmful, nonconsensual “undressing” photos and sexualized pictures of women and minors on X.
While some new safety features have been added to Grok on X, it is still possible to create “undress” images using the Grok standalone app or website, according to numerous tests conducted by journalists, researchers, and WIRED. Meanwhile, users on X complain that they can no longer create the images and videos they used to.
“We can still generate photorealistic nudity on Grok.com,” says Paul Bouchaud, head researcher at AI Forensics, a Paris-based nonprofit, who has tracked Grok’s use in creating sexualized images and conducted multiple tests outside of X. “We can generate nudity in ways that Grok on X cannot.”
“I could upload an image on Grok Imagine and ask to put the person in a bikini, and it works,” says Bouchaud, who tested the feature using an image of a woman. WIRED’s tests, conducted with free Grok accounts on its website in the UK and US, were able to remove women’s clothing without apparent limitations. When a reporter asked the Grok app in the UK to do the same with an image of a man, it asked for the user’s year of birth.
Journalists at The Verge and the investigative outlet Bellingcat were also able to create sexualized images while based in the UK, which is currently investigating Grok and X and has strongly condemned the platforms for allowing users to create “undress” images.
Since the start of the year, Musk’s businesses, including the artificial intelligence firm xAI and X, have come under fire for Grok’s creation of nonconsensual intimate imagery, explicit and graphic sexual videos, and sexualized imagery of apparent minors. Officials in the United States, Australia, Brazil, Canada, France, India, Indonesia, Ireland, Malaysia, and the UK, as well as the European Commission, have condemned or launched investigations into X or Grok.
On Wednesday, X’s Safety account posted updates about changes to Grok. “We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis,” the account posted, adding that the rules apply to both paid and free subscribers.
Under a subsection titled “Geoblock update,” the account also claimed: “We now geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.” It added that the company is continuing to improve its safeguards and “remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity.”
xAI, which creates Grok, did not immediately respond to WIRED’s request for comment. An X spokesperson said their understanding was that the geoblocking applies to the Grok website and app as well.
The latest action follows a widely criticized change on January 9, when X limited image generation using Grok to paid “verified” subscribers, a move a leading women’s organization described as the “monetization of abuse.” Bouchaud, who says AI Forensics has gathered around 90,000 Grok images since the Christmas holidays, confirms that only verified accounts have been able to generate images on X (as opposed to the Grok website or app) since January 9, and that bikini images of women are rarely generated there now. “We do observe that they appear to have pulled the plug on it and disabled the functionality on X,” he says.

