‘Abusive’ AI undressing trend is taking over X thanks to Elon Musk’s Grok, analysis reveals
- by Independent
- Jan 06, 2026
More than half of all AI-generated images on Elon Musk's X are of adults and children with their clothes digitally removed, according to new research.
Analysis from the Paris-based non-profit AI Forensics revealed that the degrading trend is dominating the platform, despite the social media firm committing to crack down on illegal content.
"Our analysis of tens of thousands of images generated by Grok quantifies the extent of the abuse," Paul Bouchaud, a researcher at AI Forensics, said in a statement shared with The Independent.
"Non-consensual sexual imagery of women, sometimes appearing very young, is widespread rather than exceptional, alongside other prohibited content such as Isis and Nazi propaganda – all demonstrating a lack of meaningful safety mechanisms."
Around 2 per cent of the images generated by Grok depicted people who appeared to be aged 18 or younger, AI Forensics said, while 6 per cent involved public figures.
UK regulator Ofcom noted that it is illegal to create or share non-consensual intimate images or child sexual abuse material, including AI-generated content.
"We are aware of serious concerns raised about a feature on Grok on X that produces undressed images of people and sexualised images of children," an Ofcom spokesperson said.
"We have made urgent contact with X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK."
In response to a public statement by Ofcom on X, Grok posted an altered image of the Ofcom logo in a bikini.
The European Commission also said on Monday that it was "very seriously" looking into complaints about explicit and non-consensual images on X.
Mr Musk, who took over the platform formerly known as Twitter in 2022, said his company would crack down on the trend.
"Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content," he posted on X.
An X spokesperson said: "We take action against illegal content on X, including child sexual abuse material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary."
Some cyber experts have argued that this approach is reactive, calling instead for safety guardrails to be built into AI tools from the start.
"Social media companies need to treat AI misuse as a core trust and safety issue, not just a content moderation challenge," said Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance.
"Allowing users to alter images of real people without notification or permission creates immediate risks for harassment, exploitation, and lasting reputational harm... These are not edge cases or hypothetical scenarios, but predictable outcomes when safeguards fail or are deprioritised."