Payment processors were against CSAM until Grok started making it
- by The Verge
- Jan 26, 2026
In the past, payment providers have been aggressive about cutting access to websites thought to have a significant presence of CSAM — or even legal, consensually produced sexual content. In 2020, Mastercard and Visa banned Pornhub after a New York Times article noted the prevalence of CSAM on the platform. In May 2025, Civitai was cut off by its credit card processor because “they do not wish to support platforms that allow AI-generated explicit content,” Civitai CEO Justin Maier told 404 Media. In July 2025, payment processors pressured Valve into removing adult games.
In fact, financial institutions have at times cut off people and platforms seemingly to avoid any hint of reputational risk. In 2014, adult performer Eden Alexander’s fundraiser for a hospital stay was shut down by payments company WePay because of a retweet. Also in 2014, JPMorgan Chase abruptly shut down several porn stars’ bank accounts. In 2021, OnlyFans briefly tried to ban sexually explicit content because banks didn’t like it. (Widespread backlash quickly forced OnlyFans to reverse course.) This is legal, consensual sexual content — and it was deemed too hot to handle.
But Musk’s boutique revenge porn and CSAM generator is, apparently, just fine.
It’s a striking reversal. “The industry is no longer willing to self-regulate for something as universally agreed on as the most abhorrent thing out there,” which is CSAM, says Lana Swartz, the author of New Money: How Payment Became Social Media, of the inaction by Stripe and the credit card companies.
Visa, Mastercard, American Express, Stripe, and Discover did not return requests for comment. The US Financial Coalition Against Child Sexual Exploitation — an industry group composed of payments processors, banks, and credit card companies — also did not return a request for comment. On its website, FCACSE brags that “As a result of its efforts, the use of credit cards to purchase child sexual abuse content online has been virtually eliminated globally.”
Except, of course, on X.
In the past, “people who did completely legal stuff were cut off from banks,” notes Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. There are incentives to overenforce boundaries around questionable images — and traditionally, that’s what the financial industry has done. So why is X different? It’s run by Elon Musk. “He’s the richest man in the world, he has close ties to the US government, and he’s incredibly litigious,” says Pfefferkorn. Musk has previously sued the Center for Countering Digital Hate; the now-dismissed lawsuit claimed the group illegally collected data showing an increase in hate speech after he bought the platform formerly known as Twitter.
Sexualized images of children are not the only problem with Grok’s image generation. The New York Times estimated that 1.8 million of the images the AI generated over a nine-day period, or about 44 percent of posts, were sexualized images of adult women — which, depending on how explicit they are, can also be illegal to spread. Using different tools, the Center for Countering Digital Hate estimated that more than half of Grok’s images contained sexualized imagery of men, women, and children.
The explosion of sexualized images took place after Musk posted an AI-edited image of himself in a bikini on December 31st. A week later, X’s head of product, Nikita Bier, posted that the previous four days were also the highest-engagement days on X ever.
Lawyer Carrie Goldberg, whose history includes challenging Section 230 in a stalking lawsuit against Grindr and another suit that ultimately shut down chat client Omegle, is representing Ashley St. Clair, the mother of one of Musk’s children, in a case against X. St. Clair is one of many women Grok undressed — and now she’s suing the platform, arguing that X has created a public nuisance. “In the St. Clair case we are only focused on xAI and Grok because they are so directly liable from our perspective,” she said in an email. “But I could envision other sources of liability.” She specifically cited distributors like Apple and Google’s app stores as areas of interest.
“A lot of this could end up in court, and it’s going to be up to judges to make decisions about what’s ‘sexually explicit.’”