After Grok created thousands of “undressing” pictures of women and sexualized imagery of apparent minors, Elon Musk’s X has apparently limited who can generate images with the chatbot. Despite the change, however, Grok is still being used to create “undressing” sexualized images on the platform.

On Friday morning, the Grok account on X started responding to some users’ requests with a message saying that image generation and editing are “currently limited to paying subscribers.” The message also includes a link pushing people toward the social media platform’s $395 annual subscription tier. In one test, asking Grok to create an image of a tree returned the same message.

The apparent change comes after days of growing outrage and scrutiny directed at Musk’s X and xAI, the company behind the Grok chatbot. The companies face an increasing number of investigations from regulators around the world over the creation of nonconsensual explicit imagery and alleged sexual images of children. British Prime Minister Keir Starmer has not ruled out banning X in the country and said the actions have been “unlawful.”

Neither X nor xAI has confirmed that it has made image generation and editing a paid-only feature. An X spokesperson acknowledged WIRED’s inquiry but did not provide comment ahead of publication. X has previously said it takes “action against illegal content on X,” including instances of child sexual abuse material. While Apple and Google have previously banned apps with similar “nudify” features, X and Grok remain available in their respective app stores. xAI did not immediately respond to WIRED’s request for comment.

For more than a week, users on X have been asking the chatbot to edit images of women to remove their clothes—often asking for the image to contain a “string” or “transparent” bikini. While a public feed of images created by Grok contained far fewer of these “undressing” images on Friday, it still created sexualized images when prompted by X users with paid-for “verified” accounts.

“We observe the same kind of prompt, we observe the same kind of outcome, just fewer than before,” Paul Bouchaud, lead researcher at Paris-based nonprofit AI Forensics, tells WIRED. “The model can continue to generate bikini [images],” they say.

A WIRED review of some Grok posts on Friday morning found Grok generating images in response to user requests to “put her in latex lingerie” and “put her in a plastic bikini and cover her in donut white glaze.” The images appear behind a “content warning” box saying that adult material is displayed.

On Wednesday, WIRED revealed that Grok’s stand-alone website and app, which is separate from the version on X, has also been used in recent months to create highly graphic and sometimes violent sexual videos, including of celebrities and other real people. Bouchaud says it is still possible to use Grok to make these videos. “I was able to generate a video with sexually explicit content without any restriction from an unverified account,” they say.

While WIRED’s test of Grok’s image generation on X with a free account did not produce any images, a free account on Grok’s stand-alone app and website could still generate them.

The change on X could immediately limit the amount of sexually explicit and harmful material the platform is creating, experts say. But it has also been criticized as a minimal step—a band-aid that does little to address the real harms caused by nonconsensual intimate imagery.

“The recent decision to restrict access to paying subscribers is not only inadequate—it represents the monetization of abuse,” Emma Pickering, head of technology-facilitated abuse at UK domestic abuse charity Refuge, said in a statement. “While limiting AI image generation to paid users may marginally reduce volume and improve traceability, the abuse has not been stopped. It has simply been placed behind a paywall, allowing X to profit from harm.”
