Tech: Grok Generated An Estimated 3 Million Sexualized Images — Including...

We already knew xAI's Grok was barraging X with nonconsensual sexual images of real people. But now there are some numbers to put things in perspective. Over an 11-day period, Grok generated an estimated 3 million sexualized images — including an estimated 23,000 of children.

Put another way, Grok generated an estimated 190 sexualized images per minute during that 11-day period. Among those, it made a sexualized image of children once every 41 seconds.

On Thursday, the Center for Countering Digital Hate (CCDH) published its findings. The British nonprofit based its analysis on a random sample of 20,000 Grok images posted from December 29 to January 9, then extrapolated a broader estimate from the 4.6 million images Grok generated during that period.

The research defined sexualized images as those with "photorealistic depictions of a person in sexual positions, angles, or situations; a person in underwear, swimwear or similarly revealing clothing; or imagery depicting sexual fluids." The CCDH didn't take image prompts into account, so the estimate doesn't differentiate between nonconsensual sexualized versions of real photos and those generated exclusively from a text prompt.

The CCDH used an AI tool to identify the proportion of the sampled images that were sexualized, which may warrant some caution about the classification figures. The total image count, however, is on firmer footing: I'm told that many third-party analytics services for X have reliable data because they use the platform's API.

On January 9, xAI restricted Grok's ability to edit existing images to paid users. (That didn't solve the problem; it merely turned it into a premium feature.) Five days later, X restricted Grok's ability to digitally undress real people.

But that restriction only applied to X; the standalone Grok app reportedly continues to generate these images. Since that behavior explicitly violates Apple's and Google's policies, you might expect the companies to remove the app from their stores. You'd be wrong.

So far, Tim Cook's Apple and Sundar Pichai's Google haven't removed Grok from their stores, even though they have pulled similar "nudifying" apps from other developers. Nor did the companies take any action against X while it was producing the images. That's despite an open letter from 28 women's groups (and other progressive advocacy nonprofits) calling on them to act.

The companies haven't replied to multiple requests for comment from Engadget. To my knowledge, they haven't publicly acknowledged the issue.

Source: Engadget