It's Not Weird To Want A Generative AI Disclosure On Games

Steam's competitor has criticized its AI disclosure policy as pointless, but if consumers want to know, why shouldn't they?

This week: Looked at RAM prices and despaired at the prospect of putting off a PC upgrade for another year. (Hang in there, RTX 2070 Super!)

Before I start, I should disclose that I am bald and thus have limited need for shampoo.

That was for fellow bald guy and Epic Games CEO Tim Sweeney, who's been lampooning Steam's generative AI disclosure requirement by likening it to demanding that everyone disclose their haircare routines. According to Sweeney, the policy "makes no sense for game stores, where AI will be involved in nearly all future production."

I respect Sweeney for mobilizing Epic's Fortnite fortune to challenge Apple and Steam's 30% cut, but I've got to side with the House of Gabe on this one: I don't see how it benefits consumers to pretend that generative AI is run-of-the-mill software. Steam's disclosure requirement is one of very few checks that have been placed on tech that, for one thing, has been credibly accused of automating plagiarism.

It's definitely a complicated question for developers. Does it count if you used Photoshop's generative fill tool while making concept art that was never intended for the public eye? Or if you used Claude to generate a few code snippets? Or if someone in marketing used ChatGPT to make a spreadsheet?

But it's obvious what most consumers really want to know: Are we getting art, writing, music, and voices conceived and made by human brains, hands, and vocal cords, with the expected influences from other artists? Or was some of that work outsourced to sloppy culture-production machines trained on other people's creations without consent, in ways those creators never could have anticipated?

If there were no merit to generative AI's bad reputation, I'd also push back on Steam's disclosure requirement for reinforcing nonsense or superstition, but these aren't baseless concerns. OpenAI has admitted that its products cannot work without training on copyrighted material. It argues that this qualifies as fair use. There is obviously disagreement on that point, and the AI industry has only managed to kick the legal side of this battle down the road by sucking up to politicians and promising big paydays to the media companies powerful enough to complain.

The significant power requirements of AI data centers aren't in dispute either, so that public concern can't be dismissed.

Source: PC Gamer