The 52/52 stage of AI use in game companies
By Mariana Muntean, CEO of Cinevva
GDC's State of the Game Industry survey for 2026 put two matching numbers in the headlines: 52% of game companies report using generative AI in production, and 52% of developers say generative AI is having a negative impact on the industry. Same percentage, opposite read on the future.
Beneath that symmetry is another split that gets less airtime. Only 36% of respondents said they personally use generative AI, while company-level adoption sits higher. Management is buying tools that many individual developers never touch. The gap is a clue about where the real argument lives. It is not only ethics. It is who controls the pipeline, and whether the person at the keyboard believes AI can actually do their job.
The comfort of staying manual
Plenty of developers who avoid AI are not naive. They are running a mental model that sounds like this: my work is a chain of hard problems that do not reduce to a chat prompt. A unique concept has to survive contact with modeling. A model has to survive rigging. Rigging has to survive animation. Animation has to survive gameplay feel and systems design. At each handoff, taste and constraints matter. An LLM that writes a decent paragraph about a character is not the same thing as a mesh that deforms correctly, a rig that an animator will not fight, or mechanics that read as intentional rather than mushy.
That story is not wrong for how most tools work today. General models are strongest at isolated tasks. They are weak at holding a single creative thread across disciplines without you stitching the pieces together. So the feeling of being "safe" by not relying on AI is partly technical. If your job is the glue between concept art, topology, skin weights, state machines, and camera language, it is easy to believe automation will stay on the shallow end for years.
Steam and other players still draw lines
The same instinct shows up on the distribution side. Steam does not block games outright for using generative AI. Valve does require clear disclosure when AI-created content ships to players, with extra scrutiny if the game generates content live at runtime, and listings have to stay on the right side of rights and safety rules. That is not the same as a mesh topology problem, but it is another reason teams treat AI as risky. A big PC storefront turned the question from "can we make this?" into "can we ship this here without a label fight or a review surprise?"
Other gatekeepers went harder. Games Workshop banned AI-generated work across Warhammer properties. The Indie Game Awards pulled a major nomination over AI use in development even when fans argued the shipped game was clean. Community moderators have walked out over AI-looking trailer work. Epic's Tim Sweeney and Valve have argued in public about whether Steam-style labels help or hurt. The point is not which executive is right. The point is that distribution and IP owners are fragmenting. A developer who stays fully manual simplifies one whole layer of that fight. No generative assets in the player build means less to disclose and less to defend when a forum decides your key art looks synthetic.
We wrote more about the trust and labeling stack in AI controversy, trust, and the post-AI economy and spell out how we handle labels on Cinevva in AI-generated content policy.
Why that safety is shaky
The same survey shows sentiment moving fast. Net positive views of AI's industry impact fell. Negative views rose. Layoff numbers and union support tell you people are not calm. So the psychological picture splits again. Some developers feel protected because the full stack is still hard. Others feel exposed because studios are buying efficiency anyway, with or without their consent.
Those two groups are not always talking about the same layer of the stack. Executives often mean "can we ship more with less on this milestone?" Artists and designers often mean "will my specialty still exist?" Both questions are real. The first rewards partial automation. The second punishes anyone who assumed their corner of the pipeline was too bespoke to touch.
What would actually change the calculus
If the industry moves from one-off generators to connected workflows, where a concept, a rig, and a playable loop share one context, the "I am safe because this is too custom" argument gets narrower. Not because taste goes away. Because the boring seams between steps stop eating half your calendar. The threat to pure manual workflows is not a single model that does everything. It is fewer seams.
That is the bet we are building toward at Cinevva. AI-assisted creation in the browser, tools that feed each other, and a path from work-in-progress to something players can actually run and discover. Not so executives can replace teams on a spreadsheet. So a small group or a solo dev can own the full line from idea to shipped game without pretending one prompt replaces a lead artist.
The 52% who use AI and the 52% who fear what it is doing to the industry are both responding to real signals. The developers who feel secure skipping AI today have a serious case about complexity. The open question is how long that case stays true if the toolchain stops breaking at every discipline boundary.
Related:
- GDC 2026 by the numbers — attendance, engine market share, and the AI divide
- Everyone wants to be the AI game engine now — the platform race behind adoption
- Open source has an AI pollution problem — the cost side of cheap AI generation
- AI controversy, trust, and the post-AI economy — Steam numbers, studio reactions, and trust
- AI-generated content policy — how Cinevva handles disclosure and filters
- Vibe coding is the new game jam — when experimentation gets cheap
- For game creators — publish playable games, subscription pool, playtime-based payouts