Leading academic calls for games industry action on generative AI: Video Games Industry Memo, 26/10/2023
Dr Tommy Thompson calls for industry to create ‘traffic light’ system to protect indies from AI risk
Video game AI expert calls for industry action on generative AI.
Starfield blasts Xbox’s Q1 revenues to new heights.
Cities: Skylines II provides zen comfort in this week’s releases.
The big read - Leading academic calls for games industry action on generative AI
A leading expert on artificial intelligence (AI) and video games has called on industry bodies to develop a new “traffic light system” to help companies understand the potential risks of using AI, and particularly generative AI, within the development of video games.
Dr Tommy Thompson, an AI consultant, former King's College London lecturer and creator of the AI and Games YouTube channel, has warned that small games businesses are vulnerable to using an AI service in an inadvertently damaging manner because they lack the legal and commercial expertise of their Triple A counterparts.
And without easy-to-use guidance for assessing risk, Tommy fears smaller developers with no experience of using AI in development will only see the technology’s opportunities rather than its challenges - opening themselves up to harm in the process.
Rise of the machine (learning)
To understand how the gap has emerged, we have to take a brief look back at how AI has been used within the development of video games.
Game AI, the sub-discipline of AI responsible for familiar mechanics such as powering non-player character movement, has been used by developers of all shapes and sizes for a number of decades.
However, there has been “a silent revolution” in game development over the past 15 years, which has seen machine learning used within the production processes of leading Triple A titles with comparatively little fanfare.
“It happened because game developers were looking at how they make games and - of course - games were getting bigger, becoming more expensive and becoming more difficult to make,” Tommy explains.
As a result, leading developers looked for ways to speed up their work and make intelligent decisions faster. So they sought out tricky problems without obvious answers, which had lots of data around them, and applied machine learning techniques to find solutions.
This led to the emergence of player analytics in the late 2000s which, importantly, was one of the first areas where the industry successfully applied machine learning as part of the development process - using masses of data about how players engaged with a game (e.g. how often a gun was used in an online shooter) to fine-tune their experience.
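Under the hood, that early analytics work was conceptually simple: log gameplay events, aggregate them, and let designers act on the patterns. A minimal sketch in Python, using entirely hypothetical telemetry fields and thresholds rather than any real studio’s pipeline, might look like this:

```python
from collections import Counter

# Hypothetical telemetry events: one record per weapon fired in a match.
# Field names here are illustrative, not from any real game's pipeline.
events = [
    {"player_id": "p1", "weapon": "smg", "match_id": "m1"},
    {"player_id": "p2", "weapon": "shotgun", "match_id": "m1"},
    {"player_id": "p1", "weapon": "smg", "match_id": "m2"},
    {"player_id": "p3", "weapon": "sniper", "match_id": "m2"},
]

# Aggregate how often each weapon is used across all matches.
usage = Counter(event["weapon"] for event in events)

# A balance team might flag weapons that dominate (or never get picked up).
total = sum(usage.values())
for weapon, count in usage.most_common():
    share = count / total
    flag = " <- review balance" if share > 0.4 else ""
    print(f"{weapon}: {count} uses ({share:.0%}){flag}")
```

The real systems operate at vastly larger scale, but the principle is the same: the data already exists, so the hard part is asking it the right questions.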
But this triumph led to another question: how else could machine learning be used within video game development?
The result was an unheralded AI arms race in the industry, with Triple A games businesses creating R&D divisions dedicated to AI - often in partnership with universities and academic institutions - to explore wider applications of the technology. These include EA’s SEED, Ubisoft’s La Forge and support for Xbox from Microsoft Research’s Game Intelligence.
Over the course of the decade, this led to a range of machine learning-led advances that have made noticeable, positive differences to the development and maintenance of video games.
EA, for example, saved thousands of hours of work on Mass Effect: Legendary Edition through a process that rapidly upscaled old textures. Call of Duty has applied machine learning to its RICOCHET anti-cheat system to prevent cheats from prospering. Ubisoft, meanwhile, used its research to create “Commit Assistant”, a tool that helps developers understand whether the code they’re about to commit to a build is likely to break it.
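Ubisoft hasn’t published how Commit Assistant works internally, but the underlying idea - predict from historical commit data whether a new change is likely to introduce a defect - can be sketched with an off-the-shelf classifier. The features, labels and model below are illustrative assumptions in the spirit of defect-prediction research, not Ubisoft’s implementation:

```python
# A toy "risky commit" classifier. Features, labels and model choice
# are illustrative assumptions only, not any studio's actual pipeline.
from sklearn.ensemble import GradientBoostingClassifier

# Each row: [lines_changed, files_touched, author_recent_bug_count, touches_core_system]
X_train = [
    [12, 1, 0, 0],
    [850, 14, 3, 1],
    [40, 2, 1, 0],
    [1200, 30, 5, 1],
    [5, 1, 0, 0],
    [300, 9, 2, 1],
]
# Label: 1 if the commit was later linked to a bug fix, 0 otherwise.
y_train = [0, 1, 0, 1, 0, 1]

model = GradientBoostingClassifier().fit(X_train, y_train)

# Score an incoming commit before it lands in the build.
new_commit = [[600, 11, 2, 1]]
risk = model.predict_proba(new_commit)[0][1]
print(f"Estimated chance this commit breaks something: {risk:.0%}")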
But while machine learning quietly revolutionised a number of areas of Triple A game development in particular, integrating AI into development has been done in a conservative manner.
“The games industry is notoriously risk averse,” said Tommy. “It doesn’t like the idea of embracing a new technology just in case there is an end game with it.”
While new technology may be exciting, it will only be used by a major developer if it works, in Tommy’s words, “in service of the development process.” In short, if it hinders the creation of a game by disrupting development or exposing a company to legal risk then it doesn’t get in.
As a result, the biggest games businesses used their R&D teams to identify areas where AI could improve development over a long period of time to allow them to resolve problems steadily. But they did so quietly behind closed doors for commercial reasons, shutting knowledge away from the wider industry in the process.
And in doing so, they inadvertently created a problem: what happens when the wider industry suddenly needs to know how to skeptically assess widely available AI tools?
Generative AI - the attack from outside
Step forward, Generative AI. The hype around the technology has been as much about its accessibility as the content it puts out, offering an opportunity for everyone to benefit from AI in their day-to-day working lives.
But to the major powers in the games industry, generative AI doesn’t represent that. Instead, it’s seen as a way of introducing major risk to the development process.
“Generative AI looks like an attack because it is coming from outside the industry from people who haven’t worked in games saying this will change development,” Tommy explains. “And we say not yet, not in its current form or ‘God no, no-one wants to use it’.”
There are two reasons why generative AI is currently widely distrusted by developers at the top of the industry.
First, the output of generative AI doesn’t offer enough consistency or quality to fit with the carefully authored approach of most video games.
Along with wider concerns about generative AI’s propensity for, as Tommy eloquently puts it, “making shit up”, the output of most large language models (LLMs) is too bland and non-specific for the development of thoughtfully stylised games.
This makes it hard to use the technology precisely within a design flow, either adding layers of complexity that developers don’t want or theoretically limiting its use to comparatively smaller tasks (e.g. writing incidental dialogue for NPCs) that are less transformative than promised.
Second, even if developers were considering using generative AI in theory, there’s a big reason why they’re currently giving it a wide berth: they don’t trust what’s going into it.
In comparison to machine learning, where studios have invested time, money and their own data into making sure the process works both in development and from a legal perspective, generative AI is considered to be a black box.
Despite the efforts of third-party services to reassure businesses about what data their models are trained on, many games companies have adopted the position that if they can’t be 100% sure that an LLM hasn’t swallowed up content it shouldn’t have - such as copyrighted material - then it shouldn’t be used.
This is particularly true for the more conservative businesses operating at the top of the industry, who are on high alert for copyright, IP and data processing issues.
“No legal team in any studio will tell you that GPT is an actual option to use right now within games production,” Tommy says. “They are all thinking about building LLMs internally using their own data that can do something to help them, but only in a way that doesn’t lead to a legal quandary.”
However, that caution amongst Triple A developers has been built by tip-toeing through the practical, legal and reputational repercussions of applying machine learning to development processes.
Independent developers have little or none of that experience to call upon. Instead, they’re likely to see the many upsides of using a generative AI service - such as allowing them to scale their content production to keep up with the big boys - without being aware of the many commercial and legal risks of doing so.
For example, Steam, arguably one of the most laissez-faire storefronts, has decided to stop companies releasing games with generative AI content (as Tommy has written on his own Substack) unless the developer can prove it has trained the generative model solely on its own data.
Tommy warns that this approach could be the start of a bigger battle between generative AI and the industry, one that may see smaller businesses who used the tech in good faith landed in hot water.
“Steam saying no to this leads to an almost 100% chance that Nintendo, Sony and Microsoft follow,” says Tommy.
“So we’ve got a noisy generative AI industry saying ‘you can create a game by yourself using these tools!’ But they’re not saying ‘oh now you won’t be able to sell it anywhere.’ And that is a real problem.”
Red means stop
Despite the challenges posed by the use of generative AI within games, Tommy has proposed a relatively simple solution for the industry to consider adopting.
He is advocating for the creation of a ‘traffic light’ risk assessment for the wider industry to help small businesses check the potential applications of AI in their games against common commercial, legal or reputational risks.
By bringing together experts from Triple A businesses, who have already built easy-to-use risk assessments internally, Tommy believes that it is possible to create a principles-led assessment tool that’s useful for all developers without asking the biggest developers to spill confidential secrets.
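What might such an assessment look like in practice? Tommy hasn’t specified a format, but the traffic light framing lends itself to a short checklist that maps answers about a proposed AI tool to red, amber or green. The questions and scoring below are purely illustrative - a sketch of the concept, not a proposal from Tommy or any industry body:

```python
# A purely illustrative traffic-light checklist for assessing an AI tool.
# The questions and scoring are hypothetical, not an official framework.

QUESTIONS = [
    "Is the tool's training data fully documented and licensed?",
    "Does generated content ship to players (rather than staying internal)?",
    "Does the vendor retain or train on data you send it?",
    "Could the output plausibly reproduce copyrighted material?",
]

def assess(answers: dict[str, bool]) -> str:
    """Map yes/no answers to a red/amber/green rating."""
    red_flags = 0
    if not answers[QUESTIONS[0]]:
        red_flags += 1  # unknown training data provenance
    if answers[QUESTIONS[1]]:
        red_flags += 1  # AI output shipping in the game itself
    if answers[QUESTIONS[2]]:
        red_flags += 1  # your data feeding someone else's model
    if answers[QUESTIONS[3]]:
        red_flags += 1  # possible IP contamination
    if red_flags == 0:
        return "green"
    return "amber" if red_flags == 1 else "red"

# Example: an external LLM writing shipped NPC dialogue, provenance unknown.
print(assess({
    QUESTIONS[0]: False,
    QUESTIONS[1]: True,
    QUESTIONS[2]: True,
    QUESTIONS[3]: True,
}))  # -> "red"
```

The value of a real version would come less from the scoring logic than from the questions themselves, which is exactly the expertise Tommy wants Triple A teams to share.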
But to achieve it, Tommy believes that industry associations need to take an active role in brokering the conversation and encouraging major companies to share best practice.
“They have to collaborate with the rest of the industry to reach out to companies to say ‘what’s your experience with it’ [AI in game development] and how do you create a set of guidelines, processes and documentation to allow developers to explore how it works.”
Crucially, Tommy believes that this will only be successful if associations fully understand the repercussions of AI in games themselves.
“Right now, the stance is ‘there’s a lot of opportunity for using AI in the video games industry.’ Well, no shit; that’s been the answer for 20 years. We’ve now reached the point where we collectively need to have better education on applying this technology. It’s only going to get more pervasive, more powerful and the risks of it will increase. So we need a stance now.”
Otherwise we may find out that while AI in games doesn’t present an existential risk to society, it may present an existential risk to unknowing games businesses instead.
News in brief
Starfield-ing role: Xbox recorded its highest-ever Q1 revenue, generating $3.9bn from its gaming business in large part as a result of the launch of Starfield, which drove both sales and uptake of Game Pass. We wrote about the game’s significance to Xbox’s strategy in a test run VGIM here.
“Fuck you, we’re not paying” - Mobilegamer.biz has the inside story of the Unity Runtime fee debacle. It reveals that senior staff were not briefed on the policy, that the announcement was rushed as part of a “kill AppLovin” strategy and that a senior games exec told John Riccitiello to do one to his face. Delicious.
Money, Money, Money - Sweden’s games market had a bumper year in 2022, according to trade body Dataspelsbranschen. The Swedish industry generated a turnover of 86.5bn Swedish kronor (roughly £6.5bn), up 47% year-on-year despite the post-pandemic slowdown.
Fazed and bemused: Esports organisation FaZe Clan has been picked up by GameSquare for $17m. The Verge describes this as a “steep drop-off” from its previous $725m valuation; we think that’s a charitable way to put it.
Bowser’s Army: Bowser (Doug) has said that the reason Nintendo of America doesn’t have unionised staff is because the company has cultivated an excellent working environment. Try telling the Koopalings that *cowers from the audible booing*.
Ins and outs
Adrian Hon, founder of Six to Start, has left the company…Paul Houlders is the new Global Head of Art at Technicolor Games…There have been lay-offs at Roblox in Shenzhen…Game Anglia is looking for a Programme Manager…There’s a Film, Gaming and Entertainment Partnerships Director role going at BRIDGE…
Events and conferences
Scottish Games Week, Multiple locations - 30th October - 3rd November
Paris Games Week, Paris - 1st November - 5th November
Golden Joysticks, London - 10th November
The Game Awards, Los Angeles - 7th December
Games of the week
Cities: Skylines II - The Paradox-published SimCity-alike gets a much-anticipated sequel, to the delight of digital urban planners.
Alan Wake 2 - Another long-sought-after sequel from a Finnish studio (two in a week, eh?) arrives to scare the living daylights out of everyone.
Just Dance 2024 - Ubisoft once again confidently answers the question “what am I buying my nieces for Christmas this year?”
Metal Gear Solid: Master Collection Vol. 1 - Seven Metal Gear games in a single collection, Konami? That’s insane.
Before you go…
A modder has replaced the Xenomorph in Alien: Isolation with Thomas the Tank Engine.
It’s the most terrifying piece of Thomas content since The Fat Controller bricked up Henry in a tunnel.
Got a tip for our ins and outs, events and conferences or games to watch sections? Or do you just fancy a chat? Drop me an email here.