Proponents of generative artificial intelligence have said the technology is poised to upend countless tasks, processes and industries – and video-game development is no exception, if startup Artificial Agency has its way.
The Edmonton-based company, founded by former Google DeepMind employees, has raised US$16-million to build out its generative AI gaming engine. The technology will allow game developers to build more realistic and interactive characters that can take actions based on what’s happening within the game, according to the company, rather than following pre-scripted behaviours. Investors in the round include Radical Ventures, Toyota Ventures and Flying Fish Partners, among others.
Co-founder and chief executive Brian Tanner said Artificial Agency has been working with a handful of gaming studios to develop its technology and expects to widely release the engine next year.
Artificial Agency is not alone in bringing generative AI to the US$262-billion-a-year video-game industry. California-based startup Inworld AI, for example, is using AI to create dialogue for non-player characters, or NPCs, and quests within games. Artificial Agency will also have to overcome skepticism from some gaming studios, and ensure its technology works consistently. Just as chatbots can hallucinate and make up information, generative AI within games can also misfire.
The co-founders of Artificial Agency, who include Michael Johanson, Alex Kearney and Andrew Butcher, were AI researchers at the DeepMind office in Edmonton, which parent company Alphabet Inc. shut down in 2023 as part of a restructuring. Edmonton is a hub for reinforcement learning research, a branch of AI that concerns building agents that learn by interacting with an environment. That’s partly owing to Richard Sutton, a University of Alberta professor and world-renowned expert in reinforcement learning who headed Edmonton’s DeepMind office. Prof. Sutton is also an angel investor in Artificial Agency.
Gaming environments have long been a training ground for building AI agents, in part because games have clear goals and rules. “We always wanted to find a way to take the technology we were building at work and bring it into the games industry,” Mr. Tanner said.
Artificial Agency, founded last year, has built what it calls an AI-powered behaviour engine that can be used by a wide variety of game developers. One application is to breathe more life into the characters that populate video-game worlds. Typically, every line of dialogue and action is scripted by developers.
“They build these complex structures called behaviour trees, which is a really large flowchart intended to have every possible situation that can happen, and the sequence of actions the character can take,” Mr. Tanner said. The process is time-consuming and can result in characters that feel flat and limited, he added. With generative AI, the company says, characters can have a fuller, more natural set of responses without extensive hard-coding, while still leaving the developer in control.
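For readers unfamiliar with the term, a behaviour tree composes simple condition and action nodes into branching logic that developers author by hand. The following minimal sketch illustrates the idea; the node names and the `guard` example are hypothetical and do not reflect any studio’s actual code:

```python
# Minimal behaviour-tree sketch. Nodes return SUCCESS or FAILURE when
# "ticked". A Selector tries children in order until one succeeds; a
# Sequence runs children in order until one fails.
SUCCESS, FAILURE = "success", "failure"

class Selector:
    def __init__(self, *children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            if child.tick(state) == SUCCESS:
                return SUCCESS
        return FAILURE

class Sequence:
    def __init__(self, *children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            if child.tick(state) == FAILURE:
                return FAILURE
        return SUCCESS

class Condition:
    def __init__(self, key):
        self.key = key
    def tick(self, state):
        # Succeeds only if the named fact is true in the game state.
        return SUCCESS if state.get(self.key) else FAILURE

class Action:
    def __init__(self, name):
        self.name = name
    def tick(self, state):
        state["last_action"] = self.name  # record what the character did
        return SUCCESS

# A hypothetical NPC guard: flee when hurt, attack a visible enemy,
# otherwise patrol. Every branch must be anticipated and authored.
guard = Selector(
    Sequence(Condition("low_health"), Action("flee")),
    Sequence(Condition("enemy_visible"), Action("attack")),
    Action("patrol"),
)

state = {"enemy_visible": True}
guard.tick(state)
print(state["last_action"])  # -> attack
```

The flowchart Mr. Tanner describes is exactly this structure scaled up: any situation the authors did not enumerate falls through to a default branch, which is why hand-built trees can feel limited.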
The technology could also allow developers to build AI companions within games. Dr. Kearney showed The Globe and Mail a brief demo in Minecraft, where the company’s AI technology was powering another character named Aaron. Dr. Kearney could chat with Aaron and complete tasks together, none of which were preprogrammed, she said. Aaron fetched bread when Dr. Kearney said she was hungry and tossed her a hunk of meat when she said she was gluten-free. When asked to watch her back, Aaron dutifully slew a zombie.
Dr. Kearney could also talk with Aaron outside of the game through Discord, an online chat platform. “We see our behaviour engine being able to power characters that exist not just in one game for a short period of time, but in a long-term way,” she said.
The use of generative AI within video games is still in the early stages. Julian Togelius, an associate professor at New York University who studies AI and gaming, has played a number of gaming demos with generative AI, not all of which were impressive. “If you want to do NPCs really well, you need a lot of authoring,” he said. Still, a recent preview he saw from Inworld and Ubisoft was an improvement over past demos.
Not every game studio is going to be enthusiastic about generative AI, owing to concerns about the impact on jobs and the creative process. “If you go into a typical medium or large game studio, you would probably encounter a low-intensity culture war happening,” Prof. Togelius said.
Scripting dialogue and behaviour might consume time and resources, but it can result in a cohesive product that reflects the developer’s intent. “Even if we did have the most advanced AI that you could direct with a prompt, it wouldn’t be as meaningful because the author wouldn’t be as deeply present,” said Tanya X. Short, co-founder of Kitfox Games in Montreal.
But the technology could create entirely new categories of games, said Daniel Mulet, a principal with Radical Ventures and a board member of Artificial Agency. He’s just not sure what those games might look like. “The biggest developments in any technology category happen once it’s been in the wild,” he said. “In the hands of developers, something interesting will come out.”
Hallucinations are another potential issue. In one of the demos that Prof. Togelius tried, he was able to convince a character that a horse was in the scene, even though the scenario was equine-free. Likewise, in one of Dr. Kearney’s Minecraft tests, the AI-powered character insisted it was chopping wood, even though it did not have the ability to do so.
Mr. Tanner is not overly concerned about AI-powered NPCs running wild. “We give the developers so much control over exactly how these runtime decisions are made and how they’re validated that they have a lot of power to ensure these sorts of things don’t happen in their games,” he said.