
The Friendly Ghost in the Machine

Spirit AI's new take on intelligent NPCs and online safeguarding

As an industry, we're fairly blasé about the term artificial intelligence. Many of us deal with it on a daily basis, whether that's because we're building it or cursing it as it crushes us in yet another game of Civilization, but either way, AI gets thrown around pretty easily in gaming, and it often doesn't mean what it means elsewhere. Often, game AI is a collection of rules of thumb and loose heuristics, simultaneously cheating to emulate human behaviour and hamstringing itself to negate some of its natural advantages. It's a far cry from whatever keeps Elon Musk awake at night.

Spirit AI wants to create something different.

The London company is headed up by Steve Andre, who worked at IBM alongside the team developing Watson - the AI which won Jeopardy. A life-long avid gamer, Andre decided to look at the developments in advanced cognitive AI and bring them to games. To that end, he assembled a team which includes several AI specialists and engineers; award-winning writer Emily Short; game designer and assistant NYU professor Mitu Khandaker-Kokoris; and COO/CFO Jon Green, who Andre had worked with at both IBM and Fujitsu. Together, they're putting together a raft of AI projects based on emergent technology - all aimed at making game development easier and communities more welcoming.

The first two products on the cards for the firm are Character Engine and Ally, two distinct applications of a proprietary technology that share many elements. Broadly, Character Engine is an AI system designed to power NPCs, allowing developers to create characters which respond believably to both typed and spoken commands and questioning, whilst Ally is intended to patrol online chat interactions, helping to police them for abuse and inappropriate behaviour.

"We're making games more accessible by making them more about people and interactions with people"

"The reason I'm so excited about what we're doing is that I see both of our bits of tech as making games accessible in different ways," says chief creative officer Khandaker-Kokoris. "We're making games more accessible by making them more about people and interactions with people, and when you do that, they become more interesting to a much wider audience because everyone has an idea about what social interactions are and what the model for them is. So we're playing on that instead of other aspects.

"We've been in a phase of making sure that the product is right for the market, that the tools fit the paradigm for the way in which AAA writers want to write. We want this to fit into the workflow that people already have so they can create more dynamic exciting games. We're working with Bossa, they're our first major UK beta partner. They're great - Imre (Jele, Bossa CCO) is really willing to be experimental, they want to do interesting stuff in this space.

"Imre actually coined the phrase 'the walking simulator of conversations' because you're able to head off in almost any direction, but you're being pulled intelligently towards a conclusion by someone with their own agenda. Essentially the Character Engine demo I'll show you is the proof of concept for that: 'what does it feel like when you're interrogating a robot in a room on a one-to-one deep interaction'."

"In the demo you'll see a robot lady who has built into her an emotional response engine, a configurable personality engine," adds Jon Green as we turn to face a screen for a live demonstration. "She understands social context, she has knowledge which is structured and unstructured, and she generates her own memories, so as you play though a game she is effectively creating history of her own run time behaviour. If you burn down the shop next door, and NPCs know you did it, your consequences have long term actions."

"She understands social context, she has knowledge which is structured and unstructured, and she generates her own memories, so as you play though a game she is effectively creating history of her own run time behaviour"

The demonstration begins with a seated female android, who can be addressed via text input for now but will eventually recognise voice commands, too. Her register uses natural vocabulary, and I'm assured that the robotic cadence of her voice is something which isn't permanent. Spirit hopes to have a minimum viable product of the system ready for GDC next week, and will be showing it off to potential clients at the San Francisco show.

It's November when we speak, and for an early demo of such a complex piece of tech, it's impressive. Gradual probing of the robot's whereabouts at the time of a murder reveals subtleties of personality and an ability to contextualise conversation from existing knowledge and previous parts of the conversation. She can lie and be lied to, can be caught out in those lies and challenge inconsistencies in the logic of her interrogator. As well as exhibiting emotional responses to perceived aggression or weakness in conversation, she'll also respond to physical intimidation, something which I learn will be more important in forthcoming VR support. For now, the conversational technology is impressive enough on a traditional display.

"There are no script trees at all, it's a free-form salience program which selects the right solution from a list of data, says Green. "Part of the system is that it goes through a number of classifiers which define topic, emotion, social state, then it finds a confidence level and decides whether it knows it's a definite answer, in which case it will go to a specific solution, or whether it needs to create an answer from its own data."

However, even though there aren't pre-written answers in the more traditional branching sense, this isn't purely procedurally generated speech - Khandaker-Kokoris and Green are keen to emphasise that creative writing will still be an important part of development with the system, and that it will not replace writers.

"One of the things we've worked hardest on, is, because this isn't a script tree with nodes, we worked very hard on how to show the writer what the consequences are of the data you're putting in and what comes out," Green explains. "Because it's not linear you have to show it in terms of options and potential outcomes in playthroughs. We're going to have to end up with some really innovative visualisations of storylines. Not in terms of linear story beats, but in terms of what the data sets you've fleshed out are and what's missing, so you avoid all false positives.

"Sometimes when you sit someone down in front of this and tell them to ask a question, they freeze. They don't know what to say - it's not an ABC choice. We're getting to the stage where we're going to have to re-educate gamers on how to play"

"Sometimes when you sit someone down in front of this and tell them to ask a question, they freeze. They don't know what to say - it's not an ABC choice. We're getting to the stage where we're going to have to re-educate gamers on how to play."

The AI will also help to push players in the right direction if they've taken a long break from a game and forgotten what they're doing, or if they've simply forgotten the particular details of a quest. Because these characters have an agenda and a bank of information to draw upon, they can give nudges in the right direction, answer questions which re-establish the facts, and offer exposition. Steve Andre says the team has also been working on using the tech in another capacity: as an online tutorial assistant. With an in-depth knowledge of a game's systems and an accurate idea of player progression, the tech should be able to offer contextually accurate advice to players faced with complex systems.

"Quite a number of clients come back to the problem of losing early starters quite a lot," says Andre. "People come in, bounce off the difficulty and never come back. Hopefully we can help to address that."

The other half of Spirit's early product line-up, Ally, is arguably even more ambitious. Again, the focus for the online chat guardian is contextual sensitivity, something which Khandaker-Kokoris says the tech can build confidence in by asking game-, situation- or player-specific questions.

"Essentially the big paradigm around it is detecting instances of harassment and problematic situations, she explains. "We have that happening across three different levels of triage. What those look like is very different depending on the type of game. On the player side of the experience, when we do detect one of these instances, we ask you if you're OK.

"I don't know why that hasn't been the default paradigm for some time, just asking people if they're OK when we think they might not be. The reason we wanted to approach it that way is because there's so much nuance when you're chatting online. If you're talking to friends, you might be fine using all sorts of language that you're not comfortable using with strangers."

In this way, Ally can appreciate that some robust back-and-forth between friends might not carry the same sort of impact as the same language or behaviour might coming from strangers. Because Ally can appreciate non-verbal input such as movement or inappropriate proximity in the same way Character Engine can, this also extends to behaviour like in-game stalking, lurking or, in the case of VR, 'touching'. However, it's also key to the project that it recognises that some people are fine with heavy language from strangers too, which is why it will process the potential threat in context, then check with the player to see if they require assistance before taking action.

"It puts the player at the centre," says Green. "Everything else on the marketplace is doing this with a list of keywords, which is a very blunt instrument, there's no context. We let the player define what makes sense for them - either over a long period or in a specific game session. That can be specific words or people - the system understands the context of all of that. As it goes through the three levels of triage it assigns a weighted average based on the abuse signature, what the player thinks about it and whether the abuser has done it before, that can trigger either a system intervention, which is ally-bot, or, if it's more extreme, a human intervention."

"It's up to the community managers to decide what happens next," adds Khandaker-Kokoris. "It might be Allybot asking that player if they're OK with it, or offering to filter that language. From our side, there's no default response, we just do the detection, it's all very configurable. I don't know of any other system which takes such a player centric approach, and that's ultimately really what matters. We care about protecting players."