Using AI to take the "emotional work" out of community management

Spirit AI's Dr Mitu Khandaker: "We are trying to make the lives of community managers and moderators easier"

Managing player behaviour online is one of the biggest challenges facing developers and publishers at the moment.

Earlier this year, we saw the logical conclusion of unchecked toxic behaviour when a disagreement in Call of Duty boiled over into the real world, leading to a swatting incident that left a 28-year-old father of two shot dead in his home.

While this is perhaps the most extreme example on record, it is indicative of the myriad problems facing online communities, and illustrates the desperate need for effective community management.

At this year's GDC, more than 30 companies -- including Xbox, Twitch, Blizzard and Riot -- announced the Fair Play Alliance, and pledged to work towards making safer, fairer, and healthier communities.

Artificial intelligence firm Spirit AI was among the companies looking to reshape online communities, and has long been developing the tools to make it possible.

Dr Mitu Khandaker

GamesIndustry.biz caught up with Spirit AI chief creative officer Dr Mitu Khandaker at Casual Connect in London last week to discuss how artificial intelligence can change the way we manage online communities.

"Going into broader AI philosophy questions, there's a lot of conversation about AI taking away people's jobs and things like that," says Khandaker. "But I think what the more interesting thing -- wherever you fall on that conversation -- that AI should do and can do, is take away the emotional work that people have to do in shitty jobs."

Enter Ally, the artificial intelligence designed to do just that. In essence, Ally can automate the complaints process in online games and communities, investigating abuse incidents and learning to understand the nuanced interactions of its members.

"Part of the goal of Ally is to reduce the pain points of two types of users," says Khandaker. "Firstly the player, because obviously we want to help create safer communities where people don't feel like they are going to be harrassed.

"But also we are trying to make the lives of community managers and moderators easier because often they have a really horrible job where they have to look at these awful logs and reports and delve into them and try to figure out what's going. Instead of that, the system automates a lot of things that are quite obvious and shows them on the dashboard."
