New community management AI from FaceIt bans 20,000 players in trial run

Esports tournament platform reduces number of toxic messages in Counter-Strike games by 20%

Esports tournament platform FaceIt has revealed its new community management AI, which has already banned 20,000 players for toxicity.

Trained through machine learning to address toxicity at scale, Minerva has overseen a 20% reduction in toxic messages in Counter-Strike: Global Offensive matches.

Developed with the help of Google Cloud and Jigsaw, the programme has analysed over 200 million chat messages over the last few months, and marked seven million as toxic.

Minerva went fully automated in August, after months of training to minimise false positives. FaceIt reports that Minerva can reach a decision within seconds of a match ending, and issue warnings or bans accordingly.

During its first six weeks of activity, Minerva issued 90,000 warnings and 20,000 bans for verbal abuse and spam. Minerva also drove down the number of unique players sending toxic messages by around 8%.

"In-game chat detection is only the first and most simplistic of the applications of Minerva and more of a case study that serves as a first step toward our vision for this AI," said FaceIt in a blog post.

"We're really excited about this foundation as it represents a strong base that will allow us to improve Minerva until we finally detect and address all kinds of abusive behaviors in real-time."
