
New community management AI from FaceIt bans 20,000 players in trial run

Esports tournament platform reduces the number of toxic messages in Counter-Strike games by 20%

Esports tournament platform FaceIt has revealed its new community management AI, which has already banned 20,000 players for toxicity.

Trained through machine learning to address toxicity at scale, Minerva has overseen a 20% reduction in toxic messages in Counter-Strike: Global Offensive matches.

Developed with the help of Google Cloud and Jigsaw, the program has analysed more than 200 million chat messages in recent months, marking seven million as toxic.

Minerva went fully automated in August, after months of training to minimise false positives. FaceIt reports that Minerva can reach a decision just seconds after a match has ended, issuing warnings or bans accordingly.
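FaceIt has not published Minerva's internals, but Jigsaw's publicly available Perspective API gives a sense of the kind of toxicity scoring involved. The sketch below shows how such a post-match pass might look; the 0.9 score threshold, the warn-then-ban rule, and the review_match helper are illustrative assumptions for this article, not FaceIt's actual logic.

```python
import requests

PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
API_KEY = "YOUR_API_KEY"  # Perspective API key from Jigsaw

def toxicity_score(message: str) -> float:
    """Return a 0..1 toxicity score for one chat message via Perspective."""
    body = {
        "comment": {"text": message},
        "requestedAttributes": {"TOXICITY": {}},
        "doNotStore": True,
    }
    resp = requests.post(PERSPECTIVE_URL, params={"key": API_KEY}, json=body)
    resp.raise_for_status()
    return resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def review_match(chat_log: dict[str, list[str]], threshold: float = 0.9) -> dict[str, str]:
    """After a match ends, decide warnings or bans from that match's chat.

    chat_log maps player IDs to the messages they sent during the match.
    The threshold and the warn/ban cutoffs are assumptions, not FaceIt's rules.
    """
    verdicts = {}
    for player, messages in chat_log.items():
        toxic = sum(1 for m in messages if toxicity_score(m) >= threshold)
        if toxic >= 3:
            verdicts[player] = "ban"
        elif toxic >= 1:
            verdicts[player] = "warning"
    return verdicts
```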

During its first six weeks of activity, Minerva issued 90,000 warnings and 20,000 bans for verbal abuse and spam. It also drove down the number of unique players sending toxic messages by around 8%.

"In-game chat detection is only the first and most simplistic of the applications of Minerva and more of a case study that serves as a first step toward our vision for this AI," said FaceIt in a blog post.

"We're really excited about this foundation as it represents a strong base that will allow us to improve Minerva until we finally detect and address all kinds of abusive behaviors in real-time."
