New community management AI from FaceIt bans 20,000 players in trial run

Esports tournament platform reduces number of toxic messages in Counter-Strike games by 20%

Esports tournament platform FaceIt has revealed its new community management AI, which has already banned 20,000 players for toxicity.

Trained through machine learning to address toxicity at scale, Minerva has overseen a 20% reduction in toxic messages in Counter-Strike: Global Offensive matches.

Developed with the help of Google Cloud and Jigsaw, the programme has analysed over 200 million chat messages over the last few months, and marked seven million as toxic.

Minerva went fully automated in August, after months of training to minimise false positives. FaceIt reports that Minerva can reach a decision just a few seconds after a match has ended, and issue warnings or bans accordingly.

During its first six weeks of activity, Minerva issued 90,000 warnings and 20,000 bans for verbal abuse and spam. Minerva also drove down the number of unique players sending toxic messages by around 8%.

"In-game chat detection is only the first and most simplistic of the applications of Minerva and more of a case study that serves as a first step toward our vision for this AI," said FaceIt in a blog post.

"We're really excited about this foundation as it represents a strong base that will allow us to improve Minerva until we finally detect and address all kinds of abusive behaviors in real-time."

