Xbox enforcements to protect players rose to 10.2m in H2 2022

80% of all actions taken were based on proactive moderation, before players reported the incidents

Microsoft has released its second Xbox Transparency Report, sharing more information on how it is tackling inappropriate content and conduct on its gaming platform.

For the six months ended December 31, 2022, Xbox reported 10.19 million enforcements in total, ranging from account suspensions to content removal.

This is up 40% from the 7.31 million actions taken during the first half of 2022.

There were 27.48 million player reports made during H2 2022: 12.97 million (47%) regarding communications between users, 11.40 million (41%) regarding conduct, and 3.11 million (11%) concerning user-generated content.

However, only 2.11 million of these enforcements were 'reactive,' i.e. based on player reports. The remaining 8.08 million (80% of the total) were 'proactive,' driven by automated and AI-powered content moderation tools.

In fact, Microsoft reports its tools assessed over 20 billion human interactions during the six-month period.

Just shy of three quarters of all enforcements – 7.51 million, to be precise – tackled the ongoing problem of inauthentic accounts, such as bot-created or automated accounts that "create an unlevel playing field," according to Xbox.

Xbox also reports a 450% increase in the number of enforcements against vulgar content compared to the first half of 2022, and a 390% increase in 'content-only' enforcements.

There were also 549 reports to the National Center for Missing & Exploited Children, and 1,361 referrals to Crisis Text Line.

A total of 229,870 players appealed their account suspensions, but only 6% of those appeals were successful.