EA made a significant announcement last week that had nothing to do with FIFA, Star Wars, Dragon Age or Battlefront. Or rather, it had something to do with all of them.
Prior to its digital EA Play event, the firm revealed the 'Positive Play Charter', a set of rules over what is and isn't acceptable within its games. EA has laid out community guidelines in the past, but this charter is different. It doesn't read like a corporate list of legal rules; its language is more direct and accessible -- "real talk" is how EA describes it. It sets out what's expected of its users and details the consequences of any indiscretion.
Online toxicity is not exclusive to video games, but it is certainly prevalent -- EA says that 58% of its players have experienced toxic behaviours in the last year. Publishers have been criticised for a seeming reluctance to deal with such issues, so what caused EA to make such a public move?
"We have been focused on this for the last couple of years, and that's because it's become very clear that multiplayer games, big online communities, are now central to the majority of gaming experiences," says EA's chief marketing officer Chris Bruzzo. "This is no longer an ancillary area. Which means, okay, we can no longer be a bystander.
"Now more than ever, toxic behaviour is unacceptable, racism unacceptable, sexism, homophobia, hate speech, it's unacceptable"
"Last year we did an event called the Building Healthy Communities Summit. We brought together lots of perspectives, with streamers, YouTubers, influencers, as well as community managers, hardcore players and casual players. It was an all-day event with lots of discussion and some realisations. We made a series of commitments that came out of that.
"We launched new tools for players to report abuse, we improved escalation policies for dealing with harmful behaviour, we trained all of the EA employees who are involved in direct community management around toxicity. We started to enforce much stricter community guidelines. We actually stepped forward to start to protect moderator and managers and streamers and others from toxic behaviour. That was all stuff we did starting last year.
"But this year, we asked: What form does this take going forward? And it became a larger concept, which is this idea that you read in the charter, which is the idea of positive play. We are being clear about what that means and we are going to hold players and everyone else accountable to that.
"When you add in what has happened around COVID-19, which has heightened the importance of communities engaging digitally, because there is a lot of that going on. And then add the issues that we are experiencing where racial injustice is front and centre -- which you could say we're experiencing in the US, but actually if you look at the protests, activism and conversation, it's global. We started to think that, gosh, now more than ever, toxic behaviour is unacceptable, racism unacceptable, sexism, homophobia, hate speech, it's unacceptable. And it's a time for determining what action we will take, and that's what you see us doing here."
Part of the change comes from the realisation that publishers can no longer rely on the platform holders to do the policing.
"It used to be, if you were a developer, you could say Microsoft and Sony take care of that for us. Because that's where account creation is, that's where party systems are, that's where social systems are... But that's no longer the case. Now everybody has to play a role. We have social communities operating literally inside a single game, which is not leveraging Microsoft or Sony technology. It's actually leveraging the social features that we have built.
"The other interesting dimension is cross-platform play. Realise now that it's up to the game makers to do things like create matches between PlayStation, Xbox and PC players... We are creating those connections that don't sit on any one platform. That increases the responsibility we have for ensuring that the social interactions are positive.
"That's why one of the big moves Respawn made was the Ping system [a means to communicate with teammates without having to use voice chat] -- so much of the player-to-player communication created for Apex Legends was designed to eliminate opportunities for toxicity. There's a ton of really cool player engagement that can occur inside Apex using the Ping system and the conversation wheel, and everything else, that makes the game fun and feel very social, with lots of interactions, but no toxicity.
"We do have to partner with people and nobody can be a bystander. Developers have to have great tools and controls, so do the platforms, and so do the systems that exist around gaming. I'm talking about things like Discord and Twitch, which are also really important areas where we have to work together so that we are creating an environment that is focused on play."
"Cross-play increases the responsibility we have for ensuring that social interactions are positive"
It's the "focused on play" element that Bruzzo keeps coming back to. While social platforms like Facebook are used for many different purposes, EA's communities exist purely to play games. That narrower purpose allows EA to exert tighter control over what is and isn't allowed.
"In a study that we did, 58% of players say they experienced some form of toxic or disruptive behaviour in the last year. And it is one of the primary reasons that they choose not to play. So we don't really need much more evidence than that to say that toxicity is ruining play. We want everyone to play, so it's time for us to stand up."
He adds: "Apex Legends has excellent retention. We think that is directly correlated with all kinds of things that are great about that game, including how positive the community is based on some really strong feature decisions that Respawn made. We stress out about helping players overcome obstacles in their ability to progress in the game, and about onboarding players so that they don't churn out because they didn't understand how a game operates. We put a lot of effort into that because we want players to have fun. Then we have this other insidious factor that shows up, which is actually one of the more likely reasons for leaving the game.
"It's natural for us. We want players to play for as long as they want, so all that effort we put into making the gameplay accessible and understandable, we should put as much effort into the community."
EA says it's committed to dedicating more human resources to the issue, and its customer service organisation now has responsibility for monitoring and managing accounts and behaviours. The team has already reviewed thousands of account names, tags and pieces of content, removing 3,500 of them, including Nazi symbols, homophobic and racist slurs, and more.
"It does take a lot of effort and people, but it's not going to be possible unless we leverage software, too," Bruzzo explains. "The good news is that we are a software organisation. We write software and we have really talented technologists here who are hard at work constantly improving our tools for monitoring and eliminating toxicity from our systems."
Much of what EA is trying to stamp out are issues that you can find throughout society -- racism, sexism, homophobia and all forms of hate speech. But there is also a form of toxic behaviour that is unique to video games: cheating.
"Cheating leads to higher animosity, higher animosity leads to higher toxicity"
"Cheating also ruins play and is not healthy for a community," Bruzzo says. "Using software or hacks or exploits that give a player an unfair advantage, this is a major problem, and players report it as significantly ruining their play experience and causing them to not want to play anymore.
"But here is the other thing it does -- it makes players angry. This is not passive entertainment... I am investing many, many hours into getting better and better at this game. One thing that makes gaming unique is that players play because they want to feel progress, and when it feels like I can't progress because someone is cheating or because things are not fair, I am even more emotional because I have invested a lot to get to this point. And so when unfair play leads to higher animosity, higher animosity leads to higher toxicity in the community. We see a direct line between those things. We have to address that."
EA says that it is being clear about what is expected of players, and it fully intends to hold people to account and then publish the results of doing so. However, there are still some vague areas in the guidelines. For instance, the firm says: "We're not here to drop the ban hammer, unless we absolutely need to." So what does warrant a ban?
"It's repeated, it's intentional... 'Absolutely need to' is where you've been warned but you continue with the behaviour. Or when the behaviour is so clearly egregious... Hate speech, unacceptable, or if it's on such a level of harm that you don't get multiple strikes.
"Things we have tolerated in the past, we will not tolerate anymore, which includes mistreatment of our community managers"
"Those are the things that can get you banned. Most often it will be repetitive toxic behaviour that does it. Often the best solution is to warn, which can lead to modifying the behaviour and everybody moves on and we don't have to ban a player. That's the best cycle, and it's actually the most common cycle."
Yet there is zero tolerance for hate speech, including racism and sexism. "We're not talking about making mild inappropriate commentary about the other gender," Bruzzo clarifies. "We're talking about sexual harassment. That is unacceptable."
He continues: "There were many years where the conventional opinion was 'that's just gamers being gamers.' And you know what? That era is over. Things that we have tolerated in the past, we will not tolerate anymore, which includes mistreatment of our employees and our community managers. But also, it includes when we work with expert players and influencers and content creators -- they have to sign an agreement over the guidelines they must abide by."
It isn't just about punishing bad behaviour, Bruzzo says, but also rewarding those who behave well by giving them more access to the teams that make the games. But beyond even that, EA says its "positive play" commitment is about more than stronger policing of communities. The mission to stamp out toxicity in online gaming spans the whole business, from development to marketing, community management to PR.
"A part of positive play is including everyone, and including everyone is not only about how we manage the communities, but also what types of stories we tell," Bruzzo tells us. "It becomes a lens for everything. It becomes a lens for our marketing and for game development and for telling more diverse and inclusive stories.
"Electronic Arts' strong belief in inclusion has been showing up for years. The kind of openness we have in The Sims around gender and sexual orientation and player choice, or the relationship openness in BioWare games, or the inclusion of women's national teams in FIFA. Historically, these are all strong investments that EA has made, but we have lots more that we can do. When we think about positive play for everyone, we can not only create cool systems, like the Ping system, but we can incorporate more characters from diverse backgrounds and tell more interesting stories."
He concludes: "We want everyone to feel like they can play. We want there to be fair play, we want there to be healthy play and we want there to be healthy communities. And now is the right time to use very clear language that we are standing up for positive play."