
Community management post-Christchurch massacre | Opinion

Veteran product manager James Kozanecki says the industry can no longer dismiss toxic behaviour as harmless trolling

Late last week, an Australian troll-turned-terrorist massacred 50 people (the confirmed toll at time of publication) as they prayed inside a mosque. His crimes were broadcast live on Facebook, and he gave warning to fellow internet commenters on 8chan, a notorious haven for hatred and abuse (shoutout to Mark). In the alleged terrorist's manifesto he discusses a variety of political topics, but it's his 'shitposting' and regular dropping of memes that stand out the most. Clearly, internet culture has played a role in shaping his beliefs.

The days of being able to ignorantly claim, 'Oh, it's just a meme, it's harmless,' are over.

I've been working in online community management for more than 12 years, beginning my career at a time when brands only considered communities that existed on their own directly hosted channels. As technology and communities have evolved, so has the way we communicate with and prioritise our audiences. In the eyes of many businesses, community management is now a vital area: it allows fans to build rapport with a brand, in the hope they will transact with it again in the future.

"We decide how to deal with this problem before it spreads through the community and into the real world"

However, as we measure our KPIs and return on investment, we as community managers must not forget the role we play in building spaces online for people to congregate and interact. While we can't control everything our communities say and do, we can take responsibility for properly enforcing our rules and responding appropriately to extremist content.

We set the tone; we decide what is and isn't acceptable; we decide how to deal with this problem before it spreads through the community and into the real world.

The games industry has had more than its fair share of community backlash thanks to GamerGate and other organised harassment campaigns, yet to this day we've not seen the industry take a meaningful stance against them. A few companies have, but they're mostly indies (no disrespect to them, but they lack the audience and clout to make an impact). The big publishers have failed to take a hard stance against toxic communities, and most of the big media outlets have sat back and let the harassment go unchallenged. This must change.

Setting stricter rules and actively enforcing them, without fear of how it will impact the bottom line, will probably cost organisations players and readers, but the social responsibility we have as custodians of online communities demands it. I've seen numerous brands play down or ignore the anti-social behaviour of key community members because of their financial or social importance to the business. "Oh, it's just a racial slur or a homophobic remark," they say, attempting yet again to reform a toxic player (and keep them as a customer) rather than send them away.

"As community managers we have been delivered a textbook of toxic behaviour to be aware of"

There are a lot of lessons to take away from last week's tragedy, but as community managers we have been delivered a textbook of toxic behaviour to be aware of. I'm not suggesting we report every person who posts a copypasta meme on our channels, but we should be vigilant and use our understanding of online culture to assess such postings. Nor should we be afraid to take permanent steps to remove inflammatory members from our communities.

The internet has done a wonderful job of knocking down borders, but it has also greatly diminished the personal responsibility we would normally take in a real-life scenario. This won't change overnight, and without a doubt it's an uphill battle, but that's no excuse to sit back and do nothing. Inaction has led us to where we are now.

Please don't take this as saying the games industry is solely responsible for this problem; it is not. Allow me to make this very clear: anyone who owns or runs an online community -- not just in gaming -- has a role to play in identifying and countering negative behaviour online.

You might think I'm taking the high and mighty road here, and perhaps you're right. What I've written above is easy to preach and difficult to practise, but let me leave you with these parting thoughts: last week's shooter most likely used our platforms. He probably played our games and interacted with our players. How do you feel knowing that your channels gave him an avenue to spread his messages?

James Kozanecki is the product manager for World of Tanks in Southeast Asia, Australia, and New Zealand. He has over 12 years of experience in strategic communications and community management.
