
Community management post-Christchurch massacre | Opinion

Veteran product manager James Kozanecki says the industry can no longer dismiss toxic behavior as harmless trolling

Late last week an Australian troll-turned-terrorist massacred 50 people (at time of publication) as they prayed inside a mosque. His crimes were broadcast live on Facebook, and he gave warning to fellow internet commenters on 8chan, a notorious haven for hatred and abuse (shoutout to Mark). The alleged terrorist's manifesto covers a variety of political topics, but it's his 'shitposting' and regular dropping of memes that stand out the most. Clearly, internet culture played a role in shaping his beliefs.

The days of being able to ignorantly claim, 'Oh, it's just a meme, it's harmless' are over.

I've been working in online community management for more than 12 years, beginning my career at a time when brands only considered the communities that existed on their own directly hosted channels. As technology and communities have evolved, so has the way we communicate and prioritise our audiences. In the eyes of many businesses, community management is now a vital area: it allows fans to build rapport with a brand, in the hope that they will transact again in the future.

"We decide how to deal with this problem before it spreads through the community and into the real world"

However, as we measure our KPIs and return on investment, we as community managers must not forget the role we play in building spaces online for people to congregate and interact. While we can't control everything our communities say and do, we can take responsibility for properly enforcing our standards and responding appropriately to extremist content.

We set the tone; we decide what is and isn't acceptable; we decide how to deal with this problem before it spreads through the community and into the real world.

The games industry has had more than its fair share of community backlash thanks to GamerGate and other organised harassment campaigns. To this day we've yet to see the industry take a meaningful stance against these campaigns. A few companies have, but they're mostly indie (no disrespect to them, but they lack the audience and clout to make an impact). The big publishers have failed to take a hard stance against toxic communities, and most of the big media outlets have also sat back and let the harassment go unchallenged. This must change.

Setting stricter rules and actively enforcing them without fear of how it will impact the bottom line will probably cost organisations players and readers, but the social responsibility we carry as custodians of online communities demands it. I've seen numerous brands play down or ignore anti-social behaviour from key community members due to their financial or social importance to the business. "Oh, it's just a racial slur or homophobic remark," they say as they attempt to reform a toxic player (and keep them as a customer) yet again, rather than send them away.

"As community managers we have been delivered a textbook of toxic behaviour to be aware of"

There are many lessons to take away from last week's tragedy, but as community managers we have been delivered a textbook of toxic behaviour to be aware of. I'm not suggesting we report every person who posts a copypasta meme onto our channels. But we should be vigilant, and use our understanding of online culture to assess such postings. Further, we shouldn't be afraid of taking permanent steps to remove inflammatory members from our communities.

The internet has done a wonderful job of knocking down borders, but it has also greatly diminished the personal responsibility we would normally take in a real life scenario. This won't change overnight, and without a doubt it's an uphill battle. Though that's not an excuse to sit back and do nothing. Inaction has led us to where we are now.

Please don't take me as saying this is a problem the games industry is solely responsible for; we are not. Allow me to make this very clear: anyone who owns or runs an online community -- not just in gaming -- has a role to play when it comes to identifying and countering negative behavior online.

You might think that I am taking the high and mighty road here, and perhaps you are right. What I've written above is easy to preach and difficult to practise, but let me leave you with these passing thoughts. Last week's shooter most likely interacted with our platforms. He probably played our games and interacted with our players. How do you feel knowing that your channels gave him an avenue to spread his messages?

James Kozanecki is the product manager for World of Tanks in South East Asia & Australia and New Zealand. He has over 12 years of experience working in strategic communications and community management.


Latest comments (8)

Ron Dippold, Software/Firmware Engineer — A month ago
This is a wake-up call for a lot of people. The killer yelled 'Remember lads, subscribe to PewDiePie!' just before he opened fire. Felix claims to be 'sickened' about it, and maybe he is (this could hurt his ads), but he's been race-baiting, giving shout-outs to white nationalists, and using racial slurs for years and writing it off as harmless jokes. People have responded, also for years, that white nationalism isn't a harmless joke and here we are. That the killer gave him a shout-out was the least surprising thing about the shooting.
Marianne Monaghan, Lead Cross-site Producer, Hangar 13 Games — A month ago
Edit: The post I responded to has been deleted.
Let's not get into a load of "whataboutism" to deflect from the point of this article: that community managers and the game companies that employ them have a challenge in dealing with toxic posters, and the fact that this shooter sounds like one of our own raises a disturbing question of whether and when shit posters are dangerous in the real world. How to respond in our communities is a question worthy of well-intentioned debate and I know that there are several valid perspectives. Hysterical screeds from anonymous posters are less helpful.

Edited 1 time. Last edit by Marianne Monaghan on 18th March 2019, 7:23pm

Brendan Sinclair, North American Editor, GamesIndustry.biz — A month ago
@Marianne Monaghan: Like James said, we're all responsible for our own communities, and we do our best to keep this one healthy. Sorry it might have made your comment a bit confusing to others.
Bob Johnson, Studying graphic design, Northern Arizona University — A month ago
There's always been nutjobs out there. Crazy folks. Nothing has changed.

The only difference is today's nutjobs grew up with youtube and facebook.
Bob Johnson, Studying graphic design, Northern Arizona University — A month ago
The solution to trolling on forums is to just eliminate the forums and comment sections. Make people leave feedback via snail mail as a means of separating the wheat from the chaff, or keeping the riff-raff out. Or get innovative and charge people $0.50 USD to leave a comment instead of using a postage stamp.

Use Q&A with questions submitted ahead of time or open up a forum for discussion for a day and staff it with an on-duty full time moderator for the day.

Edited 1 time. Last edit by Bob Johnson on 19th March 2019, 5:24pm

RobertGoodfellow, Grade 9, Self-Employed — A month ago
James Kozanecki, looking at your name I assume you, like me, are Polish, or at least of Polish descent. Do you not remember what happened to your people in WW2? What happened AFTER WW2, when our people were occupied by the Soviet Union? My grandfather fled Poland with my mother as a baby before the Nazis came and killed all the Jews in my family. The communists got the rest of my relatives over the years. My grandfather helped smuggle people out of that country. As a child, me and the other young boys would listen to the stories of those who escaped, because my grandfather was afraid it would take generations to free Poland.

Your talk sounds just like what the Communists would say as an excuse to censor, control, and punish people who spoke out against "The Party". The best disinfectant is sunlight. People are smart. People are intelligent. People know right from wrong.

You assume people are naive and weak. That if a stray thought pops in the wrong head, people will turn to evil. Well, as someone who works with schizophrenics and the mentally ill, I can assure you, it doesn't work like that. NO ONE, who is THAT emotionally fragile, will be able to carry out this sort of complex attack. Trust me. The truly crazy people can't FUNCTION in society, and the shooter at Christchurch... unfortunately... was not insane. He was quite sane.

Not a sociopath either. That's the problem. I have met people like him. I have had great success in talking down people like him. But I need to know where they are. I need to know they exist. I can't FIX someone like the shooter if I don't know who they are.

If you censor people, then you create echo chambers where their bad ideas reflect back at them over and over and over until they become the Christchurch shooter. These people not only need to speak their ideas, we need to AMPLIFY them, so professionals, such as I, can hear them and do what we do best:

Fix them.

Do not let fear guide you. Remember your past and remember what was used to punish and enslave our people.

The Tools Of Evil Cannot Be Used For Good. Evil Always Begets Evil.
Klaus Preisinger, Freelance Writing — A month ago
RobertGoodfellow is absolutely right on all accounts.
/deletecomment & /banuser followed by /ignoranceisbliss is not a responsible choice.

The game industry never wanted to raise an entire generation of children, youth, and young adults with forums and social media. They never asked for it. But here we are. Look at the forums, and the problem is not a woman with a metal arm in WW2, the drop rate of the loot, the graphical bug, the audio glitch, or some balance issue. The problem is people and how they use a set of rules to trigger and eliminate each other. All of it governed by convenience-driven extremism, because interacting with people calmly takes time, costs too much, or may dispel an illusion necessary for optimised monetisation. Hence it is absurd to believe the least toxic community was the one which bans all its toxic users. If anything, that teaches people to be sociopaths: keep up pretense, goad others to step into your traps built close to the line, and then report them when they cross it in response, to get rid of them.

Next time weekly engagement rates of 10h and more are celebrated, stop for a moment and think. You are no longer just a video game company. You made a social connection that competes with parents and teachers in terms of direct weekly engagement. Rules do not solve half as much as reason will. Spread reason not rules.
Shane Sweeney, Academic — A month ago
Robert, while I agree with you in principle, let's bring it back to video game moderation. What can we do as an industry?

I doubt amplifying toxic behaviour in online games or gaming communities benefits the wider players or connects people with mental health workers.