Tackling online abuse in the games industry is "not optional"

2019 in Review: "Whatever the platforms are doing, we've got to behave as if the cavalry is not coming," says digital security expert

As we leave 2019 behind and consider what dark future 2020 has in store for us, it's a good time to reflect on some of the things which defined last year and will likely define this one too. It almost goes without saying as it's become so mainstream at this point, but the cloud of targeted online harassment still hangs over this industry like California wildfire smog, and the situation doesn't appear to be improving.

The most high-profile, and perhaps most extreme, incident of this year was when Ooblets developer Glumberland received "thousands, if not tens of thousands" of abusive messages following its announcement of an exclusivity deal with the Epic Games Store. This was joined by the targeted harassment of a gay journalist -- stoked by right-wing YouTuber Steven Crowder -- and the ESA leaking the personal information of over 2,000 games journalists, which sparked a wave of abuse, especially against women and members of the LGBTQ+ community.

Following months of controversy around the Epic Games Store in particular, Epic finally denounced the "disturbing trend which is growing and undermining healthy public discourse." But harassment of developers and journalists alike is nothing new. It's so accepted at this point that a constant level of abuse has become background noise to our daily lives. Someone, somewhere, is getting harassed every day because of video games. For anyone lucky enough not to be targeted directly, it's like low-level static that is simply endured. The threat of online harassment comes with the territory these days.

Anita Sarkeesian

Anita Sarkeesian is a media critic, anti-harassment advocate, and founder of non-profit educational organisation Feminist Frequency. She was one of the main targets of Gamergate, and has faced extreme levels of abuse over the years.

"The harassment that we saw explode in the games industry is now very mainstream," she tells us. "It's a part of internet culture in a very horrifying way. It's a part of our political structures and political discourse, so it's so much bigger than just the games industry, and it always was. This was never exclusive to geek communities or gaming communities. But now we're seeing just how influential these mobs can be, and how these strategies and tactics are used to perpetuate fascism and white supremacy and other sorts of hate ideologies."

While it's recognised that a culture of online harassment shouldn't be the status quo, enacting change is another problem entirely. The origins of this problem are complex, and the tools to combat it limited. It's an issue which is only exacerbated by the dispute over accountability; does responsibility lie with the police, government, platform holders, employers, or individuals?

Tall Poppy is an online safety consultancy which helps companies insulate their employees from harassment and hacking. Speaking with us, co-founder Leigh Honeywell says that harassment in the games industry isn't necessarily unique, but there is a virulence to it that appears "qualitatively different" from what she has seen in other fields.

As she explains, the core of abusive behaviour is entitlement; this goes some way toward explaining its prevalence within gaming and wider geek culture, where it ties in very closely with fandom. Whether it's Star Wars introducing minority and women characters, or Steam being sidelined in favour of the Epic Games Store, Honeywell describes it as "jilted entitlement."

Leigh Honeywell

"It's entitlement to other people's affection, or time, or energy, or labour or bodies," she says. "And I think we see that thread really strongly in online harassment situations, where there is those sort of ties to fandom."

For anyone working in the games industry, witnessing friends and colleagues become victims of targeted online abuse can be disheartening and even baffling. If you've ever spent an extended period of time on Twitter, you'll know how difficult it is to escape the churning waves of toxicity. "For a lot of harassers, there's sort of a refusal to acknowledge the humanity of the people they are targeting," says Honeywell. "It's fundamentally dehumanisation, and that's one of the things I find most concerning about a lot of the harassment situations."

Some months ago, we put a call out to anyone who had found themselves the victim of targeted harassment, in order to better understand their experiences.

Journalist Marijam Didžgalvytė, whose work often focuses on the intersection between video games and politics, told us how she was pulled into numerous abusive threads, as her Twitter was flooded with "vile comments." In some instances, her employer was even tagged into the threads, and it all culminated with threats of violence.

Didžgalvytė suggests harassment is driven largely by economic alienation and the "so-called crisis in masculinity."

"At the end of the day, I don't necessarily even want them punished per se," she said. "My political philosophy employs empathy in attempting to understand the motivations behind people's actions."

"The harassment that we saw explode in the games industry is now very mainstream. It's a part of internet culture in a very horrifying way"

Anita Sarkeesian, Feminist Frequency

Another respondent, who wished to remain anonymous, told us how they kept a bladed weapon in their home following the ESA data leak earlier this year. By their own admission, this person is not a prominent figure in the industry, but they quickly found themselves subjected to relentless homophobic abuse, spam, viruses, calls from unlisted numbers, and mysterious text messages filled with threats of physical violence and vandalism.

"I stopped reading them for my mental health's sake," they said. "I couldn't believe that such a small figure as me was receiving this kind of harassment."

The person said they wanted to see the ESA fined for what happened, with the money being donated to researching cyber security and helping the victims.

"I personally do not want reparations, I just want accountability for this, otherwise nothing will truly be done," they said. Unless action is taken, they added, people in the industry will continue to be endangered.

A study from the International Federation of Journalists found that 64% of female journalists had experienced online abuse. Of those, 47% did not report the incident, and 36% admitted self-censorship in the face of abuse.

Similarly, Amnesty International commissioned Ipsos MORI to conduct an online poll of women aged 18 to 55 in eight countries. The poll found that on average 23% of women had experienced online abuse or harassment. This was highest in the US, where abuse rates were 33%.

It's an issue that needs to be taken seriously. So often this harassment is a reaction to inclusion and representation in media, fuelled by prejudice, and directly attacking members of the games industry.

"We should be changing the entire culture of the industry where such abuse does not just come with the territory of actually working in games," Didžgalvytė said.

"We should be changing the entire culture of the industry where such abuse does not just come with the territory of actually working in games"

Marijam Didžgalvytė, journalist

"I broadly kept quiet about my abuse, as I didn't wish to be seen as a 'victim', 'troublemaker', unattractive to be hired, which is actually really messed up. I'm hoping the next generation of women in games won't have to go through the same experiences and lack of support."

The sort of online abuse people experience in this industry is oppressive and authoritarian. This is a creative sector; we only hurt ourselves when we fail to address the problem and people can see no path out other than self-censorship or catering to an angry mob's demands.

As Azmina Dhrodia wrote in The New Statesman, online abuse is a human rights issue: "Amnesty's poll shows that a large proportion of women in the countries polled believe social media platforms and governments alike are inadequately dealing with an issue that is both driving women off social media platforms and deterring women from speaking freely online.

"Without urgently addressing this serious human rights issue, we risk further silencing women during a time when their voices -- to some extent -- are finally being heard."

The UK government recently released its Online Harms White Paper, which identified the "serious psychological and emotional impact" of online harassment. Overwhelmingly, the paper suggests the responsibility lies with platform holders to safeguard the wellbeing of their users, and proposed a number of regulatory and voluntary initiatives.

The proposed regulatory framework would see companies take more responsibility, and would be overseen and enforced by an independent regulator. This would see companies having to fulfill a new legal duty revolving around transparency, delivering appropriate and timely responses, demonstrating they are fulfilling their duty of care, cooperating with law enforcement, and providing easy-to-access user complaint functions. Companies would also have to invest in the development of safety technology to reduce the burden on users, and the regulator would be able to levy "substantial fines."

"Victims are dependent on the police's appetite and when an investigation is launched the matter is largely out of the victim's control"

Amy Bradbury, Harbottle & Lewis

Additionally, it suggested the government develop an online media literacy strategy in collaboration with stakeholders such as digital and news organisations, educators, researchers, non-profits, and interest groups.

The white paper also noted that many companies claim a strong track record in online safety, but this is at odds with the responses to a previous government investigation into internet safety strategy, which found that only 20% of people felt their concerns were taken seriously by social media companies.

What makes this situation all the more dire is not just the current lack of preventative measures, but the inadequate attempts to stymie abuse and protect users who have found themselves a target.

Amy Bradbury is a senior associate in the media and information group of UK legal firm Harbottle & Lewis. She tells us that while civil action can be taken against online abusers, it can be expensive and time consuming. This is complicated further by anonymity and what country the abuser lives in. Yet another layer of complexity is added by where the platform holder is based, their slow and often resistant response to requests, and reluctance to release user information for fear of breaching data protection law.

"On the criminal side of the law, the police are faced with a rising tide of online reports," says Bradbury. "Victims are dependent on the police's appetite and when an investigation is launched the matter is largely out of the victim's control.

"This all highlights a need for the platforms to take a more responsible approach to their users and their wellbeing, and to put in place both proactive and effective reactive measures to guard against abuse and harassment."

But where does all of this leave us? Well, back where we started. The power to prosecute online abusers is functionally inadequate given the scale of online harassment, and even though there are some systems in place, there are multiple roadblocks.

"The fact that they said there wasn't anything they could do for me left me incredibly helpless"

Anonymous games journalist

Victims have very few options available to them, other than to go dark and hope it doesn't happen again. Some will begin to self-censor in order to avoid enduring something so horrible ever again. One journalist told us they tried contacting the police after receiving abuse, but was informed that nothing could be done.

"The fact that they said there wasn't anything they could do for me left me incredibly helpless," they said.

So with responses like this, it's hardly surprising that most people don't even bother with law enforcement. As noted by Didžgalvytė: "Practice shows that the police do not take such threats seriously."

In a statement issued to us, a spokesperson from the UK Home Office said: "Nobody should have to endure online harassment, bullying or hate-filled abuse, which is why the Government is working to make the UK's online environment a safer place for everyone... We are clear that what is illegal offline is illegal online and all perpetrators should feel the full force of the law."

Facebook issued a similar statement: "There is no place for trolling, bullying or harassment on Facebook and we will remove credible threats of physical harm to individuals when reported to us. Facebook is governed by a set of Community Standards. These standards set out limits for acceptable behaviour and content. If they are broken, we move quickly and take appropriate action when we are made aware of it."

As did Twitter: "We believe that people who don't feel safe on Twitter shouldn't be burdened to report abuse to us. Previously, we only reviewed potentially abusive Tweets if they were reported to us. This isn't acceptable so we've made it a priority to reduce the burden on people to report."

You'd be forgiven for skim-reading those statements, as it's nothing we haven't seen countless times before. It's hard to take Twitter and Facebook seriously when the problem of online harassment shows no sign of abating.

"If we're going to keep this as a thing that we have in our culture and society, they have to address these abuse issues. It's not optional"

Leigh Honeywell, Tall Poppy

Is anything actually being done though? The aforementioned white paper is certainly a positive step which proposes direct action and encourages a global approach for online safety. This is joined by a review from the Law Commission into the current legislation around abusive online communication, and hate crime legislation.

Meanwhile, Facebook tells us it has tripled the size of its security team and invested in AI and machine learning to help keep users safe. The social media giant has also been working on its Safe Communities Initiative which proactively identifies and removes posts or groups that break Facebook terms of service.

Unfortunately though, moderating toxic content comes with its own human cost, as was revealed by an investigation into working conditions at Cognizant-operated Facebook moderation sites in America, which found that staff had developed symptoms of post-traumatic stress disorder after extended exposure to extreme graphic content. Over at Twitter, 38% of abusive content is now surfaced proactively for human review rather than relying on user reports, up from 20% in 2018. Additionally, Twitter says that since last year it has increased the number of accounts suspended for evading a previous ban on the service by 45%, and tripled the number of accounts suspended within 24 hours of being reported.

Given the scale of online harassment though, these numbers feel small, and they feel smaller still when your social media is flooded with blind hate directed at your colleagues and friends. For Honeywell, the platform holders still aren't doing nearly enough.

"Whatever the platforms are doing, we've got to behave as if the cavalry is not coming," she says. "We've got to figure out ways to help people defend themselves, and build up a body of knowledge around responding to online harassment in terms of what we as individuals can build." So as we face another year, one which is sure to be plagued by abuse and toxicity, we have to consider what the responsibility of our industry is, and take appropriate action to protect everyone who is a part of it.

"If we're going to continue with this grand experiment of everyone being able to send a [Twitter] message to anyone in the whole world... these companies have to actually take responsibility for what happens on their platforms," says Honeywell.

"We're building better, more sustainable, more just, more accurate abuse reporting systems. They have to do it, if we're going to keep this as a thing that we have in our culture and society, they have to address these abuse issues. It's not optional."

Ivy Taylor: Ivy joined in 2017 having previously worked as a regional journalist, and a political campaigns manager before that. They are also one of the UK's foremost Sonic the Hedgehog apologists.