
Fostering a positive community in VR

At Ludicious X, IGDA executive director Renee Gittins talked about keeping communities safe and fighting toxicity in VR

"Intense" and "visceral": that's how Patrick Harris, now lead game designer at Behaviour Interactive, described what harassment feels like in virtual reality. Back in 2016, Harris was design director at VR studio Minority Media, and he gave a GDC talk about online harassment in VR, describing it as "way, way, way worse" than in traditional games.

"It triggers your fight or flight response," he said during the talk. "[Harassers] can lean in and touch your chest and groin, and it's really scary."

And his experience is far from isolated. The Guardian dedicated an article to harassment in VR back in 2016, and GamesIndustry.biz took a deep dive into the topic several times too, talking to VR studio Cooperative Innovations and Rec Room developer Against Gravity in 2017.

In 2018, research group The Extended Mind surveyed regular VR users, and found that 36% of men and 49% of women among the respondents had experienced sexual harassment in VR.

"The fact that everything is so immersive means that potential abuses can be so much more impactful [in VR]"

The issue has not gone away since; if anything, the medium becoming more prominent has led to more challenges. That is why Renee Gittins, executive director of the International Game Developers Association and CEO at Stumbling Cat, decided last month to give a talk on the topic, which started by recounting Harris' experience.

In this Ludicious X talk, Gittins addressed the best ways to manage communities in virtual reality spaces to prevent harassment and toxicity, and gave words of advice that are applicable beyond just VR.

"The goal of this is to really allow everyone within virtual reality communities to fully appreciate the content and the community that they're a part of, and to find positivity within it," she said. "I wanted to talk about VR in particular because, not only am I very passionate about the medium, but virtual reality allows for immersion, depth of interaction, and feelings that we've never seen before in the medium of interactive technologies.

"The fact that everything is so immersive, the fact that it really feels like you have presence within the world, means that potential abuses can be so much more impactful. So today I'm going to be talking about tools and techniques to help avoid this harassment and to help cultivate positive communities within virtual reality."

The install base for Against Gravity's social VR experience Rec Room reached one million in 2018

Implement personal moderation tools

  • Have a 'hide' button

First on Gittins' agenda was exploring the different layers of moderation that can be combined based on the needs of your community and your game. The first layer, and the most important according to her, is personal moderation.

"Personal moderation is the least susceptible to abuse. It is also the fastest to implement, and the fastest for players experiencing negative behaviours to use and to get their problem solved," she said. "On 'normal' online games, there's already standard personal moderation social tools such as muting a player, kicking them from the group you're in, and blocking the player so that you will not be engaging with them in the future."

"The interaction between two avatars which are being built as real people is where a lot of the abuse happens"

But virtual reality has an additional type of quick moderation that's not used as frequently elsewhere, Gittins continued, and that's the ability to hide a player. This gives players a tool to protect themselves from unwanted attention, without taking the more extreme (and more visible) action of blocking.

"The interaction between two avatars which are being built as real people is where a lot of the abuse happens [in VR experiences]. In other online games, this is less of an issue and so 'hide' hasn't been used as frequently as a moderation tool.

"For a hide, I recommend to implement [it] so both the targeted harasser is hidden and the player who hides them is hidden from that person, so they are removed from each other's views. This allows for interactions with other people within the environment still and, particularly in virtual reality which tends to have smaller communities, it allows you to reuse the same spaces and maintain positive interactions while allowing people to block out players that are bringing about negative interactions."

  • Come up with an interaction level system

Another type of personal moderation is to have an interaction level system, Gittins continued.

"[That's] one that I really highly advocate for. The first time I heard about this system was at Bronycon -- they had a colour tag that you put on your badge. Green meant that you're open to strangers coming up to you and talking to anyone, yellow meant that you want to just talk to your friends, and red meant you were not in a good mood to talk to anyone.

"And that can be easily implemented within virtual reality as well. It can allow someone when they're going into an experience to set what they're feeling like at that moment. So they can say: 'hey, I'm open to interacting with everyone' or they can say: 'I only want to see my buddies right now.'"

Renee Gittins, IGDA
  • Consider enforcing personal spaces

In virtual reality, it can be difficult to stop people from entering your personal space, as there are no physical barriers in the way. However, Gittins highlighted some tricks developers can use.

"Personal space can shield [you] against someone getting too close, they can protect against someone teleporting into your space in VR [games] that use teleportation for movement. And it can also prevent someone from using objects to invade your personal space, or to provide unnecessary or unwanted distraction.

"Let's say you're playing a billiards game in VR. You might want to make it so that the billiards cue that your opponent is using can't get within two feet of your head. Just so it can't be used to harass."

There are various approaches and techniques developers can use to create enforced personal spaces.

"You can fade out the object or offending person that is entering that space," Gittins continued. "You can also fade out the space that they're invading. So let's say, with that cue example, if you swing it at someone not only does the pool cue fade out of their view when it is close to them, but they also fade out of your view. So it is harder to continue that assault."

Gittins pointed out that this type of rule needs to be made flexible and adjustable to account for the diversity of players and games out there.

"Some people, based on their history, their culture, their backgrounds, might have a larger or smaller personal space that feels comfortable than others," she added. "But even when you are in a player-versus-player combat game where you're punching each other, you might want to consider still implementing personal space and critically sensitive areas, particularly around the chest and groin."

Sports Bar VR is a social VR experience attempting to recreate the atmosphere you'd have in a pub

Community moderation models

The second layer of moderation that Gittins identified is community moderation. She broke it down into three categories -- company moderators, community moderators and self-moderation -- and detailed what each one can bring to your community management efforts.

The first two categories are largely self-explanatory and have been explored on the GamesIndustry.biz Academy in the past. For example, Kitfox Games' Victoria Tran highlighted the importance of having someone dedicated to moderation within your company in another Ludicious talk.

Having community members help you moderate your platforms can also be a great tool, as they are usually among the most passionate people in your community.

But one interesting aspect of community moderation highlighted by Gittins is self-moderation, keeping your VR community safe with an automated moderation system. One example would be that, within a group of players, a vote has to be cast to remove a member that is misbehaving. Perhaps you need six votes against a team member for them to be excluded, or maybe only three -- you'll have to find a sweet spot that makes sense for the size of your community and the type of game you have.
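
As a sketch, such a vote could work like the hypothetical Python below, with the threshold exposed as the tunable "sweet spot" parameter; the class and values are invented for illustration:

```python
# Hypothetical vote-to-remove system; the threshold is the tunable
# "sweet spot" mentioned above, not a value from any shipped game.
class VoteKick:
    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self._votes: dict[str, set[str]] = {}  # target -> voters

    def vote(self, voter: str, target: str) -> bool:
        """Record one vote per voter; True once the target should go."""
        voters = self._votes.setdefault(target, set())
        voters.add(voter)  # a set prevents double-voting
        return len(voters) >= self.threshold

session = VoteKick(threshold=3)
session.vote("p1", "griefer")
session.vote("p2", "griefer")
assert session.vote("p3", "griefer")  # third vote triggers removal
```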

"Some people might have a larger or smaller personal space that feels comfortable than others"

While this type of system is the most susceptible to abuse, it can be a great way to regulate your community.

"There can also be thresholds," Gittins continued. "If someone receives let's say three negative reports against them within a 24-hour period, then they are temporarily muted or blocked from the game, or blocked from matchmaking. This automated moderation based on community feedback can really help nip problematic players in the bud.

"You can [also] implement a sort of karma system. So let's say that you can give negative feedback to a player and you can track [it]. [You can then] automatically take moderation action against a player, based on that sort of total karma value, and how it goes up and down.

"Though it has some susceptibility to abuse, because you could have a group of bad actors together targeting one person [to] get them banned."

Virtual reality social platform VRChat was a pioneer, having launched in 2014

Separate environments can keep the community safe

Gittins then touched upon how implementing separate environments can help keep your community safe and tidy.

The most common example would be an environment that you only share with players that are in your friends' list. But you can take that approach further and compartmentalise your community.

"This could be just having a set of rooms from one to ten. You can explicitly designate them and say: 'Hey, rooms five and up, adult language is okay. Rooms four and down, please consider this a PG environment.' And just setting up those separations will encourage behaviour that better fits those communities and the experiences that people are looking for within them."

Even if you don't explicitly designate those environments, it's likely that they'll develop their own culture and identity without your intervention, Gittins continued. Similar players will just naturally gather in the same space.

"Automated moderation based on community feedback can really help nip problematic players in the bud"

"People will say: 'Hey in room nine, everyone's really wacky and silly, in room ten everyone's really broody.' And people will find themselves drawn to the rooms and communities that make the most sense for them.

"And then finally there is the island of exile. If you are implementing a reporting system or a karma system, you can put all of the worst bad actors in one environment that is separate from the other environments, just to ensure that they can still play and engage with the game instead of being completely banned. But they will have to interact with people who've been similarly poorly behaved."

Tread carefully though, because your "island of exile" could get out of hand if not managed properly.

By putting all the cheaters in one room, the idea goes, they should soon realise they're only ruining the fun for themselves and find a path to redemption. But bad actors only interacting with bad actors can also be self-perpetuating and make things worse. You need to be particularly careful in moderating this group.
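
If a karma score is already being tracked, routing to an island of exile can be a simple matchmaking filter. Again, a hypothetical sketch -- the cutoff and pool names are invented:

```python
# Hypothetical matchmaking filter: players whose karma falls below a
# cutoff only match with each other instead of being banned outright.
EXILE_CUTOFF = -10

def matchmaking_pool(player: str, karma: dict[str, int]) -> str:
    """Route repeat offenders to a separate pool, keeping them in the game."""
    return "exile" if karma.get(player, 0) <= EXILE_CUTOFF else "standard"

karma_scores = {"friendly": 2, "repeat_offender": -12}
assert matchmaking_pool("friendly", karma_scores) == "standard"
assert matchmaking_pool("repeat_offender", karma_scores) == "exile"
```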

On the topic of rehabilitation, Gittins noted that simply asking a player to reflect on the negative interactions they've had with other players has been shown to make them more likely to have positive interactions and be a better member of the community going forward.

"And similarly when someone has a bad day and has poor interactions, you probably want to give them an atonement pass," she added. "Because not all negative community members will always be negative community members. There's a way to rehabilitate them, to bring them back into the fold, to let them reflect and improve upon their actions."

Cooperative Innovations' Spaceteam VR

Cultural development for your community

Gittins concluded her talk with a few thoughts on how to develop a positive culture within your community. She noted that developing your community in a healthy way can be difficult if you didn't set the right tone from the get-go.

"Having a code of conduct is so important," she said, echoing what Victoria Tran said in her own Ludicious talk. "The content that you include in your game also sets the tone and the potential behaviours, word usage, and interactions of your community.

"You need to ensure that, particularly in virtual reality where abuse can be so traumatic, [you have] a support line for victims, a way for them to talk about their experiences, to get help.

"Kicking [harassers] out of the community overall is one potential path, but with VR, when communities are so limited, you might want to look at informing them and rehabilitating them, encouraging the reflection.

"And then finally I find one of the best ways to promote a good culture is to celebrate good behaviour. I think that Rec Room does this really well -- they celebrate their community when they create good content, but when they're also kind to each other. And I think that celebrating and showing what you consider good actions, instead of just punishing bad actions, actually can help form that more positive community."

She summed up the content of her talk with the following points:

  • Personal moderation tools are super effective and powerful.
  • Community moderation can be broader reaching and allow you to more easily influence culture.
  • You need to watch out for abuse of moderation systems and consider which ones are right for your game.
  • You should work on fostering a culture overall that is against harassment.

If you're interested in researching this topic a bit more, Gittins pointed to IGDA interest groups in that field -- there's a community management one, as well as a VR/AR/MR one.

"Then we actually have a white paper, 11 pages long, that goes more in depth on all of this, available in our resource library," she concluded. "So let's try to keep virtual reality fun for everyone."

Our GamesIndustry.biz Academy guides about how to make money from video games cover a wide array of topics, from in-depth Steam guides to how to guarantee your pitch will be rejected, a beginner's guide to bringing a game to market, and how to design better communities. You can read all our guides about selling games on this page.

Marie Dealessandri: Marie joined GamesIndustry.biz in 2019 to head its Academy section. A journalist since 2012, she started in games in 2016. She can be found (rarely) tweeting @mariedeal, usually on a loop about Baldur’s Gate and the Dead Cells soundtrack. GI resident Moomins expert.