YouTube can't be trusted with Google Stadia | Opinion

Years of lax content policy have permitted bigotry, hate groups, and harassment to flourish on YouTube. Now Google wants to use it to sell games directly

Google's Stadia Connect presentation last week was a welcome deluge of info for anyone curious about the particulars of the company's streaming service. It covered pricing models, connection speeds, games, controllers, launch windows, and plenty more, giving a much clearer map of what we can expect from Stadia than we had back in March.

One thing we didn't see, however, was YouTube. In the initial GDC reveal presentation, Google showed off an impressive YouTube integration: someone watching a YouTube trailer for (in the example case) Assassin's Creed Odyssey could, at the end of the trailer, click a single button and be playing the game within "five seconds."

It's a brilliant trick, and one with piles of marketing possibilities for developers and publishers. It's a relatively short mental hop from what Google showed at the reveal to something like free game demos accessed via YouTube trailers, or inventive ways to connect with the games your favorite streamers are playing and try for yourself whatever cool feat they just pulled off. Of the many interesting promises of Google Stadia, the potential to turn viewers into players instantly could prove an enormous draw, both for those selling games and for those interested in them.

But YouTube wasn't a part of last week's Connect presentation, presumably because Google had more important things to discuss at that particular showcase. Fair enough -- Connect was more about pricing than presentation. But YouTube probably could have used a positive story last week, after multiple moderation missteps put it back in the media spotlight -- a spotlight it's been smack in the middle of for some time now thanks to years of failed content policy.

"YouTube probably could have used a positive story this week after multiple moderation missteps put it back in the media spotlight"

While the longer story involves years of hand-wavy moderation that has allowed hate speech, conspiracy theories, bigotry, and pedophilia rings not only to exist on the platform but even to be recommended to users, last week's events centered on Vox journalist Carlos Maza. In a Twitter thread, Maza outlined how YouTuber Steven Crowder had spent years making videos "debunking" Maza's own series, Strikethrough. These videos were peppered with racist and homophobic slurs and insults, and Crowder's aggressive targeting of Maza had such an effect that when Maza was doxxed last year, it took the form of a wave of text messages with one common message: "debate steven crowder."

Of course, Crowder's videos remained on YouTube, selling advertising space and generating money for Crowder and YouTube alike... until recently. It took Maza's Twitter thread racking up more than 20,000 retweets and 70,000 likes for YouTube to take notice, and even then the company initially concluded that the videos didn't violate YouTube policy, despite that policy explicitly prohibiting hateful content, harassment, and cyberbullying.

YouTube reversed its decision a few days later, sort of. In a follow-up tweet, YouTube said it would be demonetizing Crowder's channel, effectively declaring that videos containing homophobic slurs are appropriate enough to exist on YouTube, but just inappropriate enough that creators don't get to make money off them. That is, if you can gather enough retweets and likes for the company to actually look at the videos and demonetize them in the first place.

While all this was going on, YouTube also made a move that every major internet platform ought to have made from day one: it outright banned content that glorifies hate groups, something it somehow hadn't explicitly done before. In its announcement, YouTube noted it would prohibit videos that "justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." It also included mentions of videos that promote Nazi ideology, or that deny documented historical events such as the Holocaust or the Sandy Hook Elementary shooting.

"Popular content creators seem to dive into hateful content again and again with no lasting repercussions"

This is a hollow triumph. The worst of the accounts now being removed have been on YouTube for years, allowed to promote and spread the beliefs they espouse and to harass the victims caught on the other end. In theory, YouTube touts a "tougher stance" toward supremacist content and has apparently banned hate speech for some time, but those policies have a poor track record of enforcement. How many stories have come out over the years, not just about such videos existing on YouTube, but about the platform's algorithm recommending inappropriate, hateful, or outright untrue "news" videos after an entirely harmless initial search?

While Crowder's attacks on a Vox journalist are unlikely to be the kind of thing consumers click on to purchase a video game, the content he's being demonetized for isn't unusual at all on gaming YouTube. Ninja, the biggest name in game streaming, dropped a slur on stream last year, and while he apologized, other popular content creators seem to dive into such language and content again and again with no lasting repercussions.

Though nothing's been announced yet, I don't think there's any way we get Google Stadia without (eventually) some sort of YouTube functionality and connection to popular gaming streamers at least, if not content creators at large. In the most optimistic of worlds, that could be a very good thing.

Developers and publishers could partner with content creators to promote their titles and cap the videos with a big "Play on Stadia" button, offering more marketing opportunities to game makers and another revenue stream for content creators. It's an arrangement that might prove especially lucrative for small developers and creators, assuming there's a way into it for anyone other than the biggest and richest.

"You don't go from recommending anti-semitic 'pranks' from one of your most popular streamers to taking a meaningful stance that such videos are bad, actually, overnight"

But if and when that happens, YouTube has done an abysmal job of building any kind of trust that it can handle such an arrangement. On paper, the company will say it doesn't and hasn't permitted hate speech, but historically it has required a flood of media attention on channel after egregious channel before it ultimately took action against content that was supposedly against its rules.

Even now, YouTube is proudly saying that in addition to finally kicking out hate groups, it's "reducing the spread" of what it calls "borderline content" through demonetization. In the same blog post, the examples provided of such content are "phony medical cures" and "claiming the earth is flat," but its response to Steven Crowder indicates that YouTube also views homophobic slurs as falling into a morally gray area that should be allowed to persist.

In a world where YouTube has a meaningful connection to Google Stadia, I have a hard time believing we won't see content creators cashing in on clicks or sales of a game they promoted on the same stream in which they drop "borderline" language, harass another content creator, or spread hateful views. Individual publishers' acceptance of this kind of behavior from the biggest promoters of their games has ranged from passive to active in the past, but with YouTube holding the power to dictate where the "Buy Now on Stadia" button appears, those putting their games on Stadia may not have a choice in the matter.

YouTube's decision to remove hateful content from its platform is obviously the right one. But the company also has years' worth of trust to rebuild, both with the victims of the kind of content it now says is banned and with its audience as a whole. You don't go from recommending anti-semitic "pranks" from one of your most popular streamers to taking a meaningful stance that such videos are bad, actually, overnight. There's nothing to suggest we won't continue to see the propagation and monetization of such content on YouTube, even if its creators have to make slight adjustments to their formulas to slip past whatever limited moderation is put in place.

And there's absolutely nothing inspiring confidence that such content won't eventually worm its way into recommendations, or into the actual videos promoted by Google and YouTube as gateways to Stadia. After all, recommendations are (at least in large part) based on engagement, and what gets more engagement than over-the-top reactions, anger, and finger-pointing? On gaming YouTube, at any rate, that kind of content remains king.

YouTube has some serious housecleaning to do before it becomes a hub for Google Stadia sales. Even if my predictions are wrong and a solid wall remains between Stadia videos and YouTube's darkest corners, a platform that pays lip service to inclusivity while actively housing and promoting hate speech and harassment isn't one where I want to spend money on video games. And it's not a platform I trust to shape the future of this medium.


Rebekah Valentine: Rebekah arrived at GamesIndustry in 2018 after four years of freelance writing and editing across multiple gaming and tech sites. When she's not recreating video game foods in a real life kitchen, she's happily imagining herself as an Animal Crossing character.