New Ofcom rules call for Twitch and other platforms to better police video content

Sites face fines or suspension if they do not take "appropriate measures" to tackle videos relating to sexual abuse and racism

UK regulator Ofcom has introduced new measures targeting video platforms such as Twitch, encouraging stricter action when it comes to policing inappropriate content and protecting users.

The BBC reports that video service providers -- including Twitch, TikTok, Vimeo and Snapchat -- will be expected to follow the new guidelines. Failure to do so will result in fines or, in particularly serious cases, suspension of the entire service.

The regulations follow rising concern over the content made available through these platforms, with Ofcom claiming that a third of all users of these sites see videos that are hateful or worse.

Under the new rules, sites like Twitch will need to provide clear rules for uploading content and enforce them effectively, and simplify their reporting and complaints processes. Sites with adult-only content will need to demonstrate robust age-verification systems.

In particular, Ofcom expects such platforms to take "appropriate measures" in protecting users from content related to child sexual abuse, racism and terrorism.

For years, Twitch has been under fire for inappropriate content on its site and hateful conduct among its users -- not to mention the rampant issues with abusive and unsafe behaviour behind the scenes.

Less popular categories on the site have been known to be targeted by people uploading everything from porn to footage of terrorist attacks, such as the Christchurch shooting. In this instance, Twitch was able to identify the perpetrators and filed a lawsuit against them.

Earlier this year, the company released its first transparency report, which showed that less than 15% of Twitch user reports led to enforcement actions -- and only 2% of reports of hateful conduct and harassment were acted upon.

More recently, the company promised more action in the face of continuing hate raids within the chat feeds for marginalised streamers. It did eventually add phone verification as a way of preventing users from setting up multiple accounts for such attacks, but not before streamers rallied to organise the #ADayOffTwitch protest.

Ofcom acknowledged it will be impossible to catch every instance given the amount of content that is uploaded. The regulator promised to be "rigorous but fair" in its duties, although it clarified that it will not be responsible for assessing individual videos.

"Online videos play a huge role in our lives now, particularly for children," said Ofcom CEO Dame Melanie Dawes. "But many people see hateful, violent or inappropriate material while using them.

"The platforms where these videos are shared now have a legal duty to take steps to protect their users."

Ofcom will produce a report next year, examining whether 18 selected companies have followed the new guidelines.

The rules are not expected to apply to YouTube, as the platform falls outside the UK's jurisdiction and is monitored by Irish regulators instead.

However, YouTube will be within the scope of the forthcoming Online Safety Bill, which is expected to lay out measures to tackle online harms on major technology platforms, including Google, Facebook and Twitter.

Today, it was reported that Twitch has suffered a security breach, exposing the site's source code, streamer payouts and more.

James Batchelor

Editor-in-Chief

James Batchelor is Editor-in-Chief at GamesIndustry.biz. He has been a B2B journalist since 2006, and an author since he knew what one was.