
Could the UK's Online Safety Bill change gaming worldwide?

Fieldfisher's Frankie Everitt and John Brunning explore the ramifications of the flagship policy and what you should be doing to comply

Creating safe, inclusive and engaging spaces for the gaming community has long been the goal of the gaming industry, with games businesses building and implementing tools, features and processes to address illegal and unwanted behaviours and keep players safe online.

To date, this approach has largely been self-motivated, with only limited regulation: namely, the takedown requirements for unlawful content under the E-Commerce Regulations.

However, this is all set to change with the introduction of the UK's Online Safety Bill: a flagship policy for the UK government and a milestone change in the regulation of online platforms and services, ushering in a new era of accountability online. While headlines have focussed on the social media giants, the Bill, now at Select Committee stage in the House of Commons, will also have a significant impact across the gaming industry.

A draft bill was released publicly in May last year, with both UKIE and TIGA engaging in the process on behalf of the games industry, and the Bill has been significantly strengthened and refined as a result of such scrutiny and feedback. Nevertheless, it retains at its core the same statutory duties of care aimed at keeping users, particularly children, safe online.

Will my game or platform fall within scope?

As currently drafted, the Bill will apply to online user-to-user services and search services where those services have links to the United Kingdom.

While there is some complexity in the detail, broadly speaking you will fall in scope if your online game or platform:

  • allows players or users to generate, upload and share content with others or enables content to be found through search
  • has a significant number of UK users or the UK is a target market
  • can be used in the United Kingdom and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK

There are some exceptions to this -- one-to-one live aural communications (live voice calls, for example) will be exempt. However, if your game also includes written chat functionality or enables group chat options, you'll fall back within scope of the Bill.

One particularly challenging (and perhaps unique) question for the gaming industry lies around the definition of the provider of a user-to-user service. The Bill targets the providers that have control over who may use the user-to-user functionality of the game. The government intends for this to focus on the entity that "directly controls users' access to functionality that enables users to interact or share user-generated content, rather than any other entity that may embed that service or control other aspects of it."

In the context of an online game, each company involved in the development, publishing and distribution of the game will need to assess whether it has control over this functionality. This may be easier said than done and will depend on a number of factors including the contractual model deployed, the role of each party in provision, access and operation of the game, and the relationship with the end player.

It remains to be seen how liability will be shared between devs, publishers and distributors. We expect to see contracts in the future containing detailed requirements, indemnities and liability clauses to take into account these online safety requirements and distribute contractual responsibility.

What will I be required to do?

The Bill imposes extensive duties on regulated services with the goal of establishing a duty of care to protect users against two types of content: (1) illegal content and (2) harmful content.

Illegal content is defined as content that amounts to a terrorist offence, a child sexual abuse offence, or other priority illegal content (to be identified by government in regulations). Harmful content is divided into content that is harmful to children, and content that is harmful to adults, and is broader and more subjective than 'illegal content'.

A number of general duties will be imposed on all regulated services, including online games. These include:

  • the duty to carry out risk assessments for illegal content
  • safety duties to take proportionate steps to reduce and manage the risk of harm to individual users
  • duties to protect the rights to freedom of expression and privacy
  • reporting and redress duties, including to have appropriate reporting systems and complaints processes in place

There are additional duties for regulated services that are likely to be accessed by children, and those that fall within Category 1 services (designed to target a small group of high-risk, high-reach services). While we expect many online games will fall outside of the Category 1 scope, a lot of online games will likely be accessed by children, meaning the additional child safety duties will be highly relevant to the sector, regardless of age certification measures.

What happens if I don't comply?

The potential financial implications for non-compliance are significant. Regulated games companies could be liable for fines from Ofcom of up to £18 million or 10% of annual global revenue, whichever is greater. For any business with annual global revenue above £180 million, the revenue-based figure is therefore the larger cap.

Games businesses will also face the inevitable burden and cost of ongoing compliance, which is likely to be a particular challenge for smaller companies. While many will want to take a proportionate and risk-based approach to their own compliance, the sudden surges in player numbers that viral success can bring mean that games businesses will need to keep their risk assessments under constant review.

The Bill also creates a number of criminal offences, including failure to comply with an Ofcom information notice, and reserves the power for Ofcom to pursue criminal action against senior managers of non-compliant companies in order to drive compliance.

Past experience of new regulation suggests that, while there is likely to be an initial period of pragmatism from Ofcom, it will not be long before the regulator starts to address non-compliance in key sectors. With the hypergrowth of gaming and interactive entertainment, we expect this sector to be firmly in Ofcom's crosshairs.

What should I be doing now?

While the final text of the Bill will be subject to some change as it is finalised in Parliament, we consider substantial revisions unlikely and would advise that now is the time to start preparing. Here are five things you can do now to help ensure that you are compliant by the time the Bill becomes law:

  • Determine whether your game or platform falls in scope

Assess whether you are likely to fall within scope of the Bill. For example, ask yourself the questions below (a rough sketch of this test as a checklist follows the list):

  • Does your game or platform enable users to generate and share user content?
  • Does it contain chat functionality (whether voice or written chat) or allow other communications between players?
  • Do you have players in the UK or does your game or platform otherwise target the UK market?
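For teams that want to keep this analysis on record, the test can be captured as a simple checklist. The sketch below is illustrative only: the field names are our own assumptions, it paraphrases the Bill's broad scope test rather than quoting it, and it is no substitute for legal analysis of the actual drafting.

```typescript
// Illustrative only: a hypothetical checklist mirroring the Bill's broad scope test.
// Field names are our own; the statutory wording is what actually governs.
interface ScopeAssessment {
  usersCanShareContent: boolean;        // players can generate, upload and share content
  hasTextOrGroupChat: boolean;          // written or group chat (one-to-one live voice is exempt)
  significantUkUserBase: boolean;       // significant number of UK users
  ukIsTargetMarket: boolean;            // the UK is a target market
  materialRiskOfHarmToUkUsers: boolean; // usable in the UK with material risk of significant harm
}

// A service is likely in scope if it offers user-to-user functionality
// and has at least one of the three links to the UK.
function likelyInScope(a: ScopeAssessment): boolean {
  const userToUser = a.usersCanShareContent || a.hasTextOrGroupChat;
  const ukLink = a.significantUkUserBase || a.ukIsTargetMarket || a.materialRiskOfHarmToUkUsers;
  return userToUser && ukLink;
}
```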
  • Conduct an online safety risk assessment

All gaming businesses in scope will need to take a proactive approach to tackling illegal and harmful content on their games.

The first step for most companies will be to assess your user base and the risk of harm to those users on the service. The Bill sets out a list of matters to be covered in the risk assessment. You should familiarise yourself with these factors and bring together relevant experts from across your business to begin developing a risk assessment.

Part of this assessment will also include considering whether the game is likely to be accessed by children (and this will be a relatively low bar). If it is, then you will be required to protect under-18s from "harmful" content, even if that content is not criminal. Categories for "harmful" content will be set out in secondary legislation in due course.
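Because the risk assessment will need revisiting as the game and its player base change, some studios may prefer to hold it as a structured, versioned record rather than a static document. The sketch below shows one hypothetical shape for such a record; every field name is our own assumption, not a term from the Bill or from Ofcom guidance.

```typescript
// Hypothetical structure for a living online-safety risk assessment.
// The matters that must actually be covered will come from the Bill and
// Ofcom guidance; these fields are illustrative assumptions only.
type RiskLevel = "low" | "medium" | "high";

interface RiskEntry {
  harm: string;             // e.g. "grooming via in-game chat"
  affectedGroups: string[]; // e.g. ["under-13s", "13-17s"]
  likelihood: RiskLevel;
  severity: RiskLevel;
  mitigations: string[];    // tools, features and processes already in place
  owner: string;            // accountable team or role
}

interface RiskAssessment {
  service: string;
  assessedAt: Date;                  // review regularly, e.g. after a surge in player numbers
  likelyAccessedByChildren: boolean; // a 'yes' triggers the additional child safety duties
  entries: RiskEntry[];
}
```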

  • Review your complaints procedure

The Bill requires games businesses to have a transparent and easy-to-use complaints procedure that allows for specified types of complaints to be made. In particular, your complaints procedure must:

  • Allow complaints to be made both about specified types of content and about how you are meeting your duties in relation to the game
  • Provide for appropriate action to be taken when a complaint is upheld (examples of appropriate action might include the removal of flagged illegal content or reinstatement of unfairly removed content)
  • Be easy to access and use for all users, including children
  • Be transparent (for example, each step of the complaints procedure should be set out clearly, including the types of complaints that can be made and what a user can expect to happen from the point at which they make the complaint)

The terms of service should also set out the policies and procedures that govern your handling of complaints.

You should start thinking about whether your current complaints procedures meet these requirements and the steps that you may need to take to update them (both in your terms of service and in other policies and procedures).
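One way to reason about those requirements is to treat the complaints procedure as a small state machine: a complaint is received, triaged against the permitted complaint types, resolved, and the outcome actioned and communicated. The sketch below is a simplified illustration with invented category names; it is not a statement of what the Bill prescribes.

```typescript
// Simplified, hypothetical model of a complaints workflow.
// The complaint categories and actions here are illustrative assumptions.
type ComplaintKind =
  | "illegal-content"
  | "content-harmful-to-children"
  | "wrongful-takedown"  // a user says their content was unfairly removed
  | "duty-compliance";   // a complaint about how duties are being met

type Outcome =
  | { upheld: true; action: "remove-content" | "reinstate-content" | "other" }
  | { upheld: false; reason: string };

interface Complaint {
  id: string;
  kind: ComplaintKind;
  submittedBy: "user" | "affected-person";
  receivedAt: Date;
  status: "received" | "under-review" | "resolved";
  outcome?: Outcome;
}

// Resolving a complaint records the decision; in a real system this would
// also trigger the appropriate action (removal, reinstatement and so on)
// and notify the complainant of the result.
function resolve(complaint: Complaint, outcome: Outcome): Complaint {
  return { ...complaint, status: "resolved", outcome };
}
```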

  • Set up user reporting

Games companies will need to implement systems and processes that allow users and affected persons to report specified types of content and activity. Affected persons include those who might be affected by content, or who may need to assist other users with making a complaint.

Now is a good opportunity to examine your current reporting processes and procedures -- can your users easily find and use the mechanisms in place to report content or behaviour that breaks the rules?
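As a thought experiment, a minimal report intake might look something like the sketch below. The category names and the shape of the report are hypothetical; the real set should track the content types that the Bill and Ofcom's codes ultimately specify.

```typescript
// Hypothetical in-game report intake. The categories are illustrative and
// should map to the content types specified under the final legislation.
type ReportCategory =
  | "terrorism"
  | "child-sexual-abuse-material"
  | "other-illegal-content"
  | "harmful-to-children"
  | "other";

interface UserReport {
  reporterId: string;       // a user, or an affected person acting for one
  reportedContentId: string;
  category: ReportCategory;
  context?: string;         // optional free-text detail from the reporter
  createdAt: Date;
}

// Keep the entry point prominent and simple: one call, reachable from any
// surface in the game (HUD, chat overlay, profile page), should be enough.
function submitReport(report: UserReport, triageQueue: UserReport[]): void {
  triageQueue.push(report); // in practice: persist and route to trust & safety triage
}
```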

  • Check out Ofcom's interim codes of practice

Service providers will need to implement systems and processes to ensure that detected but unreported child sexual exploitation and abuse content is reported to the National Crime Agency (NCA). Reports must be sent to the NCA in a manner and within timeframes to be set out in regulations in due course.

In the meantime, Ofcom has published helpful interim codes of practice for illegal content. These are useful tools and set out what Ofcom is likely to expect from regulated games services in future under the new regime. You should consider the systems, processes and tools that you currently have in place to detect illegal content, whether that is terrorist content or child sexual abuse material, and the reporting mechanisms available.
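Since the manner and timeframes for NCA reports are still to be set in regulations, the most that can usefully be sketched today is the bookkeeping: tracking which detected items have been escalated, so that nothing detected goes unreported. The fragment below illustrates that bookkeeping only, and deliberately assumes no real NCA submission interface.

```typescript
// Hypothetical bookkeeping for detected content awaiting escalation.
// No real NCA interface is assumed; submission details await the regulations.
interface DetectedItem {
  contentId: string;
  detectedAt: Date;
  escalatedAt?: Date; // set once the report has actually been submitted
}

// Items detected but not yet escalated: the queue that must reach zero
// within whatever timeframe the regulations eventually specify.
function pendingEscalation(items: DetectedItem[]): DetectedItem[] {
  return items.filter((item) => item.escalatedAt === undefined);
}
```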

At the time of writing, the Online Safety Bill has entered the Select Committee stage in Parliament. While there is no certain timeline for entry into force, given the extent of pre-legislative scrutiny we expect a comparatively short period before the Bill becomes law and enforcement begins.

John Brunning is a games and technology partner at Fieldfisher. He works across the games industry, acting for publishers and distributors, developers, social media platforms, ad tech companies and back-end tech providers. Frankie Everitt is a public and regulatory lawyer at Fieldfisher. She has advised on high-profile litigation and on non-contentious matters for technology companies. She has followed the development of the Online Safety Bill over the past two years and advised clients on its implications.