Ofcom releases new guidance on how the Online Safety Act affects video games
Ofcom has published new guidance explaining how the Online Safety Act applies to video games – and what gaming companies must do to help protect players from harm. The regulator says the law makes online platforms legally responsible for keeping UK users safe – especially children – even if their business is based outside the UK.
The guidance sets out how the rules apply to gaming platforms, what types of online risks exist, and what steps companies should take to stay compliant with the law.
How the rules apply
The Online Safety Act covers any service where users can interact, chat or share content, such as messages, images, audio or videos. That means many online games – especially those with live chat, multiplayer lobbies, or user-created environments – now fall under the same safety rules as social media sites.
Ofcom says that features like matchmaking systems, team chat or open-world voice channels are all examples of user interaction. So even a game that simply connects players with strangers in a lobby must follow the new online safety standards.
The law doesn’t apply to content made purely by the game developer, such as offline story modes or official downloadable content. Developers can also use Ofcom’s online tool to check whether their games are covered by the new rules.
Risks in online gaming
Ofcom’s research highlights that online gaming can expose players – especially teenagers – to harmful or upsetting behaviour. Its Online Experiences Tracker found that many 13- to 17-year-olds are concerned about trolling (47%), abuse or threats (45%), and harassment or “griefing” (37%) while gaming online.
The NSPCC has also raised concerns about grooming through in-game chat, while charities such as Catch-22 and The Children’s Society have supported young victims of online exploitation linked to gaming platforms.
In addition, Ofcom found that games are the third most common place for “nasty or hurtful” behaviour among 8- to 17-year-olds – just behind social media and messaging apps.
What gaming companies need to do
Under the Online Safety Act, gaming companies must take active steps to protect their users. These include:
- Checking if the law applies to their service
- Assessing the risk of illegal or harmful content
- Putting measures in place to limit those risks
- Reviewing and recording safety processes regularly
- Completing child safety assessments if their games are used by under-18s
Ofcom has also identified 17 types of illegal content risk – including terrorism, child sexual exploitation, hate offences and harassment – as well as 12 types of harm to children, such as bullying and violent or abusive content.
Helping platforms comply
To make it easier for gaming platforms to meet their legal duties, Ofcom has created a range of tools and templates, along with a Register of Risks that outlines the biggest areas of concern.
The regulator says its aim is to make the UK “the safest place in the world to be online” – while ensuring games remain a space for creativity, connection and fun.