The Online Safety Act 2023 is a law that aims to protect children, young people and adults when they are online. 

It sets strict rules for online platforms to protect users from harmful content, such as misinformation, inappropriate material, and cyberbullying.  

Passed into law in October 2023, the Online Safety Act applies to companies providing online services in the UK and those whose services can be accessed by UK users. 

What Does It Do? 

The Act makes those delivering services online, like social media companies and search providers, more responsible for looking after the safety of their users.  

It means they need to make sure their systems, and how they run them, minimise the risk of being used for illegal activity. That includes stricter age checks for users, more content controls for adults, and making sure illegal content is filtered or taken down as quickly as possible. 

Providers must also offer easier ways to report content and problems, and to get support when it is needed.  

This should make it safer for you to be online and use these platforms, without the risk of seeing something inappropriate or being scammed.  

What kind of content? 

The Online Safety Act covers a range of content which could be harmful to young users, including: 

  • Cyberbullying and harassment 
  • Misinformation 
  • Graphic or violent content 
  • Self-harm and suicide content 
  • Illegal content 
  • Age-inappropriate content 

Who Does The Act Apply To? 

The Online Safety Act applies to companies that provide: 

  • Search services 
  • Services that allow users to post content online 
  • Services that allow users to interact with each other 

This includes: 

  • Websites 
  • Apps 
  • Social Media 
  • Cloud storage 
  • Video sharing platforms 
  • Online forums 
  • Dating services 
  • Instant messaging services 

If a company is based outside the United Kingdom but has customers in the UK, or its services can be accessed by users in the UK, it must comply with the Online Safety Act.  

Who Makes Sure They Do What They Are Supposed To? 

In the United Kingdom, Ofcom is the independent regulator for online safety. That means they are the ones who publish guidelines and codes setting out how providers should keep their services as safe as possible. They can monitor providers and, if necessary, punish anyone who doesn’t do what they are supposed to. 

In April 2025, Ofcom published 40 new practical measures that firms need to meet as part of the Online Safety Act. The steps were chosen to protect young people from seeing harmful content, protect them from strangers, and keep them safe online.

You can read more about the measures on the Ofcom website.

 
What happens if they don’t do what they are supposed to? 

Ofcom have powers to fine providers up to £18 million or 10% of their qualifying revenue, whichever is greater.  

They can also take criminal action against senior managers who fail to ensure their company responds to Ofcom’s requests.  

In the most extreme cases, Ofcom can work with the courts to stop providers generating money in the UK or being accessed by UK users.  

How safe do you think you are online? Take our quiz to find out!

What is the Online Safety Act 2023? 
