The Online Safety Bill receives Royal Assent

9 November 2023

The Online Safety Bill received Royal Assent on Thursday 26th October 2023, marking a milestone for internet safety and accountability. 

The Online Safety Act (the "Act") imposes new legal requirements on social media platforms and internet services. Said to take a 'zero-tolerance approach' to protecting people, especially children, from harm online, the Act places legal responsibility on tech companies to:

  • prevent and remove illegal content;
  • enforce age limits and use age-checking measures where there is content harmful to children; 
  • ensure more transparency about the risks and dangers posed to children on their sites; and 
  • provide parents and children with clear and accessible ways to report problems online. 

Failure to comply with the rules could see these companies receiving a hefty Ofcom fine of up to £18 million or 10% of the platform's global annual revenue, whichever is greater. For the largest platforms, this could mean fines running into billions of pounds, and, if tech companies fail to take the steps required by Ofcom to protect children, their bosses may even face prison.

Ofcom, the regulator for online safety in the UK, is moving quickly to implement the new laws. The regulator has already published its intended regulatory approach and timelines for implementation, setting out how it will drive change in line with the Act and support services in complying with their new obligations.

Whilst companies themselves will decide the safety measures they adopt in order to comply, Ofcom expects the Act's implementation to ensure greater online safety for people in the UK by delivering four outcomes: 

  • Stronger safety governance in online firms; 
  • Online services designed and operated with safety in mind; 
  • Choice for users so they can have meaningful control over their online experiences; and
  • Transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust. 

The online safety regulator will also set out guidance and codes of practice on how companies can comply with their duties. These are planned to be rolled out in three phases; the first consultation, covering the illegal harms duties (Phase One) that relate to child sexual abuse material, terrorist content and fraud, was published on 9 November 2023.

Child safety is clearly at the forefront of the Act, with the Secretary of State for Science, Innovation and Technology, Michelle Donelan, having previously stated that the protection of children is at "the heart of this Bill". 

Interestingly, the Act received Royal Assent a day after it was reported that around 3,000 AI-generated images depicting child abuse had been shared on a dark web forum. The Act itself does not extend to the AI companies whose models are being used to generate this abusive imagery. However, Ofcom's Chief Executive Dame Melanie Dawes has stated that the new laws imposed by the Act "give Ofcom the power to start making a real difference in creating a safer life online for children and adults in the UK". She acknowledges that, whilst online safety "cannot be achieved overnight", the online safety regulator is ready to meet the challenge.