20 November 2023

Online Safety Act receives Royal Assent

Henry Watkinson and Hanna Basha of our Dispute Resolution Team go in-depth on the implications of the new Act in their latest article.

The Online Safety Bill received Royal Assent on Thursday 26 October, becoming the Online Safety Act 2023.

The Act has already received a great deal of media attention. It places legal duties on social media platforms, which the government believes will herald a new era of internet safety. Whether that optimism is warranted, only time will tell, but the Act marks a stark change in the regulatory regime.

Prior to the Act, platforms such as Facebook, TikTok and Instagram were protected to a significant degree from liability for content posted on them by provisions in the Defamation Act 2013 and the E-Commerce Directive (Directive 2000/31/EC). These provisions have brought with them a number of problems for users wanting to have content removed from the internet. One of the most challenging has been the proliferation of online porn, with victims finding it difficult to force platforms to take down offensive material because the claims that can be brought against platforms which fail to do so are tricky and technical. The only recourse has been to the Courts, rather than to a regulatory body such as Ofcom.

Our firm recently acted for Georgia Harrison in her civil claim against Stephen Bear, who uploaded a video of her engaged in sexual intercourse. Georgia was incredibly brave: she sued and was awarded record damages and an injunction. However, even with a civil win and Mr Bear’s criminal conviction, it remains difficult for Georgia to track down and hold responsible all of the platforms hosting the video.

The Act should assist victims in Georgia’s position. It has global reach, meaning that it applies not only to providers based in the UK but also to platforms based overseas that have a significant number of UK users (or specifically target the UK market), and it extends to any platform which hosts “user generated content” and to internet search providers.

The size and reach of a social media platform will determine how many of the regulations bite, but broadly speaking the most well-known platforms will now be subject to much tougher rules, including a requirement to self-regulate. Companies will have to put in place systems and processes to improve user safety.

The Act will seek to protect adults through what the government has labelled the “Triple Shield”, introducing the following:

  1. All in-scope services will need to put in place measures to prevent their services being used for illegal activity and to remove illegal content when it does appear;
  2. Category 1 services (the largest and highest-risk services) must remove content that is banned by their own terms and conditions; and
  3. Category 1 services must also empower their adult users with tools that give them greater control over the content that they see and who they engage with.

The Act provides that platforms must conduct “content moderation, including taking down content… if it is proportionate to do so”. At this early stage, precisely what is meant by “proportionate” is yet to be defined by Ofcom (or tested by the courts), but it is anticipated that the most well-known social media platforms will be expected to moderate, and remove, harmful content.

Service providers must now comply with a range of reporting duties and an obligation to undertake regular risk assessments.

Ofcom is responsible for overseeing compliance with the rules. The Act grants Ofcom the power to demand information from relevant companies, to enter companies’ premises to access computer equipment and data, and to request interviews with company employees. Company executives could face prosecution for failing to comply with an Ofcom information request. Offences also extend to companies that suppress or destroy information or provide false information at interview.

Ofcom is also granted powers of enforcement. It will be able to implement business disruption measures and issue enforcement notices against companies that do not comply with the Act. Most notably, Ofcom has the power to issue fines against companies of up to £18 million or 10% of global annual turnover, whichever is higher. To put that into context, TikTok generated an estimated $9.4 billion in revenue in 2022.

The process is ongoing. Ofcom will need to categorise service providers and draft additional requirements for large providers, and this is not likely to be completed until 2025. We will therefore not see fines being handed out for some time yet, but the looming threat will no doubt focus the minds of the large social media sites and video-sharing apps on vastly improving the monitoring and deletion of harmful posts and videos.

About the Authors
Henry Watkinson
Hanna Basha