Ofcom consults on illegal harms requirements under the Online Safety Act 2023 (“OSA”)

Under the OSA, which became law on 26 October 2023, companies that allow users to interact or to upload and share content (user-to-user, or “U2U”, services) must take steps to prevent access to, and to remove, illegal content. Illegal content includes content relating to terrorism offences, child sexual exploitation and abuse (“CSEA”) (including grooming and child sexual abuse material (“CSAM”)), encouraging or assisting suicide or serious self-harm, and hate offences. Such services must also prevent children from seeing content that is harmful to them but not necessarily illegal (e.g. content promoting self-harm and eating disorders). Specifically, the Act requires providers to assess the risk of harm to all users from illegal content and, if their service is likely to be accessed by children, the risk to children from content that is harmful to them. Providers must take effective and proportionate steps to mitigate and manage the risks identified, make clear in their terms of service how users will be protected, and provide user reporting and appeals processes, balancing all such obligations against the importance of free speech and privacy. Similar duties are imposed on providers of internet search services, and providers of online pornography must take steps to ensure that children cannot normally access such content.

Ofcom, the regulator under the Act, will have powers to impose large fines for the most serious infringements (up to £18m or 10% of qualifying worldwide revenue, whichever is greater), and certain infringements of the Act will amount to a criminal offence, with directors and senior managers facing personal criminal liability. Infringements will consist of failures to take the various risk assessment and mitigation measures required under the OSA; Ofcom will not enforce through decisions based on individual pieces of content or complaints, as it does in other areas of its remit.

Most of the substantive provisions of the OSA cannot come into force until Ofcom produces certain Codes of Practice and guidance. On 9 November 2023, it issued a consultation focusing on the illegal content duties. The OSA lists a number of “priority offences”: U2U service providers must act to prevent, and search services must minimise the risk of, users encountering content that amounts to any of those offences. The consultation, which runs to hundreds of pages, seeks feedback on Ofcom’s Register of Risks (its analysis of the causes and impacts of online harm), guidance for conducting risk assessments (including record keeping and review), how risks are to be mitigated (to be captured in a Code of Practice), guidance on how to make illegal content judgements, and Ofcom’s approach to enforcement. It is worth noting that the Codes of Practice are voluntary, but service providers that follow them will be deemed to have complied with the relevant legal duties under the OSA. Some highlights from the consultation are set out below.

The proposed risk assessment guidance sets out a number of Risk Profiles, based on the type or characteristics of a particular service, which are mapped to tables of the harms (“Risk Factors”) most likely to occur on that kind of service. These derive from Ofcom’s research as set out in its Register of Risks and form one element to be considered during risk assessments. For example, social media services are considered likely to have an increased risk of nearly all kinds of illegal harm, whereas file-storage and file-sharing services are likely to have an increased risk of harm relating to terrorism, CSEA (image-based CSAM) and intimate image abuse offences. Services with content recommender systems are likely to have an increased risk of harm relating to encouraging or assisting suicide or serious self-harm, or hate offences, and services enabling users to search for user-generated content are likely to have an increased risk of harm relating to drugs, firearms, extreme pornography, and fraud and financial services offences.

A major part of the consultation (370 pages) explores the proposed mitigations for illegal harms which may have been identified during a risk assessment. When finalised, these will be written into Codes of Practice (one for U2U services and one for search services), drafts of which are annexed to the consultation (another 100+ pages). It is proposed that the mitigations (which include human and automated moderation, user reporting, default settings, user controls, etc.) will differ depending on whether the service provider is large (an average user base of more than seven million per month in the UK), whether it is considered (based on the risk assessment) medium or high risk for a specific type or types of harm, and/or whether it is considered (based on the risk assessment) multi-risk (medium or high risk for at least two of the 15 kinds of priority harm set out in the OSA).

On timing, in a separately published approach to implementation of the OSA, Ofcom states that it expects to publish a statement of its decisions, with final versions of the Codes and guidance, within a year of the start of this consultation. Services in scope of the OSA will then have three months to conduct their illegal content risk assessments. At the same time, the Codes will be submitted for approval by the Government and, if approved, for subsequent approval by Parliament. Once those approval processes conclude, the illegal harms duties will become enforceable; it is estimated that this will happen around the end of 2024.

As reported by Wiggin previously, a number of further consultations on different aspects of the OSA will follow during 2023, 2024 and beyond, including in relation to guidance on age verification/estimation to support those providers subject to the obligation to prevent children from accessing pornographic content online, and in relation to the various duties under the OSA aimed specifically at the protection of children.

For more information, and to respond to the consultation, see Ofcom’s website; the consultation closes on 23 February 2024.