Illegal Content Codes of Practice: Ofcom consults on additional measures

Ofcom has published a consultation on proposals for additional safety measures intended to strengthen its Illegal Content Codes of Practice (which we have discussed previously here).

The consultation is a detailed and dense document, running to just under 300 pages and setting out a range of proposed changes (for those short on time, there is thankfully a helpful summary towards the beginning). In addition to proposing further measures on user sanctions and age assurance, the consultation focuses on three particular areas: (1) addressing risks arising from livestreaming, (2) ensuring that certain services have a crisis response protocol, and (3) introducing greater use of proactive technology such as so-called ‘hash matching’.

Livestreaming is identified as posing particular risks from an online safety perspective, especially for children. While there are already measures in the Codes that address some of these risks, Ofcom argues that additional steps are needed. These include requiring all services offering livestreaming to make it easy for users to report livestreams that depict imminent physical harm, and to have human moderators available to review content and take action in real time. The consultation also proposes that where children are able to livestream, other users should not be able to interact with them, for example by posting comments, reacting, or sending gifts.

Considerable attention is also given to so-called ‘proactive technology’, which can be used at scale to analyse and detect suspected illegal content. Ofcom proposes that certain service providers should assess whether accurate and effective proactive technology exists for detecting illegal content and, if so, use it as part of their content moderation systems. One tool identified in particular, to be employed for terrorism and intimate image abuse content, is hash-matching which, as the consultation explains, “involves creating a hash (an identifying series of characters) of a unique piece of content, storing that hash in a database, and using an algorithm to detect attempts to upload the same or similar versions of that content”.
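By way of illustration only, the short Python sketch below shows the exact-match flavour of this idea: hashes of previously identified content are stored, and each attempted upload is hashed and checked against that store. The function names and the in-memory ‘database’ are hypothetical, and a cryptographic hash is used purely for simplicity; deployed hash-matching systems typically rely on perceptual hashing (such as Microsoft’s PhotoDNA or Meta’s PDQ) and shared industry hash databases so that edited or re-encoded copies of the same material are also detected.

```python
import hashlib

# Hypothetical in-memory "hash database" of known illegal content.
# Real deployments draw on curated databases maintained by bodies such
# as the Internet Watch Foundation, rather than a local set.
known_hashes: set[str] = set()


def compute_hash(content: bytes) -> str:
    """Derive an identifying series of characters (a hash) from content.

    SHA-256 only catches byte-identical copies; production systems use
    perceptual hashes so that resized or lightly edited versions still
    match within a similarity threshold.
    """
    return hashlib.sha256(content).hexdigest()


def register_known_content(content: bytes) -> None:
    """Store the hash of confirmed illegal content in the database."""
    known_hashes.add(compute_hash(content))


def check_upload(content: bytes) -> bool:
    """Return True if an attempted upload matches known content."""
    return compute_hash(content) in known_hashes


# Example: a previously identified item is registered, and a later
# attempt to upload the same bytes is flagged for moderation.
register_known_content(b"example known content")
assert check_upload(b"example known content") is True
assert check_upload(b"unrelated content") is False
```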

Finally, Ofcom recommends that services of a particular size or risk profile should have a crisis response protocol in place to prevent the rapid spread of illegal content, such as occurred during the riots in the UK last summer. This would include having protocols to identify when a crisis has arisen and what steps the service will take, as well as dedicated channels for liaising with law enforcement authorities.

The consultation closes on 20 October 2025 and can be read in full here.