Insights Ofcom publishes information on how it is approaching the online safety risk assessments that platforms and search services will have to carry out under the new online safety regime

Ofcom explains that risk assessments will be a fundamental part of the new online safety regime and will require online platforms and search services to have a clear understanding of the risks of harm to their users, together with effective risk management processes. Ofcom’s information document sets out its proposed approach to risk and how it intends to support services in carrying out their assessments.

Ofcom notes that many details of the legislation, including proposed amendments, are still being debated in Parliament, meaning that future regulatory requirements for online services are still liable to change. However, it says, the future online safety regime will require regulated services to better understand the risk that users will encounter illegal content or content harmful to children online.

The Bill will require all firms in scope of regulation to carry out a risk assessment of illegal content that may appear on their service, ranging from online fraud to terrorism. Services that are likely to be accessed by children will also have to carry out a risk assessment covering content that is harmful to children, which is likely to include material such as pornography and content promoting eating disorders. Ofcom will produce guidance to help services meet these requirements.

As a cornerstone of the regime, it will be important for regulated services to understand what good practice looks like in risk assessment and risk management. While it will be up to services to carry out their own assessments, Ofcom sees its role as the future regulator as being to provide guidance that helps services focus on illegal content and content that is harmful to children, explains how such content might appear on their service, and promotes good practice in risk management as a fundamental part of service design and organisational culture.

Ofcom says that a good risk assessment should help a service anticipate and address the ways in which its users could be exposed to greater risks of harmful content. Services should ask questions such as:

  • How does the service’s user base affect this risk; for example, do large numbers of child users in the UK increase the risk of exploitation?
  • How do the functionalities of the service affect risk; for example, does offering stranger pairing increase the risk of romance fraud?
  • What effect does the service’s business model have; for example, how can a service’s financial incentives under a given revenue model increase the risk of hosting harmful content?

Once the Bill passes, Ofcom will consult on its approach to risk assessments, giving services and other stakeholders an opportunity to review its plans in full and provide feedback. Ofcom says that it is committed to providing comprehensive guidance to services on what risks they need to look out for, how they should assess those risks and what they should do about them. In addition to Ofcom’s engagement with industry and civil society organisations, the regulator hopes that these early indications of its proposed approach will help services prepare for the new regulatory requirements under the Online Safety Bill. The information document is available in full on Ofcom’s website.