Protecting children from harmful videos: Ofcom publishes report on TikTok, Snap and Twitch

Video sharing platforms (“VSPs”) are regulated under the UK Communications Act 2003. VSPs include video sharing services, social media sites that allow audiovisual content to be shared, and live-streaming audiovisual services such as pornography websites. Under the rules, VSPs must protect minors from videos that may impair their physical, mental or moral development, and protect the general public from incitement to violence or hatred and from content constituting criminal offences relating to terrorism, child pornography, and racism and xenophobia. VSPs are required to achieve this through a range of measures, including requirements in user terms and conditions, functionality allowing users to report issues, age-verification systems, rating systems, parental controls and media literacy measures. The provisions target the systems that VSPs establish to prevent harm, rather than the harmful content itself. VSPs that fall within the UK regime must also register with the regulator, Ofcom. Eventually, the VSP rules under the Communications Act will be repealed and replaced by the Online Safety Act 2023.

On 14 December 2023, Ofcom published a report following its review of how TikTok, Twitch and Snap set, enforce and test the measures they put in place to protect children from encountering potentially harmful videos. Amongst other things, Ofcom found that all three services enforce their minimum age of 13 by relying on users declaring their correct age at account sign-up. However, Ofcom found that it was easy to enter a false age, and that no account sign-up was required at all to watch videos on Twitch. All three services use further measures, such as AI and human moderators, to enforce their age restrictions by identifying underage accounts, but the effectiveness of these methods has yet to be established. All three services have systems for classifying and labelling content that is not suitable for children, and TikTok and Snap offer parental controls, but Ofcom states that, without access controls and corresponding effective safety measures, children may still encounter harmful content.

Ofcom states that it will continue to engage with the platforms but that, where it is not satisfied they are meeting their statutory duties, it will take enforcement action. In its press release announcing the report, Ofcom also announced that it is opening an investigation into whether TikTok has failed to comply with its duty, under the Communications Act, to provide information in response to a formal request from Ofcom. The request sought information concerning TikTok’s parental control system. Ofcom expects to provide an update on this investigation by February 2024.
