Government amends Online Safety Bill to protect free speech

The Government says that its amendments to the Bill “go further than before to shield children”, while “protect[ing] free speech online”. Social media firms will still need to protect children and remove content that is illegal or prohibited, but the Bill will no longer define specific types of legal content that they must address, removing any incentive for platforms to “over-remove” legal online content.

The Government says that the changes remove any influence future governments could have on what private companies do about legal speech on their sites and ensure there is no risk that companies are motivated to take down legitimate posts to avoid sanctions.

The Government says that new measures will also be added to make social media platforms more transparent and accountable to their users. They will be legally required to provide a “triple shield” for consumers: (i) removing illegal content; (ii) taking down material in breach of their own terms of service; and (iii) providing adults with greater choice over the content they see and engage with.

Duties relating to “legal but harmful” content accessed by adults will therefore be removed from the legislation and replaced with the “triple shield”. The Bill will instead give adults greater control over online posts they may not wish to see on platforms. Where users are likely to encounter certain types of content that do not meet the criminal threshold, such as the glorification of eating disorders, racism, antisemitism or misogyny, internet companies will have to offer adults tools to help them avoid such content. These could include human moderation, blocking content flagged by other users, or sensitivity and warning screens.

The Bill will also now explicitly prohibit social media platforms from removing or restricting user-generated content, or suspending or banning users, where this does not breach their terms of service or the law. In addition, they will need to have clear, easy to understand and consistently enforced terms of service.

Firms will also have to publish more information about the risks their platforms pose to children, so that people can see what dangers sites really hold. They will also be required to show how they enforce their user age limits to stop children circumventing authentication methods, and to publish details of when the regulator Ofcom has taken action against them.

The Government says that the changes refocus the Bill on its original aims to: (i) protect children and tackle criminal activity online while preserving free speech; (ii) ensure that tech firms are accountable to their users; and (iii) empower adults to make more informed choices about the platforms they use.

The changes follow confirmation that the Bill will include measures making significant changes to the UK’s criminal law to increase protections for vulnerable people online, by criminalising the sharing of people’s intimate images without their consent.

The criminal offence of controlling or coercive behaviour will also be added to the list of priority offences in the Bill. This means platforms will have to take proactive steps, such as putting in place measures that allow users to manage who can interact with them or their content, instead of only responding when this illegal content is flagged to them through complaints.

In addition, the Victims’ Commissioner, Domestic Abuse Commissioner and Children’s Commissioner will be added as statutory consultees in the Bill, meaning Ofcom must consult with each when drafting the codes that tech firms must follow to comply with the Bill. To read the Government’s press release in full, click here.