EU Child Sexual Abuse Regulation update

When Wiggin last reported on the European Commission’s 2022 proposal for a Regulation to prevent and combat child sexual abuse in November 2023, the European Parliament’s lead Committee on the proposal (LIBE) had published its proposed amendments to the Commission’s text. Those amendments were subsequently confirmed by the European Parliament in plenary. The next step would have been for the Council of the EU to agree its position. However, to date it has been impossible for the Council, comprising representatives of the Member States, to reach an agreement, and the proposed legislation remains blocked.

One of the main areas of contention relates to detection orders. The Commission’s original text proposed that the authorities would have the power to issue detection orders where there is evidence of a significant risk of a service being used for the purposes of online child sexual abuse (“online CSA”). Such an order would require the provider, during the period specified by the order, to take certain specified measures to install and operate technology for the detection of online CSA. Online CSA includes child sexual abuse material (CSAM) and the solicitation of children for sexual purposes (grooming), and CSAM includes both “new” CSAM (i.e. previously unknown material) as well as “known” CSAM (material previously detected and confirmed as constituting CSAM, much of which is contained in databases against which searches can be made).

The Parliament has proposed to amend those provisions so that detection orders: should be a last resort when all mitigation measures have been exhausted (or have not been implemented); should only apply to CSAM and not to the solicitation of children; must be “targeted” to individual users or a group of users where there are “reasonable grounds of suspicion” of a link with CSAM (the significant risk test has been deleted); and should not apply to end-to-end encrypted communications. According to a progress report from the Council (15 December 2023), the proposal being considered by the Council was that detection orders should only be considered as a measure of last resort when mitigation measures are considered insufficient, and that they should be targeted to “specific types of users” where possible. Additional safeguards for the technologies to be used for detection, and requirements to safeguard encryption, were also considered. Further, it was proposed that detection orders should be limited to known CSAM, whilst orders for the detection of new CSAM and the solicitation of children could only start to apply in the future, once detection technologies are considered sufficiently reliable. However, members of the Council were unable to reach agreement and, therefore, to date, the Council has no mandate to enter into negotiations with the Parliament to agree a compromise text.

On 13 February 2024, the European Data Protection Board (“EDPB”), an independent body tasked with ensuring consistent application and enforcement of data protection law across the EEA, and which is composed of the heads of the national supervisory authorities, issued a Statement. The EDPB welcomes the Parliament’s exemption of end-to-end encryption from detection orders. However, the EDPB regrets that the Parliament did not limit detection orders to known CSAM. In the EDPB’s view, detection of new CSAM and grooming goes beyond what is necessary and proportionate, given the intrusiveness, probabilistic nature and error rates associated with such detection technologies. Further, the EDPB finds that the text proposed by the Parliament does not fully resolve the issues previously flagged by the EDPB relating to the general and indiscriminate monitoring of private communications. Specifically, it is not clear how the detection orders are to be “targeted” and when “reasonable grounds for suspicion” should be deemed to exist.

The proposal will now have to wait for the EU’s new political mandate, which means that it cannot be picked up again until later this year. If and when it is, it remains to be seen whether the legislators can achieve a balance between privacy and the need to prevent and combat online child sexual abuse.
