European Commission designates first set of Very Large Online Platforms and Search Engines under Digital Services Act (DSA) and launches call for evidence on data access for researchers

The Commission has adopted the first designation decisions under the DSA, designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that each reach at least 45 million monthly active users in the EU. These are:

VLOPs: Alibaba AliExpress; Amazon Store; Apple AppStore; Booking.com; Facebook; Google Play; Google Maps; Google Shopping; Instagram; LinkedIn; Pinterest; Snapchat; TikTok; Twitter; Wikipedia; YouTube; and Zalando;

VLOSEs: Bing and Google Search.

The designations are based on the user numbers that these services were required to publish by 17 February 2023.

Following their designation, these companies will now have to comply, within four months, with the full set of new obligations under the DSA. These are aimed at empowering and protecting users online, including minors, by requiring designated services to assess and mitigate their systemic risks and provide robust content moderation tools. This includes:

more user empowerment:

  • users must be given clear information on why they are being recommended certain information and will have the right to opt out from recommendation systems based on profiling;
  • users must be able to report illegal content easily and platforms must process reports diligently;
  • advertisements cannot be displayed based on the sensitive data of a user (such as ethnic origin, political opinions or sexual orientation);
  • platforms must label all ads and inform users who is promoting them;
  • platforms must provide an easily understandable, plain-language summary of their terms and conditions in the languages of the Member States where they operate;

strong protection of minors:

  • platforms must redesign their systems to ensure a high level of privacy, security, and safety of minors;
  • targeted advertising to children based on profiling is not permitted;
  • special risk assessments, including for negative effects on mental health, must be provided to the Commission within four months of designation and made public within 12 months;
  • platforms must redesign their services, including their interfaces, recommender systems, and terms and conditions to mitigate these risks;

more diligent content moderation, less disinformation:

  • platforms and search engines must take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
  • platforms must have clear terms and conditions and enforce them diligently and non-arbitrarily;
  • platforms must have a mechanism for users to flag illegal content and act upon notifications expeditiously;
  • platforms must analyse their specific risks, and put in place mitigation measures, e.g. to address the spread of disinformation and inauthentic use of their service; and

more transparency and accountability:

  • platforms must ensure that their risk assessments and compliance with all DSA obligations are externally and independently audited;
  • platforms must give access to publicly available data to researchers; in the future, a special mechanism for vetted researchers will be established;
  • platforms must publish repositories of all ads on their user interface;
  • platforms must publish transparency reports on content moderation decisions and risk management;
  • within four months of designation, platforms and search engines must adapt their systems, resources, and processes for compliance, set up an independent compliance function, and carry out their first annual risk assessment, which must also be reported to the Commission.

As for risk assessments, platforms must identify, analyse and mitigate a wide array of systemic risks, ranging from how illegal content and disinformation are spread on their services to the impact on freedom of expression and media freedom. Specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated. Risk mitigation plans will be subject to independent audit and oversight by the Commission.

In terms of enforcement, the DSA will be enforced through a pan-European supervisory framework. While the Commission is the competent authority for supervising designated platforms and search engines, it will work in close cooperation with the Digital Services Coordinators within the framework established by the DSA. These national authorities, which are also responsible for the supervision of smaller platforms and search engines, must be established by EU Member States by 17 February 2024. That date is also the deadline by which all other platforms must comply with their obligations under the DSA and provide users with protections and safeguards as set out in the DSA.

To enforce the DSA, the Commission says that it is also bolstering its expertise with in-house and external multidisciplinary knowledge and recently launched the European Centre for Algorithmic Transparency (ECAT). ECAT will support assessments of whether the functioning of algorithmic systems complies with the DSA's risk management obligations. The Commission says that it is also setting up a digital enforcement ecosystem bringing together expertise from all relevant sectors.

The Commission has also launched a call for evidence on data access for researchers under the DSA. These provisions in the DSA are designed to better monitor platform providers’ actions on tackling illegal content, as well as other risks such as the spread of disinformation, and risks that may affect users’ mental health. Vetted researchers will be able to access the data of any VLOP or VLOSE to conduct research on systemic risks in the EU. This means that they could, for example, analyse platforms’ decisions on what users see and engage with online, having access to previously undisclosed data. In view of the feedback received, the Commission will present a delegated act to design an easy, practical and clear process for data access while ensuring adequate safeguards against abuse. The deadline for responding to the consultation is 25 May 2023. To read the Commission’s press release in full, click here.