The recently adopted Digital Services Act (DSA) defines a category of very large online platforms, which differ from ordinary platforms in their reach: they have, on average, more than 45 million users per month in the EU. Because of this reach, they have a significant impact on online safety, commerce and the formation of public opinion. The Regulation therefore imposes a number of additional obligations on these entities.
The Digital Services Coordinator of each Member State regularly verifies the number of users of online platforms within its jurisdiction. The determination that an entity is a 'very large online platform' results in its inclusion by the Commission in a public list of such platforms and subjects it to the specific obligations described below. This status will undoubtedly be attributed to digital giants such as Meta (Facebook), Twitter or Booking.com.
Above all, as the legislator points out, despite their enormous impact on society, platforms focus mainly on profit and thus fail to recognise and analyse the risks of negative socio-economic phenomena that their activities may cause. Therefore, in addition to the obligations imposed on all other digital service providers, the DSA requires very large platforms to carry out an annual risk analysis. The risks to be assessed include the dissemination of illegal content by users (e.g. hate speech or the sale of illegal products), the violation of the fundamental rights of EU citizens (e.g. through activities on the platform that restrict competition) and the use of platforms for deliberate manipulation (e.g. the use of bots to spread fake news for political purposes).
Risk mitigation and external audits
Platforms should not only analyse the risks associated with their activities but also proactively counter the negative effects of those activities. First of all, they should adjust their content moderation and recommendation systems so that harmful information does not spread. In addition, the DSA mentions measures such as depriving such content of advertising revenue and, for the sake of balance, improving the visibility of credible sources of information. The steps taken must also be proportionate: they should not result in unnecessary restrictions on the use of the services. Finally, the DSA encourages platforms to develop codes of conduct, on their own or jointly with other platforms, in order to implement the most effective solutions.

In addition, very large platforms should regularly submit their operations in the EU to external audits by independent expert bodies. The purpose of the audit is to verify, among other things, whether platforms comply with the obligations imposed on them by the DSA. In the event of negative audit results, an investigation may be initiated by the Digital Services Coordinator.