07/07/22
Twenty years after the implementation of the e-Commerce Directive - drafted in a period when smartphones had only just been introduced, cloud computing was in its infancy and many large social platforms did not yet exist - it is time for a thorough revision of the rules for online platforms. PwC experts Ilse van Wendel de Joode, Joey Keppel and Julia le Fèvre discuss the main features of the Digital Services Act (DSA) and its consequences for online platforms that offer their services in the European Union.
The DSA, together with the Digital Markets Act (DMA), is part of the Digital Services package of the European Commission (EC). This package seeks to create a safer digital space in which the fundamental rights of users are protected and businesses compete on a level playing field. The DMA introduces a new regulatory mechanism to ensure fair competition in the markets in which large online platforms and their customers operate. Topical issues alongside the Digital Services package include the protection of labour relations and regulations to increase fiscal transparency in the digital economy (DAC-7).
The DSA relates to the day-to-day operations of providers of intermediary services, including online platforms. The new act provides a better mechanism for removing illegal content and protects the fundamental rights of consumers in the online world. The DSA also provides a clear framework for transparency, promotes innovation, growth and competition, and ensures stricter government supervision of the larger platforms.
The DSA does not cover the online platform industry as a whole, but focuses on the providers of intermediary services, i.e. the part of the industry that engages in the transmission, storage and public dissemination of content generated by users.
The users or recipients of the services - both natural and legal persons - must have a place of residence or an establishment in the European Union (EU). This means that the DSA also applies to organisations that are not established in the EU but do have users in the EU or activities aimed at the EU. In general, providers of social media platforms, messaging services, review websites, internet service providers (ISPs) and digital 'open' marketplaces will fall under the DSA, but providers of mere film and video streaming services will not. Platforms where individuals can post videos or sell products themselves also fall within the scope of the DSA.
The DSA has a pyramidal structure addressing four categories of organisations (see figure 1). Rights and obligations apply cumulatively to successive categories: the first category, 'intermediary services', is subject to a set of basic rules, and each successive category must comply with those same basic rules plus the rules applicable to its own category (see figure 2).
Figure 1: Scope of application of the Digital Services Act
According to the DSA, intermediary services are service providers that offer a 'mere conduit' or caching service. Think of parties that provide the networks over which content can be transmitted, such as ISPs.
The second category concerns hosting services, which also store the information provided by the user ('hosting service providers'). These are organisations that provide the IT infrastructure for, among other things, websites, cloud services and e-mail.
The third and fourth categories concern online platforms, which store the user's information and disseminate it publicly at the user's request. Think of social media platforms, music and video services where individuals can post content themselves, online marketplaces, search engines, and platforms that link customers to drivers for transport, to homeowners for accommodation or to restaurants for meal delivery.
Very large online platforms (more than 45 million users in Europe) have additional obligations, including crisis response and external audits. The number of users determines whether a platform belongs to category 3 or 4. In addition to all applicable rights and obligations under the DSA, the rights and obligations under the DMA may also apply to this category of platforms. This is the case if they offer core platform services and have been designated as gatekeepers under the DMA.
It is important to assess in which category or categories your organisation falls, because this determines which rights and obligations apply to it.
| Cumulative obligations | Intermediary services | Hosting services | Online platforms | Very large online platforms |
| --- | --- | --- | --- | --- |
| Transparency report | x | x | x | x |
| Conditions with respect to fundamental rights | x | x | x | x |
| Cooperation with national authorities if ordered to | x | x | x | x |
| Contact points and, if necessary, legal representative | x | x | x | x |
| Notification and action obligations to inform users | | x | x | x |
| Reporting of criminal offences | | x | x | x |
| Complaints and redress mechanism and out-of-court dispute settlement | | | x | x |
| Trusted flaggers | | | x | x |
| Measures against false notifications and counter-notifications | | | x | x |
| Special obligations for marketplaces, e.g. vetting of third-party vendor credentials ('Know Your Customer'), compliance by design, sampling | | | x | x |
| Prohibition of targeted advertising to children and advertising based on special characteristics of users | | | x | x |
| Transparency of recommendation systems | | | x | x |
| Transparency of online advertising for users | | | x | x |
| Obligations in the field of risk management and crisis response | | | | x |
| External, independent auditing, internal compliance function and public accountability | | | | x |
| User choice not to receive recommendations based on profiling | | | | x |
| Exchange of data with authorities and researchers | | | | x |
| Codes of conduct | | | | x |
| Crisis response coordination | | | | x |
Figure 2: Cumulative obligations for the categories of organisations
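To make the cumulative logic of figure 2 concrete, the sketch below models it in Python, purely by way of illustration. The category names, the 45-million-user threshold and the (abridged) obligation lists are taken from the figures above; all identifiers and the classification function itself are our own hypothetical constructs, not part of the DSA.

```python
# Illustrative sketch only: a simplified model of the DSA's cumulative
# pyramid. Category names, the 45-million-user threshold and the
# (abridged) obligation lists follow figures 1 and 2; all identifiers
# are hypothetical.

VLOP_USER_THRESHOLD = 45_000_000  # 'very large online platform' cut-off

# Obligations introduced at each level of the pyramid (abridged),
# ordered from the broadest category to the narrowest.
OBLIGATIONS_BY_CATEGORY = {
    "intermediary_service": [
        "transparency report",
        "conditions with respect to fundamental rights",
        "cooperation with national authorities if ordered to",
        "contact points and, if necessary, legal representative",
    ],
    "hosting_service": [
        "notice and action mechanism",
        "reporting of criminal offences",
    ],
    "online_platform": [
        "complaints and redress mechanism",
        "trusted flaggers",
        "'know your customer' checks for traders",
    ],
    "very_large_online_platform": [
        "risk management and crisis response",
        "external independent auditing",
        "data exchange with authorities and researchers",
    ],
}

PYRAMID = list(OBLIGATIONS_BY_CATEGORY)  # broadest to narrowest


def applicable_obligations(category: str, eu_users: int) -> list[str]:
    """Return the cumulative obligations for a provider.

    A provider in a given category inherits all obligations of the
    categories below it; an online platform with more than 45 million
    EU users counts as a very large online platform.
    """
    if category == "online_platform" and eu_users > VLOP_USER_THRESHOLD:
        category = "very_large_online_platform"
    level = PYRAMID.index(category)
    obligations: list[str] = []
    for lower_category in PYRAMID[: level + 1]:
        obligations.extend(OBLIGATIONS_BY_CATEGORY[lower_category])
    return obligations


# Example: a marketplace with 50 million EU users is a very large
# online platform and must meet all four layers of obligations.
print(applicable_obligations("online_platform", 50_000_000))
```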
The first set of obligations in the DSA applies to all categories of organisations in the pyramid. Because of the broad scope of category 1, the biggest changes are in the areas of liability, illegal content, the adjustment of general terms and conditions, transparency and permanent contact points.
Online service providers that take action on their own initiative to block illegal content can still, to a certain extent, rely on a limitation of their liability. Previously, providers of 'mere conduit', 'caching' or 'hosting' services were sometimes reluctant to take such action on their own initiative, because they wanted to avoid becoming liable if their checks for illegal content failed. Because the DSA now stipulates that these service providers can still rely on a limitation of their liability in such cases, this type of action is no longer discouraged.
In addition, organisations must comply with orders from the authorities regarding (the removal of) illegal content. Banning illegal content is subject to transparency obligations and the assessment process must be public, so that online providers cannot remove content arbitrarily. In the general terms and conditions ('terms of use'), providers must state all restrictions that can be imposed on the recipients of the service. They must, for example, specify what the provider regards as illegal content and may therefore remove. The terms of use may also provide for procedures or measures with regard to this 'content moderation'.
There is also a duty of care, requiring providers to publish an annual transparency report. In this report, providers must state to what extent they have engaged in 'content moderation' and how they have acted on orders from the authorities regarding illegal content. Additional transparency reporting obligations apply to (very large) online platforms, which we explain below.
Finally, all online providers must designate a central contact point for direct communication with national authorities, the European Commission and the European Board for Digital Services. Providers must make this contact information publicly available to facilitate communication and, indirectly, the implementation of the DSA. If a service provider is not established in the EU but does offer services there, it must designate a legal representative, who can also act as the contact point.
Hosting services - the second category in the pyramid - and consequently also (very large) online platforms have the obligation to set up a 'notice and action' mechanism. This means that, upon a sufficiently substantiated notification by a third party, they must promptly remove illegal content - for example, copyrighted material or information that defames individuals - or contact the notifier to obtain more information about the posted content. Your organisation will have to develop procedures that make it easy for third parties to submit notifications. You will also have to train your staff to handle, assess and check such notifications and to explain your response to the notifier.
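By way of illustration only, such a triage could look like the sketch below. The Notification record and the substantiation test are our own assumptions; the DSA prescribes the possible outcomes (remove the content or request more information, and inform the notifier), not any particular implementation.

```python
# Illustrative sketch of a 'notice and action' triage; the Notification
# record and the substantiation test are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class Notification:
    notifier_email: str
    content_url: str   # which content the notification concerns
    explanation: str   # why the notifier considers that content illegal


def handle_notification(notice: Notification) -> str:
    """Decide the follow-up action for an incoming notification."""
    # A notification is actionable only if it is sufficiently
    # substantiated: here, simplistically, if it points to specific
    # content and explains why that content would be illegal.
    substantiated = bool(notice.content_url and notice.explanation.strip())

    if not substantiated:
        # Contact the notifier to obtain the missing information.
        return f"request more information from {notice.notifier_email}"

    # Remove (or block) the content and explain the decision to the
    # notifier, so the response is transparent and can be challenged.
    return (
        f"remove {notice.content_url} and send a statement of reasons "
        f"to {notice.notifier_email}"
    )


# Example: an unsubstantiated notification triggers a follow-up question.
print(handle_notification(
    Notification("user@example.com", "https://example.com/post/1", "")
))
```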
The rights and obligations for the third and fourth categories of the pyramid - online platforms and very large online platforms - include, first of all, rules regarding so-called 'trusted flaggers'. Trusted flaggers are entities that meet certain criteria, so that their notifications of illegal online content on platforms may be considered reliable. Think of recognised foundations and other organisations whose objective is to combat child pornography or online fraud, or institutions with a public duty, such as Europol.
Secondly, platforms must give more insight into why they remove certain information from the platform, and the users who posted that information must be able to challenge removal decisions. Online platforms will have to set up an internal complaints handling system for this purpose.
Thirdly, in addition to the aforementioned general transparency obligations, which apply to all organisations (the annual transparency report), online platforms will have to report their number of users every six months, so that it can be assessed whether they fall into the category of 'very large online platform'. In line with this increased transparency obligation, online platforms will also have to improve their checks of the reliability of traders using their platforms. Platforms that allow professionals or individuals to offer products or services to consumers will be obliged to register the trader's personal data and contact information before the trader can use the platform. Consumers of the product or service will be able to see some of this data. This means that online platforms must in effect comply with a 'know your customer' obligation with respect to traders.
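As a minimal sketch of what such a 'know your customer' check could look like in practice: the TraderRegistration record and its fields below are our own assumptions; the DSA itself determines which trader details must actually be collected.

```python
# Illustrative sketch of vetting a trader before marketplace access;
# the TraderRegistration record and its fields are hypothetical.

from dataclasses import dataclass


@dataclass
class TraderRegistration:
    name: str
    address: str
    email: str
    phone: str
    trade_register_number: str


def may_use_platform(reg: TraderRegistration) -> bool:
    """Activate a trader account only once all required details are on file."""
    required = [reg.name, reg.address, reg.email, reg.phone,
                reg.trade_register_number]
    return all(field.strip() for field in required)


# Example: a registration with a missing phone number is rejected.
print(may_use_platform(TraderRegistration(
    "Shop BV", "Main Street 1, Amsterdam", "shop@example.com", "",
    "12345678",
)))
```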
Although the legislative text is not yet final, the rights and obligations are already clear in outline. On 5 July, the European Parliament adopted the DSA as agreed upon with the Council of the European Union.
The final legislative text - which has been before the European Parliament and the EU Member States since May 2022 - must still be formally approved by the Council of the European Union. This is expected in September 2022.
Next, there will be an implementation period. Enforcement will start in the first quarter of 2024 at the earliest. This is different for very large online platforms: for them, enforcement will start at an earlier stage, namely four months after they have been designated as a 'very large online platform' by the authorities.
Given the large and ever-increasing role that online platforms and services play in our economy, the scope and application of the DSA will be extensive. To prepare for its entry into force, you can take the following steps: