The Digital Services Act and the role of (social media) platforms

Brussels, 21 August 2020 (DIGITAL SME). In July 2020, the European Commission launched two public consultations on digital services and on the role of platforms as gatekeepers (the so-called “Digital Services Act Package”). The aim of this blog post is to reflect on the EU’s efforts and to discuss the potential impact of stronger platform regulation on the online economy.

This blog post is informed by the fact-finding conducted during the European Commission-funded coordination and support action (CSA) COMPACT.

Why a new Digital Services Act Package?

The current liability framework for online intermediaries (i.e. online service providers such as social media platforms like YouTube or Facebook, but also smaller marketplaces or internet service providers) is governed by the e-commerce directive, which dates back to the year 2000. Under this legal framework, internet service providers and intermediaries are not liable for illegal and/or harmful content, goods, or services distributed via their channels if they fulfil certain conditions: intermediaries are not liable if they remove illegal content, or disable access to it, as quickly as possible once they become aware of its illegal nature, or if they play a neutral, merely technical and passive role towards the hosted content[1].

This limited liability is fundamental to the internet as we know it today, in which platforms, but also smaller websites, may host content or offer products and services without great legal risk. This model has without doubt contributed to the enormous growth and success of the internet over the past 20 years. For example, social media platforms focusing on video could offer freely available content, i.e. content they did not have to create themselves, without spending resources on checking each video. Instead, the platform providers could focus on tailoring content to their users’ preferences by analysing clicks and views, thus strengthening the algorithms that promote certain content. On social media and messaging platforms, the principle is similar: individuals share their personal photographs, thoughts, and opinions on topics ranging from personal exchanges to political views, while the platform does not monitor for illegal content and only intervenes according to community standards that may ban certain types of content (e.g. sexual imagery). Thus, social media platforms are not liable for the content provided but simply act as a host.
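To make the mechanism concrete, the toy sketch below shows what engagement-based ranking of a feed could look like in principle. It is a minimal illustration with made-up item names and weights, not any platform’s actual recommendation system.

```python
# Illustrative sketch only: a toy engagement-based feed ranking.
# The weights and fields are assumptions, not a real platform's algorithm.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    views: int
    clicks: int
    shares: int


def engagement_score(v: Video) -> float:
    """Weight interactions so heavily clicked and shared items rise to the top."""
    # Hypothetical weights: shares count more than clicks, clicks more than views.
    return 1.0 * v.views + 5.0 * v.clicks + 20.0 * v.shares


def rank_feed(videos: list[Video]) -> list[Video]:
    """Order a feed by engagement, as described in the paragraph above."""
    return sorted(videos, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Video("Holiday pictures", views=900, clicks=40, shares=2),
        Video("Cat video", views=1200, clicks=300, shares=80),
        Video("Sensational claim", views=800, clicks=500, shares=200),
    ]
    for v in rank_feed(feed):
        print(v.title, engagement_score(v))
```

The design choice the sketch highlights is that such a ranking optimises for interaction, not for accuracy or harmlessness, which is why sensational content can be promoted as a side effect.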

At the same time, the internet ecosystem as we know it relies to a great extent on advertising revenues. Facebook and Google relied on advertising for over 98 and 85 percent of their respective revenues in 2018[2]. These ad-reliant business models only work if the companies collect their users’ personal data and understand their preferences, behaviour and choices (in order to tailor advertising). This is why social media platforms play a crucial role in this ecosystem: people share personal experiences and traits and reveal private information about themselves, which can be used to understand their preferences better and tailor marketing to them. However, in order to obtain this personal data from users and sell advertising space, the platforms need to make sure that users see the advertisements and reveal personal information, i.e. spend time on their platforms (or on other websites which can be tracked via cookies). This, in turn, only works if users see content that is interesting to them and to others: users need to be attracted by what they see, and they need to be engaged enough to share content themselves with their friends or in specific groups. Besides cat videos and personal holiday pictures, some users tend to share and be attracted to content which can be sensational, extreme or harmful (e.g. violence, hatred, false news).
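As a rough illustration of the data-for-ads loop described above, the following sketch shows how tracked behaviour could be turned into an interest profile that then decides which ad a user is shown. The event names, weights and ad categories are invented for illustration; this is a toy model, not a description of any real ad platform.

```python
# Illustrative sketch only: a toy interest profile built from tracked events
# (e.g. via cookies or tracking pixels) and a naive ad-selection step.
from collections import Counter


def build_interest_profile(events: list[dict]) -> Counter:
    """Count the content categories a user has viewed or clicked."""
    profile = Counter()
    for e in events:
        # Assumed weighting: a click is a stronger interest signal than a passive view.
        weight = 3 if e["action"] == "click" else 1
        profile[e["category"]] += weight
    return profile


def pick_ad(profile: Counter, ads: dict[str, str]) -> str:
    """Serve the ad whose category matches the user's strongest interest."""
    for category, _ in profile.most_common():
        if category in ads:
            return ads[category]
    return "generic brand ad"


if __name__ == "__main__":
    tracked = [
        {"category": "travel", "action": "view"},
        {"category": "travel", "action": "click"},
        {"category": "pets", "action": "view"},
    ]
    profile = build_interest_profile(tracked)
    print(pick_ad(profile, {"travel": "cheap flights ad", "pets": "cat food ad"}))
```

The more time users spend on the platform, the more events feed the profile, which is the economic link between engagement and advertising revenue discussed above.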

What does the Digital Services Act propose?

Formally, the consultation on the Digital Services Act is split into two parts: (1) a review of the e-commerce directive and (2) ex-ante regulation of digital platforms with a gatekeeper role. On top of that, the European Commission is investigating the need for an ex-ante competition tool aimed at addressing structural competition problems in digital markets.

In the public consultation, different types of harms associated with platforms are addressed in a set of questions. Next to the aspects mentioned above, the consultation focuses on the safety of users online, ranging from illegal goods (e.g. dangerous products) and content (e.g. violence, hate speech) to services or practices infringing consumer law. The consultation also addresses the following topics: reviewing the liability regime of digital services acting as intermediaries; the gatekeeper power of digital platforms; other emerging issues and opportunities, including online advertising and smart contracts; challenges around the situation of self-employed individuals offering services through online platforms; and governance aimed at reinforcing the Single Market for digital services[3].

To put it simply, the Digital Services Act aims at tackling two main issues which have been associated with large social media platforms: the spread of hate speech and the associated harms for society, public discourse and democracy; and the dominance of gatekeeper platforms in certain markets. These two aspects appear entangled for two reasons: (1) social media platforms strongly rely on advertising revenues for their business models, so it is essential for them to generate and drive traffic; (2) due to network effects and closed proprietary environments, digital markets tend to concentrate more easily, leading to the dominance of certain platforms which can act as gatekeepers. This may create hurdles for competitors trying to enter the same market, but the platforms also set the rules for the participants/users of the platform. Sometimes, these rules are set unfairly, leading to privacy/data protection concerns in the terms of use (which users need to accept anyway, as the platforms offer them unique access to a network) or to unfair terms and conditions for business users.

What have we learnt from the research and symposia conducted in the past two years?

During the past two years, the COMPACT symposia held in different EU countries have gathered technology experts, policy makers and researchers to discuss topics of social media convergence. The project’s core aim was to disentangle the information disorder and the effects of new technologies on social media and society. Over a host of debates, participants and the wider public were offered food for thought on these topics, accompanied by the project’s research results on standardisation, court rulings and the mapping of regulatory approaches to disinformation in social media.

For instance, one part of the project focused on mapping governance initiatives in the EU that addressed issues of disinformation. It examined whether bottom-up disinformation-debunking initiatives had grown, whether governments had initiated such initiatives, or whether any regulation had been passed to control hate speech or disinformation. A study conducted as part of the project showed that the majority of information governance initiatives in the EU countries examined are carried out by non-governmental or civil society actors. Few EU countries have passed legislation to tackle the issues identified above (with some exceptions, e.g. the NetzDG in Germany or similar initiatives in France). The study also found that these types of initiatives were very nationally oriented in focus.

Would you like to know more about social media convergence and the results of other studies? Please check out the project website here.

What could be the way forward?

Governance of the internet and the regulation of platforms are complex, as platforms act at the intersection of different social activities (economic activities, private exchanges and conversations, media and news consumption) and of formerly separate areas of regulation. How the distribution of media content should be governed, so that violent content is limited (e.g. not accessible to children) while freedom of expression and speech is ensured, has traditionally been the role of media regulation (i.e. national rules, but also at European level via the Audiovisual Media Services Directive (AVMSD)), while questions of concentration and dominance are matters of competition (i.e. economics and competition law). However, as explained above, certain characteristics of social media platforms, e.g. the reliance on advertising revenue, seem, in combination with the current e-commerce framework, to lead to negative effects for society, be it the spread of hate speech, the influence on elections via microtargeting, or negative effects on consumer choice.

In order to revamp the e-commerce directive and to address the main issues and harms associated with large platforms, there is a need to disentangle these relationships and interactions, and to come up with a simple and smart way of regulating that fosters a healthy digital ecosystem.

First, the Commission could propose clear rules on the responsibilities of large platforms with a focus on media content and news. At the same time, smaller platforms should not be hindered in their development, so regulation needs to be adapted to different sizes and impacts as well as to potential harm. Second, these platforms and services are proprietary and limit a healthy competitive environment. As a remedy, the Digital Services Act package could introduce interoperability requirements for large platforms, which could solve some of the issues associated with the gatekeeper role of platforms.
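To illustrate what such an interoperability requirement could mean in practice, the hypothetical sketch below assumes a minimal common interface that large platforms would have to expose, so that content can be read from one service and published to another. The interface, class and method names are invented for illustration and do not refer to any existing standard or API.

```python
# Illustrative sketch only: a hypothetical common interface for interoperable platforms.
from typing import Protocol


class InteroperablePlatform(Protocol):
    """Assumed minimal interface a regulated platform would expose."""
    def publish(self, author: str, text: str) -> str: ...
    def fetch(self, post_id: str) -> dict: ...


class ExamplePlatform:
    """Stand-in for one platform implementing the shared interface."""
    def __init__(self, name: str):
        self.name = name
        self._posts: dict[str, dict] = {}

    def publish(self, author: str, text: str) -> str:
        post_id = f"{self.name}-{len(self._posts) + 1}"
        self._posts[post_id] = {"author": author, "text": text}
        return post_id

    def fetch(self, post_id: str) -> dict:
        return self._posts[post_id]


def cross_post(source: InteroperablePlatform, target: InteroperablePlatform, post_id: str) -> str:
    """A competitor or third party can move content between services via the common interface."""
    post = source.fetch(post_id)
    return target.publish(post["author"], post["text"])


if __name__ == "__main__":
    a, b = ExamplePlatform("A"), ExamplePlatform("B")
    original_id = a.publish("alice", "Hello from platform A")
    print(cross_post(a, b, original_id))  # content now also available on platform B
```

The point of the sketch is simply that, with a common interface, users and smaller competitors would no longer be locked into a single gatekeeper’s network.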

If you would like to voice your opinion on this topic, please fill in our survey about the Digital Services Act here!

[1] See: https://ec.europa.eu/digital-single-market/en/e-commerce-directive#:~:text=business%20and%20citizens.-,The%20e%2DCommerce%20Directive,liability%20of%20intermediary%20service%20providers.

[2] https://www.visualcapitalist.com/how-tech-giants-make-billions/

[3] https://ec.europa.eu/digital-single-market/en/news/consultation-digital-services-act-package
