The Out-of-court dispute settlement mechanism in the Digital Services Act: A disservice to its own goals

Jörg Wimmers, LL.M. (NYU)

Abstract

The Digital Services Act (DSA), proposed by the EU Commission, introduces extensive content moderation rules for online platforms. Under Article 18 DSA, users whose content has been blocked or removed or whose account has been suspended by the platform are entitled to select a certified out-of-court dispute settlement body to resolve their disputes with the service provider. The author describes the context and parties of online speech, examines the conditions and consequences of this redress mechanism, and concludes that the proposed provision is flawed in several ways: it does not approximate divergent regulation, but promotes fragmentation and creates legal uncertainty; it does not provide criteria or standards for the complex factual and legal determinations and balancing of rights in the area of online speech; and with the incentives set by this regulation, it opens the field for a race to the bottom. While out-of-court dispute settlement mechanisms usually aim at a consensual solution, placing emphasis on interests rather than on the legal positions of the parties or on the rights asserted, free speech disputes are strictly normative and do not lend themselves to settlement by private bodies, but are reserved for the judiciary. Moreover, most platforms have already established appeals mechanisms that allow their users a second review. By further extending this redress mechanism to decisions based on the platforms’ community standards, the DSA frustrates the existing ‘flagging’ systems established by the platform providers and thereby does a disservice to its own goals. In the outlook, the author proposes to modernize and build on the existing infrastructure of the judiciary to address the needs of private persons to pursue their rights and to ensure the quality of process and decision, rather than to duplicate the existing court system by adding a redress system of private alternative dispute resolution (ADR) bodies.

1. Introduction*

1

With the out-of-court dispute settlement mechanism in Article 18 of the draft Digital Services Act [1] (in the following “DSA”), the “settlement euphoria” [2] of the European legislature has reached the field of free speech. Directive 2013/11/EU on alternative dispute resolution (ADR) [3] for consumer disputes and Regulation 524/2013 on online dispute resolution for consumer disputes (ODR) [4] laid the groundwork for an easily accessible framework within which consumers can pursue their rights quickly and effectively. While doubts persist as to whether these instruments have accomplished their goals [5], Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services [6] translated the out-of-court redress into the sphere of internet intermediary services and search engines. While these legislative acts all concerned the role of ADR and ODR in online commerce, Directive 2019/790 on copyright and related rights in the Digital Single Market [7] took a view on the effects of “steps taken by online content-sharing service providers in cooperation with rightholders” on the freedom of expression of those users of the platforms who upload their content and called on Member States to ensure that these users have access to out-of-court redress mechanisms. Similarly, Directive 2018/1808 amending the Audiovisual Media Services Directive [8] provides for such out-of-court redress for the settlement of disputes between users and video-sharing platform providers.

2

The European Commission’s proposal for a Digital Services Act follows suit and prescribes in Article 18 an out-of-court dispute settlement mechanism that uploaders may select when their content is blocked or removed by the platform operator, or when their account or the provision of the service is suspended or terminated. Against the backdrop of these predecessors, the proposal appears to be on safe ground. This paper will argue the opposite. Doubts are warranted already in view of the apparent lack of empirical data. The Commission itself admits in its Impact Assessment that with regard to out-of-court ADR systems “there is a level of uncertainty, as no reliable data or precedent allows to estimate what volumes of complaints would be escalated” [9]; the availability of ADR in all Member States “would however facilitate access to such mechanisms and likely append negligible costs compared to the current system.” The dispute settlement mechanism of Article 18 DSA is flawed in several ways, as will be laid out in more detail below:

  • Contrary to its intention, Article 18 DSA does not approximate divergent regulation but promotes fragmentation and creates legal uncertainty. The provision adds to a cacophony of different rules for redress mechanisms [10] that apply to the same service and creates a patchwork of overlapping regulation. [11]

  • The DSA does not harmonize regulation “on the merits”, but subjects online platforms to the laws of all 27 Member States. There is also no procedural approximation: [12] Because the provision supplies no standards or criteria for the complex factual and legal determinations and the balancing of rights in the area of online speech, the quality of the decision-making process will vary, as will the decisions; Article 18 DSA opens the field for a classic race to the bottom.

  • With its sweeping reference to Article 11 of the Charter, the Commission fails to recognize that the rights of the Charter are addressed to the institutions and bodies of the Union and do not apply directly to the horizontal relationship between private parties. Therefore, one cannot simply transpose the standard of free speech to which the Union and Member States authorities are bound to private online platforms. These platforms must have room to direct their network towards specific target groups, create respectful interactions within their platform community, or minimise liability risks with regard to possible illegal content. [13]

  • Out-of-court dispute settlement mechanisms usually aim at a consensual solution, placing emphasis on interests rather than on the legal positions of the parties or on the rights asserted. Free speech disputes, on the other hand, are strictly rights based; there is no room for give and take. Even in the field of commercial disputes, ADR and its online sibling ODR are far from a silver-bullet solution; free speech disputes do not lend themselves to settlement by private bodies at all. Free speech is a classic field reserved for the judiciary, subject to extensive and nuanced case law, and we should refrain from creating a parallel layer of ADR providers next to the competent court system. [14]

  • Article 18 DSA also operates against its own goals. By subjecting decisions by platform operators to remove or block content based on their community standards to the redress mechanism, the DSA frustrates the pre-existing and efficient “flagging” systems established by all major social networks. These systems are effective because they are easy and quick to use; the volume of content removed on this basis alone shows that the significant procedural requirements established by Article 18 DSA will likely render these systems unfeasible. Article 6 DSA intends to reward such “voluntary own-initiative investigations”, whereas Article 18 DSA does the opposite. [15]

3

To stake out the field, this paper will first take a look at the background of the Commission’s decision in favour of an out-of-court dispute settlement mechanism (2.). This will be followed by a description of the relevant provisions of the draft DSA, their scope of application as well as relevant carve-outs (3.), a discussion of the out-of-court redress mechanism (4.), and an outlook and proposal (5.).

2. The background: The influence of operators on the content available on their online platforms

4

There is a heated debate in Europe about the “censoring” of free speech by private entities. It is claimed that internet-based platforms have grown to become powerful intermediaries, organising, curating and “increasingly controlling” communications in the virtual world [16], and it is argued that it should not be left to private and profit-oriented businesses to decide what content is available on the Internet and what is not.

5

This debate is a revival of the sometimes sharply fought upload-filter discussion in the context of the EU Copyright Directive. [17] There is no doubt that freedom of expression constitutes one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual’s self-fulfilment. [18] However, the debate on the role of internet intermediaries in the process of public communication is often fuelled by political beliefs; at the same time, it sometimes takes too little account of the different actors in online communication and their roles, “responsibilities, powers and capabilities.” [19] It is often overlooked in these discussions that the multipolarity of different participants in a communication, with different and sometimes conflicting rights and interests, is the special feature of online communication over platforms such as Twitter, Facebook, Reddit, or YouTube. There is (i) the person making an online statement, i.e. the uploader of content, there may or may not be (ii) an infringed person, there are (iii) other recipients of the service, viewing the uploaded content [20], and there is (iv) the online platform, defined by Article 2(h) DSA as a provider of hosting services which, at the request of a recipient of the service, stores and disseminates information to the public. [21]

2.1. Online platforms are hosting services that have neither knowledge of nor control over the content on their platforms

6

Online platforms are hosting providers subject to the (conditional) liability exemption in Article 14 of the e-Commerce Directive if their “activity is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of nor control over the information which is transmitted or stored.” [22] A platform operator can no longer rely on this privilege, where it plays an ‘active role’ giving it ‘knowledge of, or control over’ the data which it stores at the request of its users. [23] While it is a truism that any service provider storing information provided by its users necessarily has control over that information as it has the technical capacity to remove or to disable access to it [24], “control” in the sense of an active role requires more, i.e. that the online platform, by the nature of its activity, acquires the intellectual control of that content by selecting the content or otherwise being involved in the content or its presentation. [25] In general, therefore, the online platforms which are at the core of the current discussion—Twitter, Facebook, Reddit, YouTube, etc.—do not “control” the content on their platforms within this meaning. They are intermediaries protected with their content-neutral activity under the liability privilege of Art. 14(1) of the e-Commerce Directive. [26]

7

In his Opinion in the joined cases C-682/18 and C-683/18, Advocate General Saugmandsgaard Øe explained that “the logic of ‘notice and take down’ underlying Article 14(1) seeks to strike a balance between the different interests at stake, and, in particular, to safeguard the freedom of expression of users.” [27] The notification is intended to give the operator of the service sufficient evidence to verify the illegal nature of the information, as a provider must remove such information only “where its illegal nature is ‘apparent’, that is to say manifest. That requirement seeks […] to avoid forcing a provider itself to come to decisions on legally complex questions and, in doing so, turn itself into a judge of online legality.” [28]

8

What is true in this copyright case, where the assessment of the infringing character of an uploaded file requires a number of contextual elements and a thorough legal analysis [29], is all the more true for free speech. The determination whether speech is unlawful requires the examination of a statement within its context and the balancing of conflicting fundamental rights, including possible effects and consequences of measures for parties playing an intermediary role on the Internet. [30] An intermediary is generally unable to make this determination. It has no knowledge of its own about the statement and its accuracy, it is in no position to verify whether the contested statement is truthful, and it therefore cannot make its own assessment of the material justification of a statement. [31]

2.2. European and national legislatures curtail the hosting provider privilege for online platforms and require operators to determine and remove illegal content

9

Notwithstanding this, both the European and national legislatures, as well as the courts, have increased their demands on online platforms to step up their measures against certain content uploaded by their users. [32] As a result, the (conditional) liability exemption for these neutral platform operators is crumbling. [33] The EU Copyright Directive reversed the previously widely held view [34] that the platform operator does not itself engage in any act of use under copyright law. [35] National legislative initiatives such as the Network Enforcement Act [36] in Germany and similar laws enacted in France and Austria [37] are aimed especially at hate speech and increasingly place responsibility for (illegal) content posted by users on the platform operator, with ever-growing information, review and removal as well as procedural and reporting obligations. [38]

10

Hate speech, online bullying, and defamation have become commonplace on the Internet [39], and false or misleading information has been surging, in particular with regard to the current pandemic and in organized attempts to influence democratic elections in the U.S. and elsewhere. There seems to be wide consensus that the tech companies operating the largest online platforms must live up to their responsibility in combating these phenomena. [40] On the other hand, legislative acts requiring platforms to determine and remove user-uploaded content as illegal have been heavily criticised, not only with regard to the hosting provider privilege [41], but especially due to possible “chilling effects” on the freedom of expression and information. [42]

11

It is true that such regulation unavoidably tips the balance in favour of the complainants and against online speech. From an economic perspective, it is reasonable behaviour by the platform operators to take such complaints at face value and remove content upon notice. This saves cost and reduces legal risks of litigation in which the platform operator—with no knowledge of its own—is at a structural disadvantage. These regulations therefore set an incentive for the provider to keep at least a “safe distance” in their decisions, leading to the removal of content which—while repulsive, indecent or otherwise offensive—may not violate the law. [43] And there is another incentive amplifying this risk of overblocking [44]: persons who are the subject of information on the Internet that—while legal—they consider detrimental [45] may (and will) exploit this structural disadvantage of the online platform by attacking speech with contrived or even false allegations. [46] However, this risk of overblocking, brought about by the legislator, will not be cured by now also outsourcing conflict resolution, after the evaluation and determination of criminal content on the Internet has already been transferred to the platform operators. [47]

2.3. The decisions of online platforms to remove illegal content and removals for community guidelines violations

12

As set out above, such regulation tips the balance in favour of the complainants and against online speech: taking complaints at face value and removing content upon notice is economically reasonable behaviour for the platform operator, which saves cost and reduces litigation risks, and it creates an incentive to keep a “safe distance” by removing content which—while repulsive, indecent or otherwise offensive—may not violate the law. [48] This risk of overblocking [49] is amplified by persons who consider information about them on the Internet—while legal—detrimental [50] and who exploit the platform’s structural disadvantage by attacking speech with contrived or even false allegations. [51]

13

In addition to these new statutory obligations for platform operators to block or remove criminal content on the platform, the DSA takes on the removal of content based on the platforms’ terms and conditions, i.e. on the contractual relationship between the uploader and the platform operator. In its Impact Assessment, the Commission points out that decisions by online platforms to remove content are “often not based on an assessment of the legality of the content, […] but they are solely governed by the discretionary powers of the platform according to the terms of services that are part of their contractual terms”. [52]

14

While the public debate sometimes focuses on striking examples of nonsensical blockings, such as the removal of copies of the famous Courbet painting “L’Origine du Monde” [53] or Facebook’s blocking of the passage “merciless Indian Savages” from the Declaration of Independence of the United States of America [54], more relevant for this discussion are court decisions obligating social networks to reinstate content they have blocked for violations of their ‘community standards’ or ‘community guidelines’, which form part of the contractual relationship between platform and user. “Facebook may not delete at will” headlines the German daily newspaper Süddeutsche Zeitung about a judgment by the court of appeals in Munich—not without a touch of Schadenfreude. It continues: “Facebook must respect freedom of expression and other fundamental rights in the same way as the state.” [55]

15

Must it? These “community guidelines” or “community standards” define what is and what is not allowed on the respective platform. In order to “enforce” these guidelines and to engage their communities of registered users in keeping certain content off the platform, online platforms, and particularly social networks, established so-called “flagging” mechanisms years ago. These mechanisms allow registered users to choose from defined categories of “guideline” violations in a drop-down menu and to report potentially incompatible content with one click and without the need for further explanation. Community guidelines violations cover a whole universe of decisions that range from spam and deceptive practices to graphic, violent, pornographic or abusive content and hate speech, etc. [56] Employees of the provider review such “flags” against the alleged community guideline violation and, where applicable, remove content. Most providers put specific trust in the notifications of so-called “trusted flaggers”, i.e. persons or organisations which have shown in their submissions that their judgment is trustworthy. [57] All major online platforms have complaint handling mechanisms; they inform uploaders whose content is removed about their decision and its basis [58] and allow these uploaders to appeal—where this is appropriate with regard to freedom of expression. These systems are balanced, swift and effective, and they are used extremely widely, proving the success of this tool. Online platforms are not dealing with a few hundred thousand flaggings but with numbers in the millions or even billions [59], and there are only a few appeals of these removal decisions. [60]

3. The out-of-court-dispute-settlement in Article 18 of the draft DSA

3.1. The complaint and redress mechanism of the DSA

16

Article 18 DSA provides for an out-of-court dispute settlement mechanism that “recipients of the service” (in the following also referred to as the uploader) may select to resolve disputes relating to decisions by an online platform:

  • to remove or disable access to information recipients have provided,

  • to suspend or terminate the provision of the service, in whole or in part to the recipients, or

  • to suspend or terminate the recipients’ account.

17

Article 18 must be read in the context of Articles 15 and 17 DSA, as the new procedural redress mechanism of the DSA is composed of several steps. [61] Article 15 DSA obliges the online platform to provide a “clear and specific statement of reasons” for its decision, including information on the redress possibilities available to the recipient. Pursuant to Article 17 DSA, online platforms are required to provide uploaders—“for a period of at least six months following the decision”—with access to an effective, cost-free internal complaint-handling system against the operator’s decisions. Where a complaint contains sufficient grounds to consider that the information is not illegal and not incompatible with the terms and conditions of the provider, the provider shall reverse its decision. Users shall be informed about the decision and the possibility of out-of-court dispute settlement and other available redress possibilities without undue delay.

18

Article 18 DSA entitles users to select any of the certified bodies for out-of-court settlement and requires online platforms to engage “in good faith” with these bodies. The provider “shall be bound by the decision taken by the body”, whereas the user’s right to redress against the platform’s decision before a court remains unaffected by his entitlement to an out-of-court settlement. Article 18(2) DSA establishes conditions and procedures for the certification of bodies for out-of-court settlement, which shall, inter alia, be impartial and independent and have the necessary expertise in relation to the issues arising in one or more particular areas of illegal content or in relation to the application of terms and conditions; the body shall be easily accessible through electronic communications technology, capable of settling the dispute “in a swift, efficient and cost-effective manner”, and operate “with clear and fair rules of procedure.” According to Article 18(3), the online platform shall reimburse the recipient for fees and expenses if the body decides in favour of the user, but the user shall not reimburse the online platform if the body decides in the platform’s favour.

3.2. The personal scope of the dispute settlement mechanism in Article 18

19

“Recipients of the service” may select a body for an out-of-court dispute settlement, which by definition includes any natural or legal person using the service, [62] thereby extending this right also to companies, associations, political parties, etc. While, for example, the use of online platforms by political parties may be of particular importance especially in pre-election phases, [63] such parties do not appear to be in need of additional safeguards to exercise their rights; in particular, it is not comprehensible why a sophisticated business or political party using the platform should not be required to reimburse the online platform in case the body decides in favour of the platform. [64]

3.3. The material scope of the dispute settlement mechanism in Article 18 DSA

20

The decisions subject to the out-of-court redress of Article 18 DSA can be based on either the illegality of the content or its incompatibility with the terms and conditions of the provider.

21

The DSA does not define what constitutes illegal content, but generally refers to “any information which – by itself or by its reference to an activity – is not in compliance with Union law or the law of a Member State” (Art. 2(g) DSA). This broad “horizontal” approach will require online platforms operating in all EU Member States to apply not only the requirements of EU law but also the standards of 27 national legal systems. Within the limits set by the country-of-origin principle of Article 3 of the e-Commerce Directive, this will present major challenges. This wide scope of application is also subject to significant “carve-outs” (see 3.3.1 below) and burdened with uncertainties and ambiguities (see 3.3.2 below). The redress is not only available for decisions by the platform on “illegal content,” but also for those based on an incompatibility with the terms and conditions of the provider (see 3.3.3 below).

3.3.1. The carve-outs for copyright infringements and video-sharing platform providers

22

Pursuant to Article 1(5)(c), the DSA is without prejudice to the rules laid down by Union law on copyright and related rights. While the DSA is somewhat ambivalent as to the scope of this carve-out, [65] recital 11 clarifies that Union law on copyright and related rights establishes “specific rules and procedures that should remain unaffected”. This confirms that at least the provisions on “an effective and expeditious complaint and redress mechanism” in Article 17(9) of Directive (EU) 2019/790 take precedence over the provisions in Articles 17 and 18 DSA. [66] The complaint and redress mechanism in Article 17(9) of Directive (EU) 2019/790 takes copyright infringements on a platform like YouTube out of the scope of application of the DSA’s out-of-court dispute settlement mechanism and—depending on the Member States’ laws enacted pursuant to Article 17(9) of Directive 2019/790—requires such platforms to establish different workflows and systems and to submit themselves—depending on the content in question—to different redress mechanisms. [67]

23

An even more significant carve-out follows from the recently amended Audiovisual Media Services Directive (AVMSD), [68] regarding which recital 9 of the DSA states that the DSA “should complement, yet not affect” its application. More specifically, the AVMSD shall be considered lex specialis in relation to the DSA. [69] Directive (EU) 2018/1808 amended the AVMSD by adding new provisions concerning so-called “video-sharing platform services”, i.e. platforms devoted to user-generated content for which the platform does not have editorial responsibility, [70] such as YouTube, DailyMotion, etc. Article 28b of that Directive provides for specific obligations for video-sharing platforms regarding certain “illegal content” as defined by the acquis communautaire: Member States have to ensure that video-sharing platform providers under their jurisdiction take appropriate measures to protect minors, to protect the general public against incitement to violence or hatred directed against a group of persons or a member of a group, and to protect the general public from content the dissemination of which constitutes certain criminal offences under Union law in the areas of terrorist activities, child sexual abuse materials, and racism and xenophobia. [71] Article 28b(3) establishes some general principles Member States have to abide by in determining the appropriate measures (e.g. the nature of the content and the harm it may cause, the persons to be protected, the legitimate interests at stake, as well as the general public interest), but also specific requirements, such as a transparent and user-friendly mechanism for users to report content, age verification systems, content rating systems for users, parental control systems, as well as a complaint handling mechanism in relation to all of these points. For the implementation of these measures, Member States shall encourage the use of co-regulation and ensure that out-of-court redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers. These provisions are detailed and comprehensive mandates for transposition by Member States into national law. The measures are aligned with the objectives of the directive (e.g. the protection of minors) as well as the specific content available on video-sharing platforms. Accordingly, these are not “issues that are not or not fully addressed” by Directive 2018/1808 or “which are left to the Member States” within the meaning of recital 9 of the DSA. Rather, the provisions in Article 28b of the AVMSD in some parts go beyond those in the DSA and therefore take precedence.

24

Regulation (EU) 2019/1150 (the P2B-Regulation) prescribes for “online intermediation services” its own internal complaint-handling mechanism (Article 11) and a mediation process that differs significantly from the ADR provisions in Article 18 DSA. Article 1(5)(g) DSA provides that the DSA is without prejudice to this regulation; in the Explanatory Memorandum, the European Commission again explains that in ensuring “appropriate transparency, fairness and effective redress possibilities, [Regulation (EU) 2019/1150] will apply as lex specialis.” [72] There is considerable overlap between the DSA and the P2B-Regulation: platforms like eBay or Amazon are online intermediation services and online platforms within the meaning of the DSA. YouTube is an online platform and—e.g. regarding films offered on the platform against payment—an online intermediation service. Moreover, as recital 13 specifically lists online marketplaces as an example of online platforms, the comments and ratings segments on those platforms may not qualify as a “purely ancillary feature” that would exempt them from the online platform provisions of the DSA.

25

Accordingly, copyright infringements are largely—at least for online content-sharing service providers—outside the scope of application of the DSA, and there is a strong argument that Articles 17 and 18 DSA do not apply to video-sharing platform providers concerning user-generated videos [73] (and audio-visual commercial communications) in relation to the content defined in Article 28b of Directive 2018/1808. With further carve-outs following from the P2B-Regulation, it is unclear what will remain for the redress mechanism in the DSA. More importantly, however, this patchwork quilt of overlapping regulation creates legal uncertainty for both providers and internet users.

3.3.2. Ambiguities with regard to the scope of application of the out-of-court settlement mechanism

26

The scope of application of Article 18 is further burdened with ambiguities. Unlike the AVMSD, the DSA does not specify what content it considers “illegal”, [74] but makes a horizontal reference to non-compliance with Union or Member State law. In addition to the corpus of Union law, this broad definition requires online platforms to comply with the legal requirements of all 27 Member States, which vary considerably, especially in the area of free speech. [75] The DSA does not give any guidance on the criteria to be applied when examining the alleged illegality of the content in question; it specifies neither the intensity with which the question of illegality must be assessed nor how to proceed within a spectrum of justifiable decisions, if at all. Not only do platform operators bear a considerable decision-making risk here, it also remains unclear how the out-of-court dispute settlement bodies will tackle this highly contextual and complex range of issues. With different legal systems and traditions in the different Member States with regard to speech issues, this does not resonate well with the legislator’s express aim that the “approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty”. [76]

27

Moreover, Article 18 allows the recipient of the service “to select any out-of-court dispute settlement body” certified in accordance with Article 18(2), with no restriction on the Member State in which such body is certified. There is also no restriction on whether the recipient may select multiple bodies in different Member States. It is an odd consequence of this provision that a recipient whose content has been removed or disabled in one Member State under the (defamation) laws of that state may turn to an out-of-court settlement body in another Member State. It appears outright absurd that this body may then decide—possibly with binding effect for the service provider—that removed content that violates the defamation laws of that Member State must be reinstated.

28

A further ambiguity is created with regard to the country-of-origin principle, which is a key principle of the e-Commerce Directive and is confirmed and extended in the AVMSD. [77] This principle is meant to prevent providers of intermediary services established in one Member State from having to comply with the rules of all Member States. Recital 33 of the DSA provides for an exception to this principle: it shall not apply “to orders to act against illegal content” by a Member State addressed to intermediaries not established within that Member State where such orders “relate to specific items of illegal content”. While Article 8 DSA lays down the welcome clarification of which conditions an “order to act against illegal content” by a relevant national judicial or administrative authority must fulfil, the DSA fails to explain the relationship between orders under Article 8 and its specific conditions and the “remnants” of the former provisions of the e-Commerce Directive which have found their way into the new intermediary privileges in Articles 3(3), 4(2), and 5(4) DSA. According to these paragraphs, the respective liability privilege “shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement”. In the case law in Germany, these paragraphs are the hinges upon which civil law claims to cease and desist are hung, [78] claims which under this case law need not fulfil the specific requirements of Article 8. Based on this interpretation, different out-of-court settlement bodies may apply the laws of different Member States to the same set of facts, which may lead to a race to the bottom towards the body providing the most beneficial outcome for the recipient of the service.

29

Since Article 8 DSA appears to be the more specific rule, it will have to be considered lex specialis to Articles 3(3), 4(2) and 5(4) DSA. However, in newly formulated legislative acts it would be desirable to avoid such ambiguities, which are already playing out in the legal debate in Germany, where some voices read the “order” in recital 33 as also extending to civil law claims to cease and desist. [79]

3.3.3. The extension to decisions based on the providers’ terms and conditions

30

It is no small matter that lies behind the words “incompatible with the terms and conditions of the provider” in Article 17(1) DSA, as this extends the obligations under Articles 15, 17, and 18 DSA not only to decisions on the illegality of the content, but also to decisions based on non-compliance with the platform operators’ terms and conditions. With this extension to decisions based on contractual relationships, the Commission takes account of the assumption that the freedom of opinion and information may be significantly influenced by online platforms, which should therefore not be free in their decisions to remove content from the platform or to block accounts. While recital 38 DSA acknowledges that “the freedom of contract of providers of intermediary services should in principle be respected”, it was “appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes”. [80] In its Impact Assessment, the Commission emphasizes that the removal of content “can have severe consequences on the rights and freedoms of their users”, in particular their freedom of expression. The report continues: “These decisions are often not based on an assessment of the legality of the content, nor are they accompanied by appropriate safeguards, including justifications for the removal or access to complaints mechanisms, but they are solely governed by the discretionary powers of the platform according to the terms of services that are part of their contractual terms.” [81]

31

Removal decisions based on violations of the terms and conditions of the provider concern a wide range of decisions, from spam and deceptive practices to graphic, violent, pornographic or abusive content and hate speech, etc. [82] Not only are the reasons for such removals manifold; the removal numbers also illustrate the diseconomies of scale associated with their inclusion in an out-of-court settlement mechanism: Facebook took action on more than 95 million pieces of content for hate speech and approximately 130 million for adult nudity and sexual activity [83], and in 2020 YouTube removed more than 200 million videos and 4.9 billion comments posted by its users for community guidelines violations. [84] An obligation on the platform operator to further “administer” this swift, intuitive and efficient process and to impose on it requirements to provide “clear and specific statements of reasons” (Article 15, with the detailed requirements in paragraph 2 lit. a) through f)), a formalized complaint-handling system with further reporting obligations (Article 17) and finally an out-of-court settlement process (Article 18) will likely render the entire system unfeasible. As a consequence, systems currently working extremely effectively will lose their power to efficiently prevent misuse of the service, and the DSA will do a disservice to its own goals.

4. The out-of-court redress in Article 18 DSA misses its objective, encroaches upon the fundamental rights of the online platforms and frustrates their effective own initiatives

32

With its proposal, the Commission intends “to ensure harmonised conditions for innovative cross-border services to develop in the Union”; the DSA was necessary “to ensure effective harmonisation across the Union and avoid legal fragmentation”. [85] Besides the requirements of the stated legal basis of the DSA (Article 114 TFEU) [86], the provisions in Article 18 DSA—viewed from the perspective of the platform operator—must also meet the conditions of Article 52 of the Charter. [87] According to that article, limitations on the exercise of the rights and freedoms recognised by the Charter must be provided for by law and respect the essence of those rights and freedoms; they are further subject to the principle of proportionality and may only be made if necessary to meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others. The relevant freedoms of the platform operators in this regard follow from the freedom to conduct a business in Article 16 and the right to an effective remedy and a fair trial in Article 47 of the Charter.

33

The out-of-court redress in Article 18 DSA misses its objective, as it promotes fragmentation rather than approximation (4.1.). It does not recognize the platform operators’ freedom of contract as an emanation of Article 16 of the Charter (4.2.). The proposed out-of-court redress violates the right to an effective remedy and to a fair trial in Article 47 of the Charter (4.3.), and it frustrates voluntary own-initiative investigations by the platform in violation of the principle of proportionality (4.4.).

4.1. Fragmentation instead of approximation

34

Already at Union level, the DSA does not approximate divergent regulation but rather promotes fragmentation and creates legal uncertainty with regard to its scope of application. [88] Different rules and procedures for video-sharing platforms, for online content-sharing platforms, for online intermediation services, for online platforms, and for different areas of law leave a scattered landscape for providers whose services may fall within more than one of these definitions. The same service may therefore be subject to different regulations with regard to alternative redress possibilities, increasing the demands on the operator, who will have to establish and allocate separate resources, systems and workflows for different redress mechanisms to meet their requirements.

35

“On the merits”, the DSA does not even attempt to harmonize regulation at Union level, [89] but subjects online platforms to the requirements of the laws of all 27 Member States, “irrespective of the precise subject matter or nature of that law”, with further uncertainties resulting from exceptions to the country-of-origin principle. [90] Nor do uniform procedural requirements achieve approximation, as the Commission fails to recognize the effects of its content moderation rules. The uploader whose content has been removed or account blocked may select any certified body for an out-of-court dispute settlement in any Member State. Furthermore, the DSA does not provide any guidance as to which criteria and which standard the redress body shall apply in examining the alleged illegality of the content in question. Especially in the area of free speech, the legal assessment of whether content is lawful is highly contextual and subject to complex factual and legal determinations and the balancing of conflicting fundamental rights. And this assessment and balancing become even more complex with the “addition” of an intermediary service to the equation. [91] Without standards and guidance for this assessment, the quality of the decision-making process and of the decisions will likely vary significantly from body to body and Member State to Member State, resulting in a patchwork “case law” of deviating decisions and a classic race to the bottom with all the wrong incentives; the uploader whose content is removed will turn to the certified body that is most likely to grant their claim. As a consequence, we will see bodies in certain Member States decide in a more uploader-friendly manner than those in others, and we will see bodies that differ in their findings on speech along different political, social and/or religious beliefs. The result of Article 18 DSA is more fragmentation, not less.

4.2. The parties’ freedom of contract is not sufficiently regarded in the proposed complaint and redress mechanism

36

Platform operators can rely on the freedom to conduct a business guaranteed in Article 16 of the Charter, which protects them, in principle, from obligations which may have a significant impact on their activity. [92] This paper will not discuss in this respect the significant measures required of the online platforms to adapt workflows, dedicate resources and invest in systems, but focuses on another concern with regard to the rights under Article 16 of the Charter, resulting from the extension of the DSA’s content moderation measures to removals based on violations of community guidelines.

37

The freedom of contract is an essential element of the protection granted by Article 16 of the Charter in the case law of the CJEU. The scope of protection of the freedom of contract of companies encompasses the free choice of the contractual partner and the freedom to determine and amend the content of the contract. [93] The Commission acknowledges the importance of this freedom in recital 38, but considers regulation appropriate for the avoidance of “unfair or arbitrary outcomes”. This is grounded in the Commission’s assumption that decisions by the platform operator on the basis of their terms and conditions may be arbitrary and opaque and consequently suppress lawful speech. This somewhat sweeping assumption requires a closer examination in law and fact.

4.2.1. The online platforms must have discretion in their decision to remove content

38

According to the Explanatory Memorandum, the DSA “seeks to foster responsible and diligent behaviour by providers of intermediary services to ensure a safe online environment, which allows Union citizens and other parties to freely exercise their fundamental rights, in particular the freedom of expression and information”. [94]

39

Somewhat neglected in the current debate on the perceived influence and consequential responsibility of online platforms with regard to the freedom of expression is that the fundamental rights of the Charter are addressed to the institutions and bodies of the Union. [95] Also, the case law of the ECtHR that is sometimes referred to in this discussion when pointing out the importance of online platforms for the exercise of free speech on the Internet concerns cases in which it was a judicial or administrative authority of a contracting state to the Convention that encroached upon a citizen’s rights under Article 10 ECHR. [96] In this relationship—i.e. citizen v. state—the fundamental rights and especially the freedom of expression apply directly and fully. However, the freedoms of the Convention, in general, do not apply horizontally between private persons. [97] Regarding the Charter it is equally doubtful whether the fundamental freedoms granted under its articles have effect in determining or resolving relationships between private parties. [98] An argument against binding private parties to the freedoms of the Charter is that this does not appear to be fundamentally necessary in order to realize the internal market. Moreover, propagating such a binding effect may disregard the deliberate decisions of private individuals as an expression of their autonomy. [99] With regard to social networks, the Federal Constitutional Court of Germany has held in a recent temporary restraining order decision that it has not yet been conclusively clarified, either in the case law of the civil courts or in the case law of the Federal Constitutional Court, whether and, if so, which legal requirements may arise in this respect for operators of “social networks on the Internet”; the constitutional legal relationships are still unresolved in this respect. [100] This contribution cannot dive into the details of this complicated and far-reaching legal issue. But for the purposes of this paper it may suffice to say that it is too short-sighted to simply apply the same standard of protection of the freedom of expression and information that applies vis-à-vis state authorities also to relationships between private parties in the private marketplace.

40

The boundaries of lawful free speech in its function as a defensive right against encroachments by legislative, administrative or judicial authorities are very wide. [101] Private companies cannot be held to this same standard. If this standard were to be read into the provisions of Articles 15, 17, and 18 DSA, those provisions would leave almost no leeway for providers to give their commercially operated network a certain orientation in order to gear it towards target groups, to create respectful interactions within their “community”, or to minimise liability risks due to the dissemination of possibly illegal content.

41

The use of online services by the individual user is foremost subject to contractual agreements laid down in the providers’ general terms and conditions and agreed to by the user during the registration process. In these terms and conditions, online platforms usually reserve the right to remove content that is in conflict with these agreed rules and guidelines. These guidelines may stipulate that the platform is dedicated to a specific subject matter or purpose and declare content outside this subject matter not permissible (e.g. a social network dedicated to a certain type of sport or pastime); they may formulate rules on how to behave and interact on the platform (“netiquette”) or stipulate that certain content is generally not permissible, even though such content may not be illegal (e.g. certain types of nudity). [102] The content and purpose of such “house rules” may be manifold and as such are protected by the service provider’s contractual freedom under Article 16 of the Charter. It is also within the scope of protection granted by Article 16 of the Charter that private companies take measures to protect themselves against legal risks, including the avoidance of litigation. It is therefore not only in the interest of online platforms, but also protected under Article 16 of the Charter, to reserve in these guidelines (or their interpretation) a corridor of discretion that keeps a “safe distance” from illegality. In carrying out this balancing of conflicting fundamental rights, the German Federal Court of Justice (BGH) in its recent Facebook decisions came to the conclusion that the social network is in principle entitled to require its users to comply with certain communication standards that go beyond the requirements of criminal law (e.g. insult, defamation or incitement to hatred). In particular, it may reserve the right to remove posts and block the user account in question in the event of a breach of the communication standards. [103]

42

Even if one were to assume that the fundamental rights of the Charter can also be effective in disputes between private parties by way of indirect (third-party) effect, [104] this would not lead to a “must-carry” obligation for the online platform. While under such a regime it could be argued that the service provider, especially where the platform is of a general nature and not limited to a specific subject matter or purpose, may not be entitled to reserve the right to decide arbitrarily on the removal of content, there must be room for service providers to remove, in accordance with their house rules, content that may not otherwise be illegal. In other words, a platform operator cannot be obliged to carry any and all content on the platform merely because such content stays within the limits of Article 11 of the Charter. While the fundamental rights of the uploader may influence his relationship with the online platform, such effect is indirect and must recognize and give effect to the fundamental rights of both parties. To this end, the service provider’s right under Article 16 of the Charter must be recognized not only to devise its house rules so that its users can use the platform free of hostility and disrespectful behaviour, but also to protect its interests by reducing the risk of exposure to legal enforcement or fines. In particular, as much of what is repulsive, indecent and distasteful under any consideration—such as racist and other hateful content in particular—may (still) be covered by the freedom of expression under Article 11 of the Charter, the service provider must be able—in balancing the various interests and rights—to keep such content off the platform, as long as the terms of use providing for such rights are transparent and not arbitrary.

4.2.2. No guidelines as to expertise or standards for the out-of-court settlement body’s decision

43

The DSA gives no guidelines at all on the standard the certified out-of-court settlement bodies shall apply in reaching their decisions. Article 18(2) DSA only requires in general terms that such a body has demonstrated “the necessary expertise in relation to the issues arising in one or more particular areas of illegal content”. It is not clear what “necessary expertise” means. Directive 2013/11/EU on ADR for consumer disputes required such necessary expertise, knowledge and skills, “as well as a general understanding of the law”. Can one draw from this the reverse conclusion that knowledge of the law (not even a general understanding) is not required under the DSA? The settlement shall take place in accordance “with clear and fair rules of procedure”, without specifying what this means in detail. This, in itself, is a violation of Article 52 of the Charter, which not only requires that any limitation on the exercise of a right protected by the Charter must be provided for by law, but also that the legal basis must be sufficiently clear and precise. [105]

44

The determination of whether the removal of content [106] uploaded by a third person violates this person’s freedom of expression, or more precisely, whether such a removal decision is “lawful” in the context of balancing fundamental rights or in the application of contractual terms between the parties, is fully rights-based. It is not a question of finding a consensual compromise in commercial relationships that helps both parties through a swift and reasonable resolution. [107] As explained above, this is an entirely normative decision, which includes at its core the balancing of conflicting fundamental rights. There is no room for give and take. Such a decision must naturally be reserved to judges or other legal professionals who possess a keen understanding of the law and operate on the basis of fundamental due process principles. [108] How would a private body assess and decide on these issues? What standard would it apply? How are the requirements of due process met? Who would be heard? And how would such a body investigate and establish the facts relevant to the content and its context, given that the assessment of whether certain content on the Internet is illegal is often highly contextual and therefore complex? How and on the basis of what standards would such a private organization apply the terms of service of the online platforms and decide on the issue of whether fundamental rights have effect on the contractual relationship between the online platform and its users? Neither does the DSA answer these pertinent questions nor are they discussed in the Explanatory Memorandum or the Impact Assessment. This is all the more surprising in light of the long catalogue of decisions by both the ECtHR and the CJEU on the balancing of the freedom of expression, which constitutes one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual’s self-fulfilment [109], against, in particular, the right to protection of reputation as protected by Article 8 of the Convention as part of the right to respect for private life. The ECtHR has repeatedly emphasized that, as a matter of principle, the rights guaranteed under Articles 8 and 10 deserve equal respect and has established a set of principles for this particular balancing. [110]

4.3. The out-of-court dispute settlement mechanism in Article 18 DSA is a violation of the right to an effective remedy and to a fair trial in Article 47 of the Charter

45

The out-of-court dispute settlement mechanism of Article 18 DSA encroaches upon the online platform’s rights from Article 47 of the Charter.

4.3.1. The out-of-court settlement body’s binding decision violates the online platform’s rights from Article 47 of the Charter

46

While the online platform “shall engage in good faith” with the out-of-court settlement body selected by the uploader, it “shall be bound by the decision taken by the body” (Article 18(1) DSA). Conversely, such a decision is without prejudice to the uploader’s right to seek redress against the decision before a court of law. As Article 18(1) DSA only mentions “the recipient” for such redress, online platforms may not have such a right. [111] Besides the ambiguity of the quoted language, a binding effect of the out-of-court settlement body’s decision on the online platform is an obvious violation of the platform’s rights from Article 47 of the Charter.

47

A fundamental principle of justice in dispute systems design is unconditional access to justice. [112] The Commission addressed this aspect of Article 47 of the Charter only with regard to the “recipient of the service”. Presumably, the Commission sees the uploader as being in a weak position vis-à-vis the online platform and therefore did not want to deprive him of any possibility to pursue his claims. The proposal fails to recognize, however, the serious effects a binding decision by the out-of-court settlement body would have on the fundamental rights of the online platform, as this would effectively establish a “must carry” obligation on the part of the platform, imposed on the platform by a private organization, and without any possibility of redress.

48

In accordance with these requirements of Article 47 of the Charter, the “predecessors” of the DSA have followed a different path in devising ADR and ODR mechanisms. Directive 2013/11/EU on alternative dispute resolution for consumer disputes spells out in recital 45 that “this Directive should not prevent parties from exercising their right of access to the judicial system.” Similarly, the AVMSD as amended by Directive (EU) 2018/1808 states in recital 50: “The right to an effective remedy and the right to a fair trial are fundamental rights laid down in Article 47 of the Charter. The provisions of Directive 2010/13/EU should not, therefore, be construed in a way that would prevent parties from exercising their right of access to the judicial system.” [113] It is not comprehensible why the Commission opts for its one-sided solution, especially in this field of law, which necessarily involves fundamental rights in all four corners of this particularly multipolar constellation of communication. [114]

4.3.2. The transfer of original tasks of the judiciary to private entities

49

Out-of-court dispute settlement mechanisms usually aim at a consensual dispute settlement, placing emphasis on interests rather than on the legal positions of the parties or on the rights asserted. [115] The disputes suited to such ADR/ODR systems are those concerned with the entitlement to material benefits rather than those concerned with fundamental rights. [116] Conversely, where a rights-based analysis involving fundamental rights is at the core of a dispute, it is the original task of the public courts under Article 47 of the Charter to provide and enforce solutions. Particularly in the legally sophisticated field of free speech, it does not appear possible to leave the resolution of disputes to non-legal private providers, which may not be trained or incentivized for this task and which operate outside the procedural safeguards of the court system. [117]

50

Moreover, the transfer of such normative decisions to private bodies gives rise to a host of further problems: (1) the out-of-court settlement bodies certified under Article 18(2) DSA need to be sufficiently funded to be able to operate; (2) the draft DSA does not provide for where such funding shall come from; (3) out-of-court settlement bodies have an incentive to attract as many settlement proceedings as possible; and (4) in pursuit of that goal, they have a consequent incentive to decide in favour of the applying uploader. Where conflicts are shifted to private service providers who have an incentive to follow the applicants’ interests, efficiency may be put above judicial scrutiny and the observance of due process standards. [118]

51

There are no specific arrangements for the oversight of such out-of-court settlement bodies, and the experience with this aspect under Directive 2013/11/EU does not seem very positive. [119] As a consequence, there are also doubts as to the quality of the decision-making bodies and the resulting quality of their decisions. [120] Moreover, as uploaders may select any certified out-of-court settlement body, there will likely be a diversity of quality standards across the EU. This situation is likely to create a risk of forum shopping among out-of-court settlement bodies, leading to a race to the bottom. [121]

52

As the decisions to be rendered by the out-of-court settlement bodies are by their nature normative and rights-based, it follows that the persons entrusted with such decisions must be qualified to administer these processes. Clearly, and especially in this highly complex field, this can only be carried out by trained lawyers who are familiar not only with the applicable Union and national laws but also with the body of relevant case law. If, as in the present context, the goal of the process is rights enforcement, only legal professionals are in a position to do justice to this goal. [122] This is not reflected in the DSA.

4.4. Lack of proportionality and contradiction with the good-Samaritan principle in Article 6

53

The inclusion of removal decisions based on violations of the providers’ terms and conditions does not meet the requirements of the principle of proportionality within the meaning of Article 52(1) of the Charter. The flagging systems established by all leading online platforms to enforce their community guidelines are swift and efficient, and they are widely used by registered users of the respective platforms. [123] The success of these systems is due to their simplicity, their easy accessibility and their fast decision-making based on formalized complaints and electronic means. Requiring online platforms, in addition to extensive reporting and an internal complaint-handling mechanism, to submit to an out-of-court dispute settlement procedure concerning only a small fraction of these billions of removal decisions [124] is obviously out of proportion.

54

Such a requirement may very well become its own source of disputes when abused by complaining uploaders. [125] This is not a theoretical issue, especially in the area of hate speech. Studies confirm the suspicion that, in the field of hate speech, a small number of originators are responsible for a very high proportion of hate speech content, and that these users often comment in a qualitatively different way from “normal” users. The study “Hass auf Knopfdruck” (“hate at the push of a button”) by the Institute for Strategic Dialogue (ISD) has mapped the rise and nature of far-right hate speech in Germany. It combines quantitative data analysis from Facebook comment sections with insights gained from ethnographic research in far-right chat groups. The study found: "Hate speech among media articles on the major German-language news sites on Facebook is produced, ‘pushed’ and distributed by a small group of accounts - measured by the number of all users. The distribution is often coordinated in terms of content and time." [126] Not only will these convinced perpetrators not be deterred from posting blocked content again; they will likely take any opportunity to confront and attack people with deviating opinions and thus instrumentalise content moderation procedures for their purposes. The out-of-court dispute settlement mechanism in Article 18 DSA provides such an opportunity without any cost risk and, combined with the incentive for out-of-court settlement bodies to decide in a “complaint-friendly” manner, creates a risk that content is re-uploaded to the platform even though its removal was well-founded because the reported content is illegal or violates the provider’s terms of service.

55

Article 6 DSA intends to reward voluntary own-initiative investigations (the so-called “good-Samaritan principle”). The flagging systems voluntarily established by the online platforms can be subsumed under this term. Making these systems subject to the extensive obligations in Articles 15, 17 and 18 DSA can cause an online platform to curtail these systems or shut them down entirely. The ECtHR has held regarding the imposition of liability on an internet portal for its third-party comments section: “Such liability may have foreseeable negative consequences on the comment environment of an Internet portal, for example by impelling it to close the commenting space altogether.” [127] Interfering with these functioning systems would thus run contrary to the principle established in Article 6 DSA.

5. Outlook

56

The DSA, in providing for content moderation and an out-of-court dispute settlement mechanism, intends to empower uploaders and to make available to them an easily accessible, swift, effective and cost-free redress against decisions by the online platform to remove their content. There is nothing wrong with this intention, but the Commission’s proposal rests on premises which do not, or at least not fully, hold true. Moreover, the implementation itself is defective.

57

It is true that the removal of users’ content by platforms “can have severe consequences on the rights and freedoms of their users” and that the platforms’ decisions are often not based on an assessment of the legality of the content but made “according to the terms of services that are part of their contractual terms”. Moreover, “in some cases, content can also be removed erroneously, even if it is not illegal, nor in violation of the terms of service”, whether as a result of erroneous reporting by other users, abusive notices, or the platforms’ own detection systems, not least when automated tools are used. Despite all these facts, opting for the redress mechanism as devised in Article 18 DSA is short-sighted and does not match the requirements and risks. [128]

58

Making the decision of the out-of-court settlement body binding upon the online platform is the most obvious mistake of the proposal. Similarly, it does not appear necessary to let legal persons such as businesses or political parties participate in the “privileges” granted by Article 18 DSA. More importantly, however, there is doubt as to the Commission’s premise that the protection of the uploaders’ fundamental rights requires enforcement as regulated in the DSA. Every major online platform has a functioning internal complaint-handling mechanism. As part of these systems, an uploader whose content is removed or access to it disabled, or whose account is suspended or terminated, is informed of such a decision by the online platform. All major platforms also provide for a possibility to appeal such a decision. A reasonable regulation of the requirements of such a system is certainly helpful to harmonize standards.

59

An out-of-court dispute settlement process in the form of Article 18, however, is not necessary and is even counter-productive to the goals of the DSA. There is already an appeal mechanism for any removal decision by, e.g., YouTube, which is used by only a fraction of uploaders, dispelling the suspicion of overblocking. The fact that these easily accessible, swift and cost-free appeal mechanisms are rarely used by uploaders whose content has been removed does not support the need for an out-of-court redress mechanism. In the same vein, there is only a limited number of court cases in Germany brought by uploaders demanding that their content be reinstated. In Germany, firstly, effective legal protection is available through the court system, e.g. in the form of interim injunctions at relatively low cost, [129] and the courts are the right forum to apply the correct legal standards for this assessment. Secondly, experience with this case law shows that the overwhelming number of plaintiffs demanding the reinstatement of their content are political activists from the far right of the political spectrum, deniers of the current pandemic or of certain aspects related to it, or business people operating in the twilight of grey markets. It goes without saying that all these people have a right to be heard with their appeal. But neither the number of cases nor the position of the plaintiffs appears to underscore a need for an additional out-of-court redress mechanism.

60

Establishing systems of out-of-court redress operated by private and competing organizations in all Member States would effectively create a quasi-judicial landscape of ADR providers alongside the courts, duplicating the existing court system. [130] The transaction costs of regulating these private providers to secure minimum standards will be significant and amount to an inefficient duplication of resources. [131] Moreover, the enforcement of these bodies’ decisions remains unclear, and both their impartiality and their oversight are questionable.

61

The biggest point of criticism, however, remains that the decision on the lawfulness of content is a normative decision reserved for the judiciary. The benefits of easy access, swiftness and decision-making via electronic means cannot outweigh the severe flaws of the envisaged redress mechanism. Instead, it would be much more prudent and efficient to modernize the existing judicial infrastructure in the Member States. This could be done by building on existing elements or by amending the existing court system to address (real or assumed) needs of private persons to effectively pursue their rights. The pandemic has brought experience with online court hearings that could be put to use for the cases in question. [132] Further tools could be the lowering of court and attorneys’ fees for such proceedings, [133] a more generous handling of legal aid, or the possibility for a recipient of the service to mandate certain not-for-profit bodies, organisations or associations to bring claims on their behalf. [134] In light of the possibility for abuse of any such redress mechanism, [135] however, there should also be some hurdles to pursuing one’s rights. [136] One could also think of a right of associations to sue, similar to the right granted under German law against illegal clauses in general terms and conditions. Such a right to sue would provide effective means of redress against arbitrary clauses in providers’ terms and conditions.

*by Jörg Wimmers, LL.M. (NYU), partner in the Hamburg office of international law firm Taylor Wessing



[1] Commission, ‘Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC’ COM(2020) 825 final.

[2] The term along with its critical undertone is borrowed from Horst Eidenmüller/Martin Engel, ‘Against False Settlement: Designing Efficient Consumer Rights Enforcement Systems in Europe’ [2014] 29:2 Ohio State Journal on Dispute Resolution 261.

[3] Directive 2013/11/EU of the European Parliament and of the Council of 21 May 2013 on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Directive on consumer ADR).

[4] Regulation (EU) No 524/2013 of the European Parliament and of the Council of 21 May 2013 on online dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Regulation on consumer ODR).

[5] Alexandre Biard, ‘Impact of Directive 2013/11/EU on Consumer ADR Quality: Evidence from France and the UK’ [2019] 42 Journal of Consumer Policy 109.

[6] Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services.

[7] Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

[8] Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities.

[9] Commission, ‘Commission Staff Working Document, Impact Assessment, Accompanying the document Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC’ SWD(2020) 348 final, Part 1/2, para 193.

[10] cf e.g. Article 17(9) of the EU Copyright Directive (n 7), Article 28b of the Audiovisual Media Services Directive (n 8), or Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services (n 6).

[11] See infra IV.1.

[12] See infra IV.1.

[13] See infra IV.2.a).

[14] See infra IV.3.b).

[15] See infra IV.4.

[16] Gerhard Wagner, ‘Haftung von Plattformen für Rechtsverletzungen (Teil 1)’ [2019] GRUR 329; Julia Reda, ‘Der Digital Services Act steht für einen Sinneswandel in Brüssel‘ (netzpolitik.org, 5 January 2021) <https://netzpolitik.org/2021/edit-policy-der-digital-services-act-steht-fuer-einen-sinneswandel-in-bruessel/> accessed 20 August 2021; cf. also recital 3 DSA.

[17] cf e.g. Markus Reuter, ‘Protests against Copyright Directive: All Cities, Dates and Numbers of Participants across Europe’ (netzpolitik.org 25 March 2019) <https://netzpolitik.org/2019/protests-against-copyright-directive-all-cities-dates-and-numbers-of-participants-across-europe/> accessed 20 August 2021; Michael Hanfeld, ‘Protest gegen EU-Urheberrecht: „Seid ihr Bots? Seid ihr ein Mob?“‘ Frankfurter Allgemeine Zeitung (Frankfurt, 23 March 2019) <https://www.faz.net/aktuell/feuilleton/debatten/proteste-gegen-und-eintreten-fuer-das-eu-urheberrecht-16104780.html> accessed 20 August 2021; Julia Reda, ‚Upload Filters‘ (juliareda.eu, no date) <https://juliareda.eu/eu-copyright-reform/censorship-machines/> accessed 20 August 2021; Julia Reda, Joschka Selinger, Michael Servatius, ‘Article 17 of the Directive on Copyright in the Digital Single Market: a Fundamental Rights Assessment’ (freiheitsrechte.org, 16 November 2020) <https://freiheitsrechte.org/home/wp-content/uploads/2020/11/GFF_Article17_Fundamental_Rights.pdf> accessed 20 August 2021.

[18] Handyside v The United Kingdom App no 5493/72 (ECtHR, 7 December 1976), para 49; Hasan Yazici v Turkey App no 40877/07 (ECtHR, 15 July 2014), para 48; see regarding this discussion now the instructive statements by Advocate General Saugmandsgaard-Øe in his Opinion in Case C-401/19 Republic of Poland v European Parliament, Council of the European Union (Opinion of AG Saugmandsgaard-Øe, 15 July 2021).

[19] This is – with the qualification of “special responsibilities” – the formulation used by the Court of Justice of the European Union (CJEU) to define the role of search engines: Case C-136/17 GC and Others v Commission nationale de l'informatique et des libertés (CNIL) (CJEU, 24 September 2019), para 49.

[20] Regarding their fundamental rights and necessary safeguards cf Case C-314/12 UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH, Munich (Germany), Wega Filmproduktionsgesellschaft mbH (CJEU, 27 March 2014).

[21] Wagner (n 16) 329.

[22] Recital 42 of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce').

[23] C-324/09 L'Oréal SA and Others v eBay International AG and Others (CJEU, 12 July 2011), para 113; Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (CJEU, 22 June 2021), para 106.

[24] Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 151.

[25] Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 152.

[26] CJEU, judgment dated 3 October 2019 - case C-18/18, para. 22 – Glawishnig-Piesczek/Facebook; Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 151.

[27] Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 186; Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (CJEU, 22 June 2021), para 113; see also C-401/19 Republic of Poland v European Parliament, Council of the European Union (Opinion of AG Saugmandsgaard-Øe, 15 July 2021), para 126 et seq, 132 et seq.

[28] Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 187.

[29] Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 188.

[30] Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v Hungary App no 22947/13 (ECtHR, 2 February 2016), para 68 et seq.

[31] BGH GRUR 2020, 1338, para 38; Wagner (n 16) 336.

[32] Case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited (CJEU, 3 October 2019), which did not see a violation of Article 15 of the e-Commerce Directive in a national court order requiring Facebook to block not only identical but also equivalent content.

[33] Wagner (n 16) 329; regarding search engines cf Case C-131/12 Costeja v Google Spain (CJEU,13 May 2014); C-136/17 – GC et al (CJEU, 24 September 2019); cf also Article 29 Data Protection Working Party, ‘Guidelines on the Implementation of the Court of Justice of the European Union Judgment on “Google Spain and Inc v. Agencia Espagnola de Proteccion de Datos (AEPD) and Mario Costeja Gonzales” C-131/12 [2014] 14/EN WP 225.

[34] cf Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 102

[35] According to Directive 2019/790 (n 7), art 17(9), Member States shall also provide that online content-sharing service providers put in place an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them; Member States shall also ensure that out-of-court redress mechanisms are available for the settlement of disputes.

[36] Netzwerkdurchsetzungsgesetz vom 1. September 2017 (BGBl. I S. 3352) (Network Enforcement Act), latest changes by Article 15 Nos 3 and 6 of the law dated 30 March 2021 (BGBl. I S. 448).

[37] Bundesgesetz über Maßnahmen zum Schutz der Nutzer auf Kommunikationsplattformen (Kommunikationsplattformen-Gesetz – KoPl-G); BGBl. I Nr. 151/2020 (Federal Act Regarding Measures for the Protection of Users on Communication Platforms); in France the so-called Loi Avia was adopted in May 2020, cf. https://www.assemblee-nationale.fr/dyn/15/textes/l15t0419_texte-adopte-provisoire.pdf ; in a decision dated 18 June 2020, the French Conseil Constitutionnel struck down some of the law’s provisions, cf https://www.conseil-constitutionnel.fr/decision/2020/2020801DC.htm .

[38] In its proposal for a Network Enforcement Act the government reasoned that there was “currently a massive change in social discourse on the net and in social networks in particular. The culture of debate on the net is often aggressive, hurtful and not infrequently hateful. […] Hate crime and other criminal content that cannot be effectively combated and prosecuted poses a great danger to the peaceful coexistence of a free, open and democratic society. Moreover, following the experience of the U.S. election campaign, combating criminal false news ("fake news") on social networks has also become a high priority in the Federal Republic of Germany. There is therefore a need to improve law enforcement on social networks in order to immediately remove objectively punishable content such as incitement of the people, insult, defamation or disturbing the public peace by pretending to have committed a crime”; Gesetzentwurf der Bundesregierung, Entwurf eines Gesetzes zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG), available at <https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/RegE_NetzDG.pdf;jsessionid=BCE7F452F946378EA96140490839B26D.2_cid324?__blob=publicationFile&v=2> accessed 20 August 2021.

[39] See the four day social media boycott by English professional football clubs following a series of high-profile online racist attacks: Mark Townsend, ‘Footballers and clubs to boycott social media in mass protest over racist abuse’ The Guardian (London, 24 April 2021) <https://www.theguardian.com/football/2021/apr/24/footballers-to-boycott-social-media-in-mass-protest-over-racist-abuse> accessed 20 August 2021.

[40] Wagner (n 16) 337 et seq, who sees a gatekeeper role of the online platforms as a justification for their liability.

[41] It was held that, in particular, the country-of-origin principle in Article 3 of Directive 2000/31/EC (the “e-Commerce Directive”) and the liability provisions for hosting providers in Article 14, 15 of that Directive were violated.

[42] In Germany, it was opined that the Network Enforcement Act caused platform operators to structurally decide to remove reported content and had an inherent "systemic tendency towards deletion"; Josef Drexl, ‘Bedrohung der Meinungsvielfalt durch Algorithmen’ [2017] ZUM 529; Thorsten Feldmann, ‘Zum Referentenentwurf eines NetzDG: Eine kritische Betrachtung’ [2017] K&R 292; Eike Michael Frenzel, ‘Aktuelles Gesetzgebungsvorhaben: Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (NetzDG)’ [2017] JuS 414; Hubertus Gersdorf, ‘Hate Speech in sozialen Netzwerken’ [2017] MMR 439; Nikolas Guggenberger, ‘Das Netzwerkdurchsetzungsgesetz in der Anwendung’ [2017] NJW 2577; Nikolas Guggenberger, ‘Das Netzwerkdurchsetzungsgesetz – schön gedacht, schlecht gemacht’ [2017] ZRP 98; Bernd Holznagel, ‘Das Compliance-System des Entwurfs des Netzwerkdurchsetzungsgesetzes’ [2017] ZUM 2017, 615; Fiete Kalscheuer/Christian Hornung, ‘Das Netzwerkdurchsetzungsgesetz – Ein verfassungswidriger Schnellschuss’ [2017] NVwZ 1721; Ralf Köbler, ‘Fake News, Hassbotschaft und Co. - ein zivilprozessualer Gegenvorschlag zum NetzDG’ [2017] AfP 282; Karl-Heinz Ladeur/Tobias Gostomzyk, ‘Das Netzwerkdurchsetzungsgesetz und die Logik der Meinungsfreiheit’ [2017] K&R 390; Marc Liesching, ‘Was sind »rechtswidrige Inhalte« im Sinne des Netzwerkdurchsetzungsgesetzes?’ [2017] ZUM 809; Holger Lutz/Sebastian Schwiddesen, ‘The New German Hate Speech Law – Introduction and Frequently Asked Questions’ [2017] CRi 103; Georg Nolte, ‘Hate-Speech, Fake-News, das »Netzwerkdurchsetzungsgesetz« und Vielfaltsicherung durch Suchmaschinen’ [2017] ZUM 552; Boris P. Paal/Moritz Hennemann, ‘Meinungsbildung im digitalen Zeitalter’ [2017] JZ 641; Gerald Spindler, ‘Das Netzwerkdurchsetzungsgesetz’ [2017] K&R 533; Gerald Spindler, ‘Rechtsdurchsetzung von Persönlichkeitsrechten’ [2018] GRUR 365; Jörg Wimmers/Britta Heymann, ‘Zum Referentenentwurf eines Netzwerkdurchsetzungsgesetzes (NetzDG) – eine kritische Stellungnahme’ [2017] AfP 93; Marc Liesching, ‘Die Durchsetzung von Verfassungs- und Europarecht gegen das NetzDG’ [2018] MMR 26; Marc Liesching, in Spindler/Schmitz, TMG, 2. Edition., 2018, § 1 NetzDG Rn. 6 ff., 13 ff., 21 ff.; Matthias Ringer/Dirk Wiedemann, ‘Beschwerdeverfahren bei Facebook wegen Markenverletzung - "Gefällt mir"?’ [2018] GRURPrax 203; Gunter Warg, ‘Meinungsfreiheit zwischen Zensur und Selbstzensur’ [2018] DÖV 473.

[43] BGH GRUR 2020, 1338, para 38; Daphne Keller, ‘Who Do You Sue? State and Platform Hybrid Power over Online Speech’ (Hoover Working Group on National Security, Technology, and Law, Aegis Series Paper No. 1902, 29 January 2019) <https://www.lawfareblog.com/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech> accessed 20 August 2021, 3; operators could even remove content upon notification without any (cost-intensive) examination on the basis of an economic risk assessment, since the economic consequences of an unauthorized deletion are marginal compared to high fines and/or the operational expense of the establishment of legally trained staff positions to assess the content; cf Liesching (n 43) 27; id. in Spindler/Schmitz, TMG, 2nd ed., 2018, § 1 NetzDG marginal no. 25.

[44] Gerhard Wagner, ‘Haftung von Plattformen für Rechtsverletzungen (Teil 2)‘ [2020] GRUR 447, 452.

[45] Examples are manifold: The manager of financial services companies who is going after blog-posts on his companies’ business practices; right wing activists and conspiracy theorists pursuing critical reports on their activities or opinions.

[46] cf Commission, ‘Communication from the Commission to the European Parliament, the Council. The European Economic and Social Committee of the Regions, Tackling Illegal Content Online, Towards an enhanced responsibility of online platforms’ COM(2017) 555 final, 18; Keller (n 44) 3.

[47] The author is critical of the approach taken by the German legislator with the Network Enforcement Act (NetzDG), which imposes on the platforms the determination of whether content violates certain offences of the German Criminal Code; instead he favors a clear strengthening and reinforcement of the prosecutor's offices and the police for the prosecution of uploaders of criminal content on the internet; see also the study “Hass auf Knopfdruck” quoted at n. 132, which indicates that it may be a small group of users posting the majority of hateful comments; see also Johanna Spiegel, Britta Heymann, ‘Ein Minenfeld für Anbieter sozialer Netzwerke – Zwischen NetzDG, Verfassungsrecht und Vertragsfreiheit’ [2020] K&R 344, 349.

[48] BGH GRUR 2020, 1338, para 38; Daphne Keller, ‘Who Do You Sue? State and Platform Hybrid Power over Online Speech’ (Hoover Working Group on National Security, Technology, and Law, Aegis Series Paper No. 1902, 29 January 2019) <https://www.lawfareblog.com/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech> accessed 20 August 2021, 3; operators could even remove content upon notification without any (cost-intensive) examination on the basis of an economic risk assessment, since the economic consequences of an unauthorized deletion are marginal compared to high fines and/or the operational expense of the establishment of legally trained staff positions to assess the content; cf Liesching (n 43) 27; id. in Spindler/Schmitz, TMG, 2nd ed., 2018, § 1 NetzDG marginal no. 25.

[49] Gerhard Wagner, ‘Haftung von Plattformen für Rechtsverletzungen (Teil 2)‘ [2020] GRUR 447, 452.

[50] Examples are manifold: The manager of financial services companies who is going after blog-posts on his companies’ business practices; right wing activists and conspiracy theorists pursuing critical reports on their activities or opinions.

[51] cf Commission, ‘Communication from the Commission to the European Parliament, the Council. The European Economic and Social Committee of the Regions, Tackling Illegal Content Online, Towards an enhanced responsibility of online platforms’ COM(2017) 555 final, 18; Keller (n 44) 3.

[52] Commission, Impact Assessment (n 9) SWD(2020) 348 final, Part 1/2, para 51.

[53] Philippe Sotto, ‘French Court Issues Mixed Ruling in Facebook Nudity Case’ U.S. News and World Report (March 15, 2018) <https://www.usnews.com/news/business/articles/2018-03-15/french-court-issues-mixed-ruling-in-facebook-nudity-case> accessed 20 August 2021.

[54] Keller (n 44) 1.

[55] ‘Facebook darf nicht nach Belieben löschen’ Sueddeutsche Zeitung (6 September 2018) <https://www.sueddeutsche.de/digital/facebook-beitraege-loeschen-1.4119997> accessed 20 August 2021; at the time of publication, the decision rendered by the Federal Court of Justice (BGH) on 29 July 2021 (III ZR 179/20 and III ZR 192/20) on Facebook’s appeal was not available as a full-text judgment. According to the press release by the BGH, the court ordered Facebook to reinstate the removed content.

[56] cf the community guidelines of YouTube: ‘Community Guidelines’ <https://www.youtube.com/intl/ALL_en/howyoutubeworks/policies/community-guidelines/> accessed 20 August 2021, the Twitter Rules at: ‘The Twitter Rules’ <https://help.twitter.com/en/rules-and-policies/twitter-rules> accessed 20 August 2021.

[57] The provision on Trusted Flaggers in Article 19 DSA follows a different concept, setting out regulatory stipulations for organisations, e.g. law enforcement.

[58] Note, however, that the Federal Court of Justice (BGH) in its decision dated 29 July 2021 (n 50) ordered Facebook to reinstate blocked content, because Facebook’s terms and conditions did not provide for such information to the uploader and were, therefore, invalid; cf press release: ‘Bundesgerichtshof zu Ansprüchen gegen die Anbieterin eines sozialen Netzwerks, die unter dem Vorwurf der "Hassrede" Beiträge gelöscht und Konten gesperrt hat’ (Karlsruhe, 29 July 2021) <https://www.bundesgerichtshof.de/SharedDocs/Pressemitteilungen/DE/2021/2021149.html> accessed 20 August 2021.

[59] For some more detailed figures on YouTube and Facebook see below at footnotes 81 and 82.

[60] See the figures e.g. for YouTube at: <https://transparencyreport.google.com/youtube-policy/appeals> accessed 20 August 2021.

[61] Martin Eifert/Axel Metzger/Heike Schweitzer/Gerhard Wagner ‘Taming the Giants: The DMA/DSA Package’ [2021] 58 CML Rev. 1.

[62] Article 2(g) DSA.

[64] See Article 18(3) DSA; another lopsidedness follows from the fact that while the uploader is granted these rights and safeguards, the person whose rights may be infringed is not. Why a person affected by, for example, an infringing (defamatory) statement should not have access to such proceedings for his part is not clear against the background of the "fundamental rights" perspective that the Commission adopts with its proposal. This is not to suggest a further extension of the proposed regulations, but rather another indication that the Commission’s proposal seems one-sided and half-baked; see also Eifert/Metzger/Schweitzer/Wagner (n 58) 25.

[65] While Art. 1(5) DSA carves out copyright law, Recital 12 mentions that the non-authorised use of copyright-protected material shall be covered by the broad concept of illegal content within the meaning of the DSA.

[66] A different view is taken by Gerald Spindler, ‘Der Vorschlag für ein neues Haftungsregime für Internetprovider – der EU-Digital Services Act Teil 2: Große und besonders große Plattformen’ [2021] GRUR 653, who suggests that the requirements of Article 18 DSA can be used to concretize the dispute resolution mechanisms, which are only sketched out by Article 17(7) et seq. of the Copyright Directive.

[67] It is the view of the European Commission that the “DSA is not an IPR enforcement tool” given its horizontal nature; therefore, the Commission considers that Article 17 of Directive 2019/790 remains “unaffected; i.e. DSA rules on limited liability, notice and action, redress and out of court mechanism [are] not applicable for [online content sharing services platforms].”; quoted after Joao Quintais/Sebastian Felix Schwemer, ‘The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?’ SSRN (May 7 2021) <https://privpapers.ssrn.com/sol3/papers.cfm?abstract_id=3841606> accessed 20 August 2021.

[68] Audiovisual Media Services Directive (n 8).

[69] Recital 9 continues: “However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level”; see also: Commission, ‘Explanatory Memorandum’ (n 1) COM(2020) 825 final, 4.

[70] cf Art. 1 lit. b) of the Audiovisual Media Services Directive (n 8).

[71] According to that provision Member States shall ensure that videosharing platform providers under their jurisdiction take appropriate measures to protect: (a) minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1); (b) the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter; (c) the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence as set out in Article 5 of Directive (EU) 2017/541, offences concerning child pornography as set out in Article 5(4) of Directive 2011/93/EU of the European Parliament and of the Council (*) and offences concerning racism and xenophobia as set out in Article 1 of Framework Decision 2008/913/JHA.

[72] Commission, ‘Explanatory Memorandum’ (n 1) COM(2020) 825 final, p 4.

[73] Directive (EU) 2018/1808 clearly spells out that measures shall be taken concerning “programmes, user-generated videos and audiovisual commercial communications”; the German Network Enforcement Act – in its latest amendment – carves out content that is not user-generated videos or broadcasts, i.e. video descriptions and comments, from the Directive’s application. This seems an odd differentiation, also given the fact that comments and descriptions usually do not exist independently of the content (e.g. a user-generated video) to which they refer.

[74] Article 28b of Directive 2018/1808 names the protection of minors from content which may impair their physical, mental or moral development in accordance with Article 6a(1), the protection of the general public from content containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter, or the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence as set out in Article 5 of Directive (EU) 2017/541, offences concerning child pornography as set out in Article 5(4) of Directive 2011/93/EU of the European Parliament and of the Council (*) and offences concerning racism and xenophobia as set out in Article 1 of Framework Decision 2008/913/JHA.

[75] See for instance the specific prohibitions in Germany on the denial of the Holocaust in § 130 StGB (German Criminal Code).

[76] cf Recital 4 DSA.

[77] Article 28a of Directive 2018/1808 extends the country-of-origin principle to video-sharing platform services “deemed to be established in a Member State”.

[78] BGH GRUR 2018, 1132 para 47; Case C-484/14 Tobias Mc Fadden v Sony Music Entertainment Germany GmbH (CJEU, 15 September 2016).

[79] Bernd Holznagel, ‘Chapter II des Vorschlags der EU-Kommission für einen Digital Services Act’ [2021] CR 123; Gerald Spindler, ‘Der Vorschlag für ein neues Haftungsregime für Internetprovider – der EU-Digital Services Act (Teil 1)’ [2021] GRUR 545.

[80] It remains unclear whether and how these provisions of the DSA can influence the contractual relationship between the platform and its users, which remains at the disposal of the parties. It is therefore likely that the obligations and regulatory instruments will rather be enforced by the competent authorities in accordance with the provisions of Chapter IV of the draft DSA. It is further unclear whether and on what basis violations of these obligations by the platform give rise to civil law claims by the users; expressing similar doubts, Spindler (n 63) 654, who assumes that enforcement will occur as public law enforcement.

[81] Commission, Impact Assessment (n 9) SWD(2020) 348 final, para 51.

[84] cf Google Transparency Report, YouTube Community Guidelines enforcement, available at <https://transparencyreport.google.com/youtube-policy/removals?hl=en>.

[85] Commission, ‘Explanatory Memorandum’ (n 1) COM(2020) 825 final, 3, 5.

[86] Critical of Article 114 TFEU as a suitable basis: Jörg Ukrow, ‘Die Vorschläge der EU-Kommission für einen Digital Services Act und einen Digital Markets Act’ Institute of European Media Law, 8 et seq <https://emr-sb.de/wp-content/uploads/2021/01/Impulse-aus-dem-EMR_DMA-und-DSA.pdf> accessed 20 August 2021.

[87] Case C-401/19 Republic of Poland v European Parliament, Council of the European Union (Opinion of AG Saugmandsgaard-Øe, 15 July 2021), para 88 et seq.

[88] See above III.3.a).

[89] In criticising the effects of subjecting service providers to the legal systems of all 27 Member States, Nettesheim suggests a harmonization: “The emergence of a common European area of fundamental rights, shaped by the ECHR and the European Charter of Fundamental Rights (CFR), makes it possible today to harmonize the basic principles of what must be permitted on (very large) online platforms and what can or should be prohibited under European Union law”; Martin Nettesheim, ‘Die unionsrechtliche Regulierung großer Internet-Plattformen: Die Kommissionsentwürfe für einen Digital Markets Act und einen Digital Services Act’ (Bundestagsdrucksache 19(21)136) <https://uni-tuebingen.de/fakultaeten/juristische-fakultaet/lehrstuehle-und-personen/lehrstuehle/lehrstuehle-oeffentliches-recht/nettesheim/> accessed 20 August 2021.

[90] See above III.3.b).

[91] See above II. and Magyar Tartalomszolgáltatók v Hungary App no. 22947/13 (ECtHR, 2 February 2016), para 68 et seq.

[92] cf Joined Cases C-682/18 and C-683/18 Frank Peterson v Google LLC, YouTube LLC, YouTube Inc., Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando (C-683/18) (Opinion of Advocate General Saugmandsgaard-Øe, 16 July 2020), para 240; see also Case C-360/10 SABAM v Netlog (CJEU, 16 February 2012), para 44 et seq; Case C-70/10 Scarlet Extended/SABAM (CJEU, 24 November 2011), para 46 et seq.

[93] cf Case C-240/97 Spain v Commission (CJEU, 5 October 1999), para 99; case C-426/11 Mark Alemo-Herron (CJEU, 18 July 2013), para 32; Case C-283/11 Sky Austria (CJEU, 22.1.2013), para 42 et seq; Case C-277/16 Polkomtel (CJEU, 20 December 2017), para 50.

[94] Commission, ‘Explanatory Memorandum’ (n 1) COM(2020) 825 final, p 6.

[95] Article 51 of the Charter; the Member States are bound when they are implementing Union law.

[96] Cengiz et al. v Turkey App nos. 48226/10 and 14027/11 (ECtHR, 1 December 2015), para 52; Times Newspapers Ltd. v United Kingdom App nos. 3002/03 and 23676/02 (ECtHR, 10 March 2009), para 27; Delfi AS v Estonia App no. 64569/09 (ECtHR, 16 June 2015), para 110.

[97] See also Eifert/Metzger/Schweitzer/Wagner (n 58) 27.

[98] There are some decisions by the ECtHR and the CJEU which can be interpreted as applying the freedoms of the Charter or the Convention respectively also between private parties. However, these cases concerned the effect of Article 10 of the Convention in the workplace and the relationship between employee and employer, where the state had a positive obligation to protect the right to freedom of expression even in the sphere of relations between individuals; cf. e.g. Heinisch v Federal Republic of Germany App no 28274/08 (ECtHR, 21 July 2011) with further references. In the debate in Germany, where the doctrine of “mittelbarer Drittwirkung” [indirect third party effect] is well-established by the Federal Constitutional Court, such indirect effect is sometimes also afforded to the freedoms of the Charter; cf Jarass, Charta der Grundrechte, Art. 51 para 30 et seq; Schwerdtfeger in Meyer/Hölscheidt, Charta der Grundrechte der Europäischen Union, Art. 51 para 57 et seq.

[99] cf Forsthoff in Grabitz, Hilf, Nettesheim, Das Recht der Europäischen Union, Art. 45 AEUV para. 165.

[100] BVerfG NJW 2019, 1935; available at <https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/DE/2019/05/qk20190522_1bvq004219.html> with an abstract in English.

[101] For German constitutional law, this follows instructively from a decision by the Federal Constitutional Court (BVerfG NJW 2010, 47, para 54): “The possible confrontation with disquieting opinions, even if in their conceptual consequence they are dangerous, and even if they aim at a fundamental transformation of the valid order, is part of the state based on freedom. Protection against an impairment of the “general feeling of peace” or the “poisoning of the intellectual atmosphere” constitute no more reason for an encroachment than does the protection of the population against an insult to their sense of right and wrong by totalitarian ideologies or an evidently false interpretation of history. Neither does the goal of establishing human rights in the legal awareness of the population permit the suppression of contrary views. Instead, the constitution trusts that society can cope with criticism, and even polemics, in this regard, and that they will be countered in a spirit of civil commitment, and that finally citizens will exercise their freedom by refusing to follow such views. By contrast, the recognition of public peace as a limit of what is acceptable as against unacceptable ideas solely because of the opinion as such would disable the principle of freedom, which itself is guaranteed in Article 5.1 of the Basic Law.” Decision available in English at <https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/2009/11/rs20091104_1bvr215008en.html> accessed 20 August 2021.

[102] cf the community guidelines of YouTube: ‘YouTube Guidelines’ <https://www.youtube.com/intl/ALL_en/howyoutubeworks/policies/community-guidelines/> accessed 20 August 2021, the Twitter Rules: ‘Twitter Rules’ <https://help.twitter.com/en/rules-and-policies/twitter-rules> accessed 20 August 2021.

[103] This is the wording used in the press release by the BGH concerning these judgments; to be confirmed once the full-text judgment is available; cf <https://www.bundesgerichtshof.de/SharedDocs/Pressemitteilungen/DE/2021/2021149.html>. Facebook was nevertheless ordered to reinstate the removed content in these decisions, as its terms and conditions did not obligate Facebook to inform the uploader at least afterwards that his content was removed, and beforehand in case of account suspensions.

[104] cf the principles established by the Federal Constitutional Court of Germany on the so-called Drittwirkung, BVerfG GRUR 2020, 35, para 76; with references to the established case law; available in English at <https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/2019/11/rs20191106_1bvr001613en.html> accessed 20 August 2021.

[105] cf Case C-419/14 WebMindLicenses Kft. (CJEU, 17 December 2015), para 81.

[106] Or the decision to suspend or terminate in whole or in part the provision of the service to the recipient, or to suspend or terminate the recipient’s account respectively.

[107] Already critical of out-of-court settlement by private bodies in the area of consumer rights enforcement: Eidenmüller/Engel (n 2) 261.

[108] Regarding consumer rights cf. Eidenmüller/Engel (n 2) 288 et seq.

[109] Magyar Tartalomszolgáltatók v Hungary App no. 22947/13 (ECtHR, 2 February 2016), para 54; Delfi AS v Estonia App no 64569/09 (ECtHR, 10 October 2013), para 78; each with further references.

[110] Magyar Tartalomszolgáltatók v Hungary App no. 22947/13 (ECtHR, 2 February 2016), para 60 et seq.

[111] Daniel Holznagel, ‘The Digital Services Act wants you to “sue” Facebook over content decisions in private de facto courts’ (verfassungsblog.de, 24 June 2021) < https://verfassungsblog.de/dsa-art-18/> accessed 20 August 2021.

[112] Eidenmüller/Engel (n 2) 282.

[113] See also the almost identical language in recital 26 of Regulation (EU) No 524/2013 on online dispute resolution for consumer disputes.

[114] See above II.

[115] Eidenmüller/Engel (n 2) 273.

[116] cf Pablo Cortes, ‘Online Dispute Resolution for Consumers in the European Union’ (Routledge 2011), 3 et seq.

[117] cf Eidenmüller/Engel (n 2) 283.

[118] Eidenmüller/Engel (n 2) 263.

[119] Biard (n 5) 113; Eidenmüller/Engel (n 2) 289.

[120] Holznagel (n 108).

[121] Biard (n 5) 113.

[122] Eidenmüller/Engel (n 2) 263.

[123] See above II.

[124] On the numbers see above at n 81 and 82.

[125] Ethan Katsh/Orna Rabinovich-Einy, ‘Digital Justice – Technology and the Internet of Disputes’ [2017] 117.

[126] Institute for Strategic Dialogue (ISD) “Hass auf Knopfdruck - Rechtsextreme Trollfabriken und das Ökosystem koordinierter Hasskampagnen im Netz” (2018); available at <https://www.isdglobal.org/isd-publications/hass-auf-knopfdruck/>.

[127] Magyar Tartalomszolgáltatók v Hungary App no. 22947/13 (ECtHR, 2 February 2016), para 86.

[128] cf Commission, ‘Impact Assessment’ (n 9), para 51.

[129] Court costs as well as attorneys’ fees are calculated on the basis of the value of the matter in dispute, which is usually set at EUR 5,000.00; cf. also Holznagel (n 108).

[130] Holznagel (n 108) speaks of de facto courts and questions the Union’s respective competence.

[131] Eidenmüller/Engel (n 2) 296.

[132] § 128a ZPO (German Code of Civil Procedure) allows for oral hearings to take place by means of video and audio transmission, which has been put into effect during the current pandemic.

[133] The German copyright act in § 97a provides such cost-reduction in certain proceedings.

[134] Such representation is already provided for in Article 68 DSA.

[135] See above IV.4.

[136] It must also be recognized that online platforms cannot be equated with public broadcasting or with “essential facilities” monopolizing communication on the internet and functioning as a bottleneck to speech. Without going into any detail on this issue, online platforms – as already the plural indicates – do not monopolize speech on the internet. If someone cannot post his speech on Facebook, he can do so on Reddit or Twitter, etc. He could also post the content on his own website, and search engines will provide at least some accessibility.
