
Prior filtering obligations after Case C-401/19: balancing the content moderation triangle

A comparative analysis of the legal implications of Case C-401/19 for filtering obligations ex ante and the freedom of expression in Europe

  1. Willemijn Kornelius

Abstract

On 26 April 2022, the CJEU finally delivered its judgment (Case C-401/19 Poland v. Parliament and Council) on the compatibility of Article 17 DSM-directive with the freedom of expression (Article 11 Charter). Article 17 DSM-directive introduces an obligation for online content-sharing platforms to proactively prevent uploads of copyright infringing material. This de facto requires them to resort to automatic filtering technologies with a potential for over-blocking. The CJEU concluded that such prior filtering restricts an important means of disseminating online content and therefore constitutes a limitation of Article 11 of the EU Charter. The CJEU nevertheless upheld Article 17, finding a justification for this limitation. Several scholars have suggested that the CJEU’s conclusions have implications outside the copyright realm for obligations on platforms to detect illegal content. Although this linkage is suggested, it has not yet been examined exhaustively. This article aims to answer the question of what legal implications Case C-401/19 has for the regime of de facto obligations on online content-sharing platforms under EU law to act against illegal content ex ante more generally. It distils other de facto obligations on online content-sharing platforms to carry out a prior review of content. These obligations are all governed by the prohibition of general monitoring obligations (e.g. Article 8 DSA). The CJEU treats this prohibition as a safeguard of the freedom of expression. Consequently, online content-sharing platforms should only block content that is clearly illegal. This article shows that the fundamental importance of the freedom of expression and information of internet users needs to play a key role in designing obligations to act against illegal content both inside and outside the area of copyright law.

Keywords

1. Introduction*

1

People increasingly share and access information and works (“content”) through services available on the internet, such as online content-sharing platforms. [1] Online content-sharing platforms, like Facebook and Twitter, provide their users with the possibility to express themselves with a potentially global reach and to participate in ongoing discussions. The European Court of Human Rights (“ECtHR”) reiterated in Vladimir Kharitonov v. Russia that the internet has become one of the most important means for individuals to exercise their right to freedom of expression and information. [2]

2

However, illegal content is an immediate concern of the European legislature. “Illegal content” comprises information or works uploaded by internet users (related to an activity that is) not in compliance with the law. [3] Examples include the spreading of information concerning child sexual abuse, large-scale copyright infringements or incitements to violence. [4] Initially, the European legislature considered it disproportionate to hold online platforms liable for illegal activities of their users (such as copyright infringement or hate speech). [5] The European E-Commerce Directive 2000/31/EC (“E-Commerce Directive”) contains liability exemptions for online intermediaries (“safe harbours”). In short, an online platform is not liable if it does not have actual knowledge of the illegal content or, upon obtaining such knowledge, acts expeditiously to remove it. As the digital environment changed, so did this policy focus. [6] The wish to control and tackle online harm and illegal content has resulted in stronger regulation and stronger liability rules for online platforms. [7] These online platforms are considered to be best placed to bring infringing activities to an end. [8]

3

In the area of copyright law, the liability of intermediaries, such as online content-sharing platforms, has evolved quite progressively. The CJEU construed this type of liability in its case law. [9] In light of the new challenges posed by increasing digitalisation, the EU adopted the Directive on Copyright in the Digital Single Market (“DSM-directive”) in 2019. [10] The road to the adoption of this directive was not an easy one. Especially controversial was the introduction in Article 17 of a liability framework with a “staydown”-obligation for online content-sharing service providers (“OCSSPs”, defined in Article 2(6) DSM-directive). [11] It requires these OCSSPs to proactively take measures ex ante. Ex ante here means: “before content is uploaded”. [12]

4

The problem with obligations to ensure “staydown” of illegal content ex ante lies in the filtering technologies platforms put in place. These technologies tend to block information excessively: “over-blocking”. [13] Over-blocking means that not only illegal, but also legal content is blocked. Removal of legal content would constitute an infringement of users’ freedom of expression. [14] This fear led Poland to bring an action for annulment against (parts of) Article 17 DSM-directive. On 26 April 2022, the CJEU finally published its ruling (Poland v. Parliament and Council (“C-401/19”)). The CJEU acknowledges that the use of automatic filtering technologies is inevitable. [15] This constitutes a de facto obligation to carry out a prior review of content users wish to upload. In this sense, and in the rest of this article, a de facto obligation is understood as an obligation that is not prescribed in that form by the law itself, but for which there is in practice no alternative way to comply with the underlying legal obligation. Despite this observation, the CJEU chose to uphold Article 17 DSM-directive. [16]

5

Obligations on online content-sharing platforms to proactively act ex ante against illegal content, requiring the implementation of automatic filtering technologies, will continue to exist under the wings of the Digital Services Act (“DSA”). [17] Moreover, additional sector-specific regulations of different types of content strengthen the responsibilities of online content-sharing platforms to act. C-401/19 is embedded in the context of the prohibition of a general monitoring obligation (Article 17(8) DSM-directive and Article 15 E-Commerce Directive), which also features in the DSA (Article 8). Several scholars have suggested that the CJEU’s conclusions have implications outside the copyright realm for obligations in the European Union for platforms to detect illegal content. [18]

6

Although this linkage is suggested, it has not been examined exhaustively. Scholars focus on the impact of the DSA on copyright content moderation in the EU. [19] They describe the interplay between the DSA and Article 17 and their lex specialis-lex generalis relationship. [20] In this article, I therefore aim to answer the following question: “What are the legal implications of the CJEU’s ruling in C-401/19 on Article 17 DSM-directive for the regime of de facto obligations on online content-sharing platforms under EU law to act against illegal content ex ante more generally?”

7

I focus on the EU legal framework that applies to online content-sharing platforms whose users post illegal content. These platforms are information society service providers whose aim is to store and give the public access to a large amount of works or information uploaded by their users (“content”), which they organise and promote for profit-making purposes. [21] This definition is partly derived from Article 2(6) DSM-directive, but broadens the concept because it does not solely focus on copyright-protected content. It also aligns with the definition of an “online platform” as laid down in Article 3(i) DSA. The definition in the DSA, however, is broader and does not refer to “profit-making”. [22]

8

Before a legal analysis is made of the different regulatory frameworks at play, I give a theoretical background of the context of content moderation in which these frameworks have effect (Section 2). In Section 3, I outline the existing EU framework of platform liability to distil what other obligations exist. I describe the case of platform liability under EU copyright law in more detail to determine the place of Article 17 of the DSM-directive within EU law. I use the applicable EU legislation, relevant case law and academic literature. I supplement this with (legal-)empirical studies on automated content moderation tools to achieve an understanding of the risks of over-blocking for the freedom of expression in practice (Section 4). I continue with an analysis of Case C-401/19 (Section 5). Finally, I compare the de facto obligations to act against illegal content ex ante within the EU with Article 17 of the DSM-directive to determine what Case C-401/19 implies for these obligations (Section 6).

2. Content moderation of illegal content: a triangle relationship

9

In this section, I describe the context in which the obligations under consideration have effect. This type of regulation is meant to mitigate the effects of illegal content (Section 2.1). I demonstrate that obliging online content-sharing platforms to act against illegal content is a form of regulation of platforms leading to regulation by platforms (Section 2.2). We can witness a content moderation triangle relationship between platform, user(s) and affected parties emerging in this area (Section 2.3).

2.1. Illegal content

10

This article covers obligations resulting from EU legislative initiatives to tackle illegal content online. The EU legislature perceives illegal content to be content that comprises information or works (related to an activity that is) not in compliance with the law (Article 3(h) DSA). [23] The “law” could be EU law or the law of a Member State. Illegal content may include information relating to terrorism, pictures or videos of child sexual abuse, illegal hate speech, but also posts that infringe copyright. [24]

11

Illegal content has negative consequences which should be addressed. Content containing copyright protected material infringes the rights of the copyright holder. A post that incites hatred against someone or a group of people negatively affects these persons, but also negatively impacts the public more generally. [25] The dissemination of radical terrorist content endangers general security. [26] The Commission spells out that the presence of illegal content has serious negative consequences “for users, affected citizens and companies and for society at large”. [27] Illegal content interferes with the interests the laws attempt to safeguard. Dealing with the illegal activities underlying illegal content, that is, enforcing the law and protecting the interests at stake, is traditionally seen as a public state responsibility. [28] Online content-sharing platforms are taking over parts of this role.

2.2. Platform governance and regulation “of” and “by” online platforms

12

The wish to tackle illegal content results in regulation by platforms following regulation of platforms. As described in Section 1, the European legislature targets online platforms to act against the illegal content uploaded on their platforms, for example through regulations concerning their liability for this content. [29] In this article, I understand “liability of online platforms” to refer to obligations imposed on these platforms to act against illegal content of their users. [30] It encompasses liability for damages but also, and importantly in this context, legal obligations to act, such as injunctions or court orders. [31] To comply with their responsibilities and in order to remain immune from liability, online platforms are required to take action once they obtain knowledge of illegal activities (“notice-and-action”) or comply with orders to delete and prevent uploads of illegal content. [32] This type of regulation can be described as regulation of platforms. [33]

13

Online content-sharing platforms increasingly act to remove or disable access to illegal content. The platform thereby governs and orders the activity of its users: “platform governance”. [34] Platforms set rules (terms and conditions, behavioural guidelines) and install ‘notice-and-takedown’-systems and ex ante content filtering systems. [35] The platform assesses whether uploaded information is indeed illegal or infringing and decides to act or not. [36] To effectuate all this, platforms need to employ technical measures, such as algorithmic content detection. [37] As underlined in academic research and literature, the management of users’ activity that emerges through these technologies is a form of regulation by platforms. [38]

2.3. Content moderation through technologies: removing illegal content

14

The regulation by platforms that follows from EU regulation of platforms consists of activities these platforms undertake to detect, remove or disable access to illegal content. In this article, I address these activities as “content moderation”. I thereby align with the only definition thereof in EU law, found in Article 3(t) DSA. [39] The DSA refers to both automated and non-automated activities. In light of the question central to this article, I focus on automated activities.

15

Husovec describes the content moderation practices as “delegated enforcement” because the action taken against illegal practices is left to private actors, in this case, online content service providers (“OCSPs”). [40] Here, we can see a triangle relationship. As online platforms are given the responsibility to protect public or private interests protected by laws, they embark on activities to remove or disable access to illegal content and interfere with the rights of the users that uploaded this content. [41] Most notably, this concerns the users’ right to freedom of expression including the freedom to receive and impart information and ideas in an open democratic society (“freedom of expression”) protected by Article 11 of the Charter. [42] In the following figure I display this relationship:

 

Figure 1: Content moderation triangle where OCSP is the online content-sharing platform

16

For content moderation in the area of copyright law, this relationship can be specified as follows:

Figure 2: Copyright content moderation triangle where OCSSP is the online content-sharing service provider

17

I focus on content moderation by platforms that follows from regulation of platforms and its compatibility with the users’ freedom of expression. In general, this right primarily binds public institutions. [43] The Charter is addressed to the ‘institutions and agencies of the EU and Member States when they implement EU law’. [44] I therefore focus on the importance of the assessment of the CJEU in C-401/19 for obligations on online content-sharing platforms stemming from public institutions. The figures above show that the freedom of expression affects the triangle relationship resulting from these obligations. Content moderation following from voluntary actions by online content-sharing platforms is left outside the scope of this article. [45]

3. European legal framework for platform liability in case of illegal content

18

The obligations of online content-sharing platforms to engage in content moderation in the European Union are governed by a complex puzzle of overlapping pieces of legislation and case law. [46] In this section, I provide an overview of this legal framework. I identify de facto obligations for online content-sharing platforms to act ex ante against illegal content. The general rule of the EU platform liability framework is the neutrality of the platform: as long as a platform does not have knowledge of an illegal activity on its platform, it does not have to act (Section 3.1). In its Communication on Online Platforms and the Digital Single Market, the Commission chose to uphold the existing liability regime. [47] However, the Commission acknowledged that specific issues for certain types of illegal content had been identified that demand a ‘sectorial, problem-driven approach’. [48] Article 17 DSM-directive embodies a specific approach to copyright infringing material online. [49] The EU legislature thereby created an exception to the neutrality-rule for copyright (Section 3.2). The DSA still upholds the idea of neutrality, but the legislature creates obligations for proactive action against illegal content in specific situations outside copyright. Sections 3.3 and 3.4 discuss these obligations and how they relate to the previously discussed rules.

3.1. Neutrality of the platform: E-Commerce Directive

19

In 2000, the European legislature introduced the E-Commerce Directive. The European Commission wanted to clarify the legal position of online intermediaries when their users partake in illegal activities. [50] The E-Commerce Directive does not create a ground for liability. [51] The Commission merely focused on situations in which online intermediaries should not be held liable, in order to respect users’ freedom of expression and to stimulate the development of an innovative digital single market. [52] The E-Commerce Directive restricts the scope of obligations that (national) authorities can impose on online content-sharing platforms to act against illegal content. It provides liability exemptions (Articles 12–14) and prohibits general monitoring obligations (Article 15), as discussed below. [53]

3.1.1. Article 14: liability exemption for hosting services

20

The exemptions of Articles 12 to 14 E-Commerce Directive are based on the idea that the service providers are not the providers of information themselves but are merely occupied with the transmission or storage of that information. Put briefly, they lack control over and knowledge of the content of that information. [54] Online content-sharing platforms function as intermediaries that store information of their users so that other users can access it. It is commonly accepted in legal practice and academic literature that these activities fall under “hosting” (Article 14 E-Commerce Directive). [55] Following Article 14, the online content-sharing platform is not liable when 1) it does not have actual knowledge of the illegal activity or information and is not aware of any facts from which this illegal activity is apparent, or 2) it acts “expeditiously” to remove or disable access to the information after obtaining such knowledge or awareness. [56]

21

The CJEU has stressed that the hosting activities of a platform must be neutral, i.e., merely technical and automatic. [57] The E-Commerce Directive does not dictate how online content-sharing platforms could obtain their knowledge of illegal activity. [58] Member States have worked this out in different ways. [59] A few Member States have specific rules on ‘notice-and-action’-systems. [60] The legal contours of these systems are strongly debated and are a topic in case law of the CJEU as well. [61] This case law is discussed in Section 3.1.3.

3.1.2. Article 15: general monitoring obligations prohibited

22

Although Article 14 E-Commerce Directive provides no guidelines on what obligations can be imposed on online platforms, Article 15 explicates that these should not amount to a “general monitoring obligation”. The directive does not give a clear definition of this notion. Recital 48 suggests that ‘duties of care’ can be imposed on hosting platforms aimed at detecting and preventing illegal activities. Recital 47 indicates that such a general monitoring obligation differs from “monitoring obligations in a specific case”. What is supposed to be the difference between “general” and “specific” has been disputed. [62] A topic of debate has been whether Article 15 prohibits preventive measures aimed at future illegal activities or “re-uploads”. [63] Such measures could result in an obligation for online platforms to monitor all content.

3.1.3. Case law of the CJEU

23

In successive rulings, the CJEU attempted to outline what obligations to moderate content are allowed under Articles 14 and 15. In L’Oréal v. eBay the CJEU explained that Article 15 entails that an online platform cannot be required to actively monitor “all the data of each of its customers”. [64] However, the CJEU allows an order to take specific measures to terminate an infringement and prevent future infringements, as long as it is effective and proportionate. [65] Such an order could be the suspension of the individual offender. [66]

24

In further case law, Scarlet Extended v. SABAM and SABAM v. Netlog, the CJEU specified that measures aimed at preventing future infringements of intellectual property law are allowed, but must comply with Articles 12 to 15 E-Commerce Directive. [67] In both cases, the measure at issue was an injunction against a hosting service provider requiring it to install a filtering system. [68] The CJEU held that this filtering system would require the platform to preventively monitor all electronic communications in order to assess which content is infringing and must thus be blocked. [69] This requires active observation of all information and all users, which the CJEU found to be prohibited by Article 15. [70]

25

Additionally, the CJEU held that this filtering obligation had to be assessed in light of the protection of fundamental rights of the persons affected by the filtering (i.e., users). [71] According to the CJEU, a filtering system might not be able to adequately distinguish between unlawful and lawful content. The potential blocking of lawful content undermines the freedom of information of the platform’s users. [72]

26

McFadden v. Sony concerned a mere conduit service provider (Article 12) and is relevant in light of the accessibility of lawful content. [73] The CJEU reiterated that Article 15(1) E-Commerce Directive prohibits a measure requiring “monitoring of all information transmitted”. [74] The measure at issue, requiring an intermediary to prevent users from making copyright-infringing material available, must be strictly targeted without “thereby affecting the possibility of internet users lawfully accessing information using the provider’s services”, because that would infringe the users’ freedom of information. [75]

27

These rulings together raised questions about the permissibility of other filtering obligations. The filtering obligations in the SABAM-cases were broad in time (unlimited in duration) and applied to all content and all users. [76] Could it be that measures are only “general” if they require platforms to proactively seek out all potentially illegal content, but that specific notifications or court orders leading to monitoring of all content are excluded from the scope of Article 15? [77] Based on the wording used by the CJEU in the case law up to McFadden, the answer would likely have been ‘no’. [78] The CJEU excluded measures requiring “monitoring of all information”. [79] However, the Glawischnig-Piesczek ruling in 2019 complicated this course. [80]

3.1.4. Glawischnig-Piesczek

28

Glawischnig-Piesczek concerned a dispute between Eva Glawischnig-Piesczek, an Austrian politician for the Greens, and Facebook. A Facebook user published an article about the Greens on their personal page, which generated a thumbnail with the title of that article, complemented by a picture of Glawischnig-Piesczek. The user posted a comment in connection with the article. An Austrian court found this comment to be defamatory of Glawischnig-Piesczek. [81] The question at issue was whether an injunction ordering an online platform to remove “identical and equivalent” information to the information previously declared to be illegal was compatible with Article 15 E-Commerce Directive.

29

The CJEU does not mention its previous case law cited above at all. It states that Article 15 does not concern monitoring obligations in a specific case. [82] Such a specific case could be a certain piece of information examined and assessed by a court and found to be illegal. [83] The CJEU further considered that, because of the “genuine risk that information which was held to be illegal is subsequently reproduced and shared by another user”, it is legitimate that the online platform is ordered to block access to information that is identical or equivalent to the content of the illegal information. [84] This requires the platform to monitor all the content uploaded to the platform. [85]

30

However, as the CJEU continues, Article 15 E-Commerce Directive means that an order to monitor must not be an “excessive obligation” on the online platform and that the different interests at stake should be balanced. [86] The required monitoring should be limited to information containing the elements specified in the order, and identifying the identical or equivalent nature of that content must not require the platform to “carry out an independent assessment”. The CJEU thereby takes into consideration that the platform had “automated search tools and technologies” at its disposal. [87] This reasoning caused a lot of debate in academic literature. To date, the debate remains unsettled. [88]

31

Angelopoulos and Senftleben argue that the CJEU herewith turns Article 15 into a reasonableness test, rather than a hard prohibition. [89] It seems to permit an obligation leading online platforms to use filtering technologies to detect and remove illegal content amongst all the content of their users, as long as online platforms are not required to independently assess whether content is illegal and filtering is limited to predetermined information. [90] In this case there was a national court that determined the comment to be illegal. This leaves open the question of whether notices from private entities could lead to an obligation to remove identical and equivalent content. [91]

3.2. Stronger obligations to act: the case of copyright

32

Liability of online content-sharing platforms for copyright infringements is a sector-specific lex specialis to the lex generalis-system of the E-Commerce Directive as discussed in Section 3.1. [92] It is governed by an interplay of the Copyright Directive, E-Commerce Directive and the DSM-directive. [93] In this area, the EU legislature has envisaged strong obligations to act against infringing content, sometimes requiring ex ante actions.

33

The Copyright Directive harmonises the rules of copyright law to a great extent. [94] Platform liability for copyright infringements by users is therefore more harmonised at EU level compared to other types of illegal content and has taken shape in two ways. There is direct liability (Article 3 Copyright Directive and, since 7 June 2021, Article 17 DSM-directive) and indirect liability as an intermediary (Article 8(3) Copyright Directive). As a result, the platform might be obliged to act against the infringements and to engage in content moderation. The CJEU has played a significant role in determining the contours of the liability of online content-sharing platforms when their users upload copyright infringing material. [95]

3.2.1. Direct liability: Article 3 Copyright Directive

34

In Article 3, the Copyright Directive grants copyrightholders the exclusive right to “authorise or prohibit any communication to the public of their works”. This exclusive right encompasses communication at a distance. [96] This concept is broadly interpreted to cover the multiple ways offered by new technologies to disseminate works. [97] A user that uploads copyright protected material to an online content-sharing platform, so that others can access or even download it, communicates it to the public. [98] If the user does not have permission from the rightholder and there is no exception that protects them, they infringe copyright. This makes the upload illegal. [99]

35

But what is the legal position of the online content-sharing platform when this happens? In some situations, the CJEU has ruled, they could be directly liable as “communicators to the public”. The most recent case in this context is YouTube and Cyando, in which the CJEU ruled specifically on content-sharing platforms. [100] Previous cases Stichting Brein and GS Media already dealt with the question of an act of communication by a platform. [101] On the basis of these cases, first of all, it has to be established whether the online content-sharing platform played an indispensable role in facilitating the act of communication. [102] In addition, the intervention by the platform should be deliberate. [103] It should intervene in full knowledge of the consequences of doing so, with the aim of giving the public access to protected works. [104] In paragraph 84 of the judgment, the CJEU lists factors that have to be taken into account. [105] These factors imply that as long as the content-sharing platform has a filtering technology that detects infringements, it does not make a communication to the public itself. [106] The question remains whether the platform can nevertheless be ordered to act against copyright infringements of its users (indirect liability).

3.2.2. Indirect liability: liability for infringements by others

36

If an online content-sharing platform is not a direct infringer on the basis of Article 3 Copyright Directive, the rightholder can still apply for an injunction against an intermediary to end the infringement (Article 8(3)). [107] It is a lex specialis of Article 14(3) E-Commerce Directive. [108] The specific details of this injunction must be determined at a national level, but the E-Commerce Directive determines its outer contours. [109] The scope of injunctions is limited by both Article 15 E-Commerce Directive and the freedom of expression. [110] In YouTube & Cyando, the CJEU states that an injunction can also be imposed on a platform falling under the exemption of Article 14. [111] Then it repeats its pre-Glawischnig-Piesczek case law. [112] As found in Scarlet Extended and SABAM v. Netlog, Article 15 means that an online platform cannot be required to have a filtering mechanism entailing general and permanent monitoring to prevent future infringements. [113] National authorities should strike a fair balance with the freedom of expression and information of internet users. [114]

37

Nevertheless, as the CJEU continues in paragraphs 140 to 142 of YouTube & Cyando, the platform could be required to expeditiously remove or block access to content and to take appropriate measures to prevent further infringements, when a rightholder notifies the platform of an infringement. After YouTube and Cyando, it is disputed what the CJEU perceives to be “general monitoring” by online platforms. [115] How far could the measures to prevent further infringements go?

3.2.3. Direct liability: Article 17 DSM-directive

38

Since 7 June 2021, the Article 17-framework has further complicated the landscape of liability of online content-sharing platforms for copyright infringements. Departing from the safe harbour for hosting platforms, Article 17 introduces obligations on online content-sharing platforms to proactively prevent uploads of illegal content upon receipt of the necessary information from copyrightholders. It is a lex specialis of Article 14 E-Commerce Directive and Article 3 Copyright Directive. [116] Article 17 DSM-directive is based on the wish to strengthen the position of copyrightholders on the internet. [117] Online content-sharing platforms profit from copyright infringing content through targeted advertising, while copyrightholders are barely remunerated for this exploitation of their works: the so-called value gap. [118]

39

The starting point of Article 17 is that online content-sharing service providers (“OCSSPs”, defined in Article 2(6) DSM-directive) “perform an act of communication to the public when [they give] the public access to copyright-protected works (…) uploaded by its users”. [119] Article 17(3) determines that such an OCSSP does not fall under the liability exemption of Article 14(1) E-Commerce Directive. The online content-sharing platforms under consideration in this article should be seen as largely falling under the definition of Article 2(6). [120] Consequently, OCSSPs should obtain authorisation from rightholders for these uploads, for example through licensing agreements (Article 17(1)). If this authorisation is not obtained by the OCSSP, the liability framework of Article 17(4) applies. This provision contains three “best-efforts-obligations”: the OCSSP should a) make best efforts to obtain an authorisation; b) make best efforts to ensure that notified copyright protected works are unavailable (‘notice-and-takedown’); and c) make best efforts to prevent future uploads of this protected content (‘notice-and-staydown’). The obligation under c) is an ex ante legal obligation to prevent illegal content. If the OCSSP does not meet these obligations, it is liable for the copyright infringements.

40

These “best-efforts”-obligations caused a lot of controversy. [121] To comply with these obligations, online content-sharing platforms inevitably need to install automatic filtering technologies. [122] Due to the impreciseness of these technologies (Section 4), they may also block legal content. [123] However, Article 17(7) requires that compliance with the “best-efforts”-obligations of Article 17(4) does not lead to the unavailability of legal (non-infringing) content. [124] Therefore, scholars and other commentators perceive Article 17 to contain conflicting obligations. [125]

41

Article 17(8) furthermore reiterates the prohibition on the imposition of a general monitoring obligation on online content-sharing platforms. In addition, Article 17(9) contains a set of ex post procedural safeguards that online content-sharing platforms should have in place for users whose content is removed or blocked. Recital 70 clarifies that the operation of this liability framework, and thus the content moderation that follows from it, should be in line with the freedom of expression. [126]

3.3. Obligations to act outside copyright: new EU initiatives on the horizon

42

Since the E-Commerce Directive entered into force 20 years ago, the wish to control the spread of illegal content online has led to several regulatory initiatives of the EU legislature. [127] This resulted in a rather fragmented landscape of obligations for platforms to act against illegal content. [128] There are rules for specific types of content, such as child sexual abuse and terrorist content. [129] There are rules for specific online platforms: video-sharing platforms. [130] And then there are soft law initiatives: a Communication and a Recommendation of the Commission trying to set general principles for the fight against illegal content online, focusing on all platforms and all types of content. [131]

43

In this section, I analyse which obligations on online content-sharing platforms to prevent ex ante that certain illegal content appears online can be observed, resulting in de facto obligations to carry out a prior review of content. I describe the DSA and the Terrorist Regulation, since these give concrete substance to what is required of online platforms.

3.3.1. Digital Services Act: notice-and-action

44

On 19 October 2022 the Digital Services Act was officially adopted by the EU legislature. [132] This is a horizontal regulation that applies to all digital services and contains general rules. The Commission proposed this regulation to update and harmonise the currently applicable rules on responsibilities of digital services. [133] It largely upholds the general liability framework applicable to online platforms, as provided for by the E-Commerce Directive, but additionally introduces due diligence obligations for the digital service providers. These due diligence obligations are called “asymmetric” obligations, because their application depends on the type of service provider. [134]

45

While the DSA maintains this “core” of the liability framework, it is quite likely to revitalise the regime to some extent. [135] The DSA is a regulation and thus directly applicable in the Member States, without a need for transposition. It defines certain concepts relevant for the moderation of illegal content by online content-sharing platforms. For example, it defines “illegal content”, as discussed in Section 2.1. Furthermore, it introduces four categories of digital services, relevant for the application of the due diligence obligations. [136] Barata et al. describe these as “Russian dolls”, because the first category comprises all the other categories and, with every step, the categories become more specific. [137]

Figure 3: The relation between the different digital services

46

As I display in Figure 3, online content-sharing platforms fall under the provisions that concern intermediary services, hosting services, online platforms and in some cases very large online platforms (“VLOPs”). [138]

47

Article 6 DSA contains a liability exemption for hosting services, with wording identical to Article 14 E-Commerce Directive. The European legislature chose to harmonise part of the ‘notice-and-action’-mechanisms through an obligation in Article 16 DSA, applicable to hosting providers, including online platforms. Put briefly, online content-sharing platforms are required to have a mechanism that allows their users to report ‘illegal content’. [139] Removal or disabling of access should be undertaken “in the observance of the principle of freedom of expression”. [140]

48

Article 8 repeats the prohibition of a general monitoring obligation for intermediary services as established under Article 15 E-Commerce Directive. It is likely that the case law discussed in Section 3.1.3 remains applicable, as the Commission expressed in the Explanatory Memorandum that the liability rules and key principles of the E-Commerce Directive are upheld and remain valid. [141] The Commission elaborates on the scope of the prohibition in recital 30 DSA. The prohibition does not “concern monitoring obligations in a specific case”. Article 9 sets basic conditions for orders of national authorities to online platforms to act against illegal content. Following recital 31, it seems that the European legislature wants to establish a certain limit to ‘excessive monitoring obligations’ in line with the CJEU’s case law in Glawischnig-Piesczek. [142]

49

All in all, the DSA does its best to define the responsibilities of online platforms. In light of the further discussion of requirements to act against illegal content to avoid liability, it is interesting that the prohibition of a general monitoring obligation is upheld, and that removal of illegal content has to be done in line with the freedom of expression of users. The threat of liability and national court orders will likely motivate the online content-sharing platforms to set up automatic filtering systems to deal with the notices and removal of content. [143] Moreover, alongside the DSA, sector-specific approaches (such as Article 17 DSM-directive) exist that go further than the neutral approach of the DSA (in continuation of the E-Commerce Directive).

3.3.2. Regulation of terrorist content

50

Without aiming to be exhaustive, I address one sectoral approach to illegal content: Regulation 2021/784 on addressing the dissemination of terrorist content online (“Terrorist Regulation”). [144] It requires cooperation of online content-sharing platforms to combat the spread of terrorist content on their services. [145] Another example would be the proposed regulation for the detection and removal of online child sexual abuse to complement the DSA. [146]

51

Article 5(2) of the Terrorist Regulation contains the obligation to take specific measures when the platform is exposed to terrorist content. These specific measures are proactive measures, to be taken before content is uploaded (ex ante), such as the identification and preventive removal of terrorist content. [147] Such an obligation exists when the competent national authority informs the platform about the content (Article 5(4)). It de facto creates an obligation to ex ante monitor content. [148] The regulation further shapes this obligation: the measures should be applied in a way that respects users’ freedom of expression and information (Article 5(3)(c)) and should not lead to a general monitoring obligation or an obligation to use automated tools. Article 5(8) upholds Article 15 E-Commerce Directive. Automated tools to give effect to the obligation are allowed (Recital 25). When automated tools are used, appropriate safeguards should be provided “to avoid the removal of material that is not terrorist content”. [149] The Terrorist Regulation provides an example of how online content-sharing platforms can be de facto required by national authorities to priorly review content to detect illegal content outside copyright. It also shows how this obligation should not affect legal content, should not be a general monitoring obligation and should be effectuated in a way that respects the freedom of expression. [150]

3.4. De facto obligations to carry out a prior review of content

52

The aforementioned legal framework allows, in specific situations, that obligations are imposed on platforms to prevent uploads of predefined illegal content. The Article 17 DSM-framework requires platforms not only to take down notified content, but also to ensure that notified content that allegedly infringes copyright is not re-uploaded (“staydown”). Such an obligation does not directly follow from the general framework of the E-Commerce Directive and the DSA. Since the DSA finetunes the liability framework of the E-Commerce Directive, it is fair to assume that the Article 17-regime is a lex specialis to the DSA-regime. [151]

53

The DSA introduces an obligation for online content-sharing platforms to have notice-and-action-mechanisms. A notification that content is to be regarded as illegal leads to “actual knowledge” on the part of the platform, requiring it to act. What makes Article 17 special is the staydown-obligation of Article 17(4)(c). [152] But this is not to say that obligations to detect illegal content before it is uploaded (ex ante) cannot exist for other types of illegal content as well. [153] Such obligations are allowed by CJEU case law (Glawischnig-Piesczek) and in line with recital 25 DSA. Furthermore, such obligations follow from EU sector-specific regulation, such as the Terrorist Regulation (Section 3.3.2).

54

The discussed legislation and case law thus allow obligations to be imposed on platforms to prevent that identical or equivalent information is re-uploaded, as long as these obligations do not entail a “general monitoring obligation”. To achieve the prevention of such uploads, online platforms are de facto required to ex ante examine uploaded content. I further refer to these obligations as “de facto obligations to carry out a prior review of content to detect specific illegal content”. [154] The CJEU has made clear that obligations, in whatever form, requiring online platforms to monitor content should 1) not amount to general monitoring obligations, and 2) additionally be governed by the necessity to balance the freedom of expression.

55

Although the Article 17-framework has its own provision using slightly different words, Article 17(8) DSM-directive, Article 15 E-Commerce Directive and Article 8 DSA all contain a prohibition on “general monitoring obligations” for the service providers. [155] It can therefore be assumed that the ban on general monitoring obligations will have the same scope and meaning under the DSA and the Article 17-framework as it did under the E-Commerce Directive. [156] The question is what distinguishes specific monitoring from general monitoring and whether, in line with Glawischnig-Piesczek, it can be regarded as a ‘reasonableness test’, meaning that orders to block content identical or equivalent to previously determined illegal content are allowed, as long as they do not require an independent assessment by the online content-sharing platforms.

56

I display the schematic relation between the different obligations in Figure 4. In the next section, I explain why these obligations require platforms to use automatic filtering technologies. In the legal provisions and CJEU case law discussed above, it is repeatedly found that, in order to balance the freedom of expression and information of users, lawful content should be left ‘untouched’. The next section explains how the use of automatic filtering technologies makes exactly that requirement hard to meet, creating an imbalance with the freedom of expression and information.

Figure 4: the relations between different obligations to act against illegal content, governed by the prohibition of a general monitoring obligation

4. Automatic filtering technologies and the risk of over-blocking

57

From the discussion of the EU legal framework in Section 3 we can conclude that the platform liability framework should be enforced in observance of the prohibition of a general monitoring obligation and the freedom of expression as laid down in Article 11 of the Charter. [157] In this section, I distil one of the problems of de facto obligations to carry out a prior review of content for the safeguarding of this fundamental right. I explain that online content-sharing platforms resort to automated filtering technologies to comply with their responsibilities. [158] These technologies make it possible to search for infringing content (“content recognition”) in order to manually or automatically block that content (“filtering”). [159] The central question of this section is how the use of these technologies (potentially) interferes with users’ freedom of expression and information through the risk of over-blocking.

4.1. Use of automatic filtering technologies is inevitable

58

Under the obligations at issue, online content-sharing platforms have to engage in ex ante content moderation of illegal content to ensure they can be exempted from liability. [160] Consequently, while the exact scope of what types of ‘specific’ monitoring obligations are admissible is disputed, the legal framework leads to the use of automated technologies. When online content-sharing platforms are de facto obliged to prevent uploads of specific content, it is commonly accepted that they need to resort to automatic filtering technologies that block notified content that is indeed found to be illegal. [161] Their use is also stimulated by the Commission. [162] Humans will not be able to cope with the stream of notices following from the notice-and-action-mechanisms in place. [163] Nor will they be able to search through all the uploads to identify “identical or equivalent” content in the case of an order to actively block identical or equivalent notified content (Glawischnig-Piesczek). [164]

59

Content-sharing platforms deploy different technologies to tackle illegal content. [165] Substantial empirical research describes these technologies and how they work. [166] Broadly, these works distinguish “matching technologies” and “predicting technologies”. [167] Matching technologies typically aim at identifying whether uploaded content matches content that was already found or notified as illegal. [168] Matching technologies are particularly useful for a platform that needs to detect predefined illegal content. Predicting technologies aim at classifying (or predicting) the content as falling into one of the categories of illegal content. [169] The scope of this article does not allow an exhaustive discussion of the technical details of the different technologies used.

60

What the technologies have in common is that they aim for “content recognition”, for which they rely on algorithms. [170] An example of a matching technology is the Content ID-technology of YouTube, used to find copyright infringing content. [171] Very much simplified, it works as follows. The technology used is fingerprinting. [172] Copyright holders can add files with details about their copyright protected works to a reference database. [173] This file is given a fingerprint, produced by an algorithm, of a particular characteristic of the content, such as the frequency values of a song. [174] Other uploaded content is submitted to the same algorithm. When the fingerprints match, the content is detected as “infringing”. [175] This is a variant of the technique known as hashing. An algorithm produces a hash on the basis of the characteristics of a digital file. [176] A hash (a numeric code) and fingerprints are unique representations of files. [177]
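The matching step can be illustrated with a minimal sketch. This is a hypothetical illustration using a generic cryptographic hash rather than YouTube’s proprietary perceptual fingerprinting, and the reference database and file contents are invented for the example.

    import hashlib

    # Hypothetical reference database: hashes of files notified as infringing,
    # mapped to the work they are said to represent.
    REFERENCE_HASHES = {
        hashlib.sha256(b"bytes of a notified copyright-protected recording").hexdigest(): "Track A",
    }

    def check_upload(upload: bytes) -> str | None:
        """Return the matched reference work, or None if the upload matches nothing."""
        digest = hashlib.sha256(upload).hexdigest()
        return REFERENCE_HASHES.get(digest)

    # A byte-for-byte identical re-upload is flagged as matching "Track A".
    print(check_upload(b"bytes of a notified copyright-protected recording"))
    # Any other upload passes through unflagged (None).
    print(check_upload(b"an original home video"))

Deployed fingerprinting systems compare perceptual features rather than raw bytes, but the decision logic, comparing a computed identifier against a reference database, follows the same outline.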

4.2. Current incapacity according to the state of the art

61

In the current state of the art, these technologies are not sufficiently able to distinguish legal content from illegal content. [178] I broadly distil three limitations. First of all, the technologies do not detect illegal content at all times. When a technology uses hashes to find matches, the hash of a piece of content results from algorithmic computation of that piece. [179] When there is a slight difference with regard to the original “illegal” content file, the computation produces a completely different hash; it does not detect the illegality. In relation to hate speech, it has, for example, been found that algorithms can easily be “manipulated” by using misspelled words. [180]
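This brittleness can be shown in a few lines. The sketch below uses a generic cryptographic hash and invented example strings; deployed systems rely on perceptual fingerprints that tolerate some variation, but the underlying point that small changes can defeat exact matching still holds.

    import hashlib

    original = b"text of a post previously found to be illegal"
    variant  = b"text of a p0st previously found to be illegal"  # one character altered

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(variant).hexdigest())
    # The two digests bear no resemblance to each other, so an exact-match
    # filter keyed on the first hash will miss the trivially altered re-upload.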

62

A second aspect that makes technologies “imperfect” is that they do not understand context. For several types of illegal content, its illegality is context-dependent. Even if the technologies are able to adequately identify content that matches an illegal aspect (e.g., it contains copyright protected aspects or contains the text of a post found to be illegal), they are not sophisticated enough to (at all times) determine its illegality in the specific context. [181] In the context of copyright, the protected material can be used as a quotation or a parody. This would legitimise its use (in line with Article 5 Copyright Directive). For the assessment of hate speech, context is equally essential.

63

Thirdly, these technologies largely depend on reference files. [182] When the quality of the reference files is not assured, for example because a piece of content initially identified as illegal turns out not to be, the use of the technology will not result in the desired detection. [183] The combination of these limitations can be problematic because online content-sharing platforms fear liability when illegal content is not adequately removed. I describe in the next section how this can lead to over-blocking.

4.3. Risk of over-blocking because of the incentive to block excessively

64

The automated filtering technologies depend on parameters that are designed and determined beforehand by human decision. [184] These parameters influence the scope of the content that is deemed to be illegal. In other words: online content-sharing platforms can design the technologies as they wish. In the area of copyright law, it has repeatedly been found that the fear of liability causes online content-sharing platforms to block excessively. [185] Online content-sharing platforms aim, in essence, for profit maximisation. [186] Therefore, they want to minimise the risk of liability. [187] On the other hand, the revenue of online content-sharing platforms through advertisements increases when more content is uploaded. The online content-sharing platforms thus have to navigate between legal (liability) interests and factual interests.

65

Currently, the legal framework producing the obligations at issue does not create a ground for liability when too much content is blocked, but it does when too little content is blocked. [188] In light of the impreciseness of the technologies at the platform’s disposal and because more advanced technologies are expensive, it is likely that online platforms wishing to avoid liability design their technologies to block everything that is potentially illegal. [189] The risk of extensive blocking when online content-sharing platforms use automated technologies has been extensively discussed in relation to the Article 17-framework. [190] The extensive blocking in combination with the imprecise technologies creates a risk of over-blocking. [191] Legal content is unjustifiably blocked. The user that uploaded legal content is thus not able to express themselves and other users are denied access to this information. It is this risk that caused Poland to bring an action for annulment against Article 17.

5. Case C-401/19 Poland v. Parliament and Council

66

Poland claimed before the CJEU that Article 17, and specifically Article 17(4)(b) and (c), should be annulled, because they de facto require OCSSPs to use automatic filtering technologies to carry out preventive monitoring of all content, which constitutes an infringement of the right to freedom of expression and information (Article 11 Charter). [192] The CJEU ruled on 26 April 2022 that the liability regime of Article 17 survives, but emphasised the strict application of safeguards to protect users’ freedom of expression in the case of filtering obligations. [193]

67

The CJEU’s judgment provides general insights into the question of what filtering is permissible in the case of de facto obligations to carry out a prior review of content to detect specific illegal content. [194] In this section, I describe what the CJEU concluded on the compatibility of these obligations with the users’ freedom of expression and the prohibition of a general monitoring obligation. I refer, where relevant, to the discussion in academic literature.

5.1. A limitation of the right to freedom of expression

68

The CJEU assessed whether the liability framework of Article 17 is compatible with Article 11 Charter. First, the CJEU had to decide whether there is a limitation of users’ right to freedom of expression. If so, this limitation can be justified if it is provided for by law, respects the essence of that right and is proportionate in view of the other interests at stake (Article 52(1) Charter). [195] The CJEU importantly decided to review Article 17 in its entirety. [196]

69

As a first point of departure, the CJEU clarified that, following case law of the ECtHR (Vladimir Kharitonov v. Russia), the internet is now a ‘principal means’ by which individuals express themselves and communicate. [197] Online content-sharing platforms therefore play an important role in “enhancing the public’s access to news and [facilitating] the dissemination of information (…) providing an unprecedented platform for the exercise of freedom of expression and information”. [198] As a second point, the CJEU confirmed that the liability regime of Article 17 de facto requires these platforms to carry out a prior review of content that users wish to upload. In line with the A-G, the CJEU concluded that to be able to carry out this prior review, according to the current state of the art, online content-sharing platforms need to resort to automatic filtering technologies. [199] These two considerations led the CJEU to conclude that such prior filtering restricts an important means of disseminating online content and therefore constitutes a limitation of Article 11 Charter. [200] The judgment is directed at the EU legislature; the limitation is a direct consequence of the regime laid down by the EU legislature in Article 17 DSM-directive. [201]

70

The CJEU continued to examine whether the limitation is justified. The limitation results from the obligations of Article 17(4)(b) and (c), provisions of an EU act, and is therefore provided for by law. [202] Furthermore, according to the CJEU, the limitation at issue respects the essence of the right to freedom of expression and information. [203] The CJEU considered that Article 17(7) and (9), read together with recitals 66 and 70 DSM-directive, constitute an obligation of result to ensure that the efforts of the online content-sharing platforms do not result in the unavailability of ‘lawful works’. [204] In this way, according to the CJEU, the Directive reflects the CJEU’s line of case law (UPC Telekabel Wien) that measures adopted by service providers for effective copyright protection should not affect lawfully posted content. [205]

5.2. Proportionality

71

The most refined part of the CJEU’s judgment is its assessment of the proportionality of the filtering obligations at hand. Following Article 52(1) Charter, the limitation must 1) protect the rights and freedoms of others [206], 2) be necessary and 3) be proportionate. [207] The obligations at issue protect the right to intellectual property (Article 17(2) Charter). [208] Moreover, these obligations are necessary to protect this right, because a less restrictive measure would not be ‘as effective’. [209]

72

The question is whether the obligations disproportionately restrict the right to freedom of expression of the users. The CJEU held that they do not and formulated six requirements for filtering systems. [210] I discuss the four requirements most relevant to this article here. [211]

73

First of all, the CJEU recognised the need for strict safeguards in the case of automated processing to counter the risk such processing poses to the freedom of expression of the users. [212] These strict safeguards lie in the “clear and precise limit” following from Article 17(7) and (9) and recitals 66 and 70. According to the CJEU, this means that measures which filter and block lawful content upon upload are excluded. [213] In other words, the CJEU said that imprecise filters cannot be used to comply with the filtering obligations.

74

Second, the CJEU considered that the liability regime operates on the condition that rightsholders provide the platform with “undoubtedly relevant and necessary information” with regard to the content concerned. The need for such substantiated notices protects the interests of users who lawfully use the services, since the platforms will not block content where such information has not been provided. [214]

75

Third, and in relation thereto, the CJEU affirmed that the obligation on OCSSPs should not result in a general monitoring obligation (Article 17(8) DSM-directive; Article 15(1) E-Commerce Directive). [215] Interestingly, the CJEU described this as an “additional safeguard” for the observance of the users’ freedom of expression. [216] It means, according to the CJEU, that OCSSPs cannot be required to prevent the upload of content which, in order to be found illegal, would require an “independent assessment of the content” by the platforms. The CJEU referred by analogy to Glawischnig-Piesczek, thereby suggesting that similar arguments exist in other cases where the prohibition of a general monitoring obligation applies. [217] In paragraph 91, the CJEU clarified that this means that in some cases illegal content cannot be prevented and can only be taken down after notification by a rightholder. In line with YouTube and Cyando, these notifications should enable the platform to establish that the content at issue is illegal “without a detailed legal examination”. [218]

76

These considerations show that the CJEU wished to emphasise that lawful content must not be blocked ex ante, but also that the OCSSP cannot be required to carry out an independent assessment of illegality. It seems that the CJEU wanted to clarify that automatic filtering technologies should only be used if there is enough information, e.g. specified in a notice, to specifically target the illegal content and prevent its upload.

77

Fourth, the CJEU’s last relevant requirement for a ‘proportionate filtering system’ concerns the procedural safeguards contained in Article 17(9). The requirement of effective and expeditious complaint-and-redress mechanisms and out-of-court redress mechanisms is considered sufficient to tackle any remaining blocking of legal content (over-blocking). [219] In addition to the ex ante safeguards mentioned above, users must thus have the actual ability to fight over-blocking of their content. [220]

6. General take-aways for ex ante content moderation obligations outside copyright

78

In this article, I attempt to map the legal implications of the CJEU’s ruling in C-401/19 on Article 17 DSM-directive for the more general regime of de facto obligations on online content-sharing platforms under EU law to act against illegal content ex ante. The CJEU ruled that the stay-down obligation in Article 17 respects the freedom of expression, as long as strict safeguards are taken into account. As we have seen, online content-sharing platforms can also be under an obligation to prevent the upload of certain illegal content outside the area of copyright law. I compare the different obligations in this concluding section. I first set out for which obligations the ruling could be relevant. Second, I address where differences between the specific regime of Article 17 DSM-directive and other obligations to moderate content ex ante limit the comparison. I end with an inventory of the general take-aways.

6.1. Obligation to de facto carry out a prior review of content limits freedom of expression

79

In the C-401/19 judgment, the CJEU confirmed that the Article 17 framework de facto obliges online content-sharing platforms to carry out a prior review of user-uploaded content. The rest of its judgment is centred around the compatibility of this obligation with the freedom of expression as laid down in Article 11 of the Charter. That makes the judgment relevant for de facto obligations, imposed by European and national public institutions on online content-sharing platforms, to carry out a prior review of content in order to detect illegal content outside the copyright realm. [221] I elaborated on the existence of these obligations in Section 3. These obligations, such as those under the Terrorist Regulation and those following from injunctions under the DSA in line with Glawischnig-Piesczek, only take up a small part of the content moderation responsibilities of online content-sharing platforms. [222]

80

As a first point, the CJEU acknowledges that online content-sharing platforms need to use automatic filtering technologies to carry out the required prior review. As a second point, the CJEU acknowledges the importance of the Internet for the exercise of the right to freedom of expression. Prior automatic filtering constitutes a limitation of this right. In Section 4, I explained that the use of these technologies is inevitable when these platforms face liability if they do not prevent the upload of certain illegal content. The same conclusion therefore holds true for other de facto obligations to carry out a prior review of content: the required filtering limits the freedom of expression. The justification of this limitation requires careful examination. [223]

81

The CJEU’s ruling is primarily situated within the copyright content moderation triangle between platform, rightsholder and user (see Figure 2). [224] But the CJEU confirms that requiring online content-sharing platforms to filter ex ante restricts an important means of disseminating content online. Its conclusions should therefore be considered against the background of the content moderation triangle (see Figure 1) between platform, parties with protected interests and user. By protecting such interests through the removal of allegedly illegal content, the platform limits the freedom of expression of the user that uploaded that content. Geiger and Jütte call this the “constitutional dimension” of the CJEU’s ruling. [225]

6.2. No one-to-one comparison

82

The CJEU demands strict safeguards for the required prior filtering. [226] However, it must be understood that the proportionality assessment of a limitation to a fundamental right predominantly entails assessing whether the act limiting that right can be balanced against the other rights and freedoms it seeks to protect. [227] This limits the extent to which the safeguards described in Case C-401/19 apply equally outside copyright. For the obligation following from Article 17 DSM-directive, the CJEU noted that it seeks to protect intellectual property (Article 17(2) Charter), more specifically copyright. [228] The other obligations discussed do not protect intellectual property. They aim to protect public security (terrorist content) or private life (hate speech). This influences the construction of the proportionality review.

83

Furthermore, the “stay-down” obligation in Article 17 is, up to now, the only obligation known under EU law that requires an ex ante review by online content-sharing platforms on the basis of information provided by private parties (rightsholders). The other obligations discussed, such as those following from court orders (Article 9 DSA, Glawischnig-Piesczek) and those of the Terrorist Regulation (Article 5(2) and (4)), require an ex ante review on the basis of information about assessed illegal content provided by a public authority. Consequently, the information on which the platform has to act, and on which the automatic filtering is based, differs in nature. This presumably influences the proportionality test as well.

6.3. Take-aways: no general monitoring, no imprecise filters and ex post safeguards

84

Taking into account the nuances made in Section 6.2, the relevant conclusions of the CJEU on the justification of prior automated filtering in light of the freedom of expression can be summarised as follows. They can be read as guidance to public authorities (such as the EU legislature or national courts) when formulating de facto obligations to carry out a prior review, such as those under the DSA and the Terrorist Regulation, in line with users’ freedom of expression.

85

The obligation at issue limiting the freedom of expression must be provided for by law. The act permitting the limitation must itself define the scope of this limitation. [229] The CJEU considered that the limitation at issue respects the essence of the freedom of expression, because the act itself (Article 17(7) and (9)) prescribes that the exercise of the obligation must be strictly targeted at illegal (copyright infringing) content. [230] This implies that de facto obligations to carry out a prior review must be strictly confined so as not to affect lawful content. [231]

86

Unfortunately, as described in Section 4, this is hard to achieve. The CJEU probably saw this too and prescribes safeguards to ensure that the obligation is a proportionate limitation of the freedom of expression. The CJEU seems to do a little ‘trick’ here (and this is what makes the case so interesting): it places the prohibition of a general monitoring obligation at the heart of the proportionality test. That is, online content-sharing platforms cannot be required to prevent the upload of content if that means they would first need to independently assess its illegality (in the spirit of Glawischnig-Piesczek). [232]

87

As we have seen, the prohibition of a general monitoring obligation remains very much alive. Filtering should thus be targeted. Under Article 17 DSM-directive, this can be achieved through precise information provided by rightsholders. For the other obligations discussed, under the DSA and the Terrorist Regulation, the prohibition continues to apply and will require that the order to act contains sufficient information to establish that the content concerned is unmistakably illegal. [233]

88

Additionally, the CJEU emphasises that a platform can only be obliged to filter when the automatic technologies are precise, in the sense that they do not block lawful content. [234] The required automatic filtering should thus be constrained. Still, given the imprecision of these technologies, the CJEU considers it relevant to emphasise the need for effective ex post safeguards such as complaint and redress mechanisms. [235]

89

These considerations demonstrate that obligations on platforms to carry out a prior review which require them to use automatic filtering technologies should be carefully targeted. Platforms should only be required to use automatic technologies for very specific and clear-cut cases. Nevertheless, it is good to remember that the need to protect certain public interests, such as protecting the public against terrorism, might permit broader filtering; the proportionality test might balance out that way. However, the CJEU in C-401/19 has made clear that the filtering must be accompanied by “effective and expeditious” ex post mechanisms.

7. Concluding remarks

90

This article has shown that the fundamental importance of the freedom of expression and information of the users of the internet needs to be taken seriously when addressing illegal content online, both inside and outside the area of copyright law. In C-401/19, the CJEU gives an insight into what that actually means for ex ante content moderation obligations on online content-sharing platforms. Requiring online content-sharing platforms to prevent the upload of certain illegal content de facto requires them to use automatic filtering technologies. The CJEU treats the prohibition of a general monitoring obligation as a safeguard of the freedom of expression. Consequently, online content-sharing platforms should only block content that is clearly illegal, and automatic filtering technologies should be limited to such content too. For authorities establishing de facto obligations to carry out a prior review under the DSA and the Terrorist Regulation, as discussed in this article, Case C-401/19 shows the need to take the freedom of expression of internet users into consideration and provides starting points for designing such obligations in a strictly targeted way. Consequently, Case C-401/19 can have implications outside the area of copyright when used to assess whether the legal frameworks of the DSA and the Terrorist Regulation could survive the CJEU’s test.

 

*by Willemijn Kornelius, Legal Research master student at Utrecht University and Civil Law (Intellectual Property) graduate from Leiden University. The author would like to thank Dr. Vicky Breemen, Assistant Professor at Utrecht University, for her helpful guidance in the course of this work and her comments on earlier versions of this article, and the researchers of the Centre of Private Governance (CEPRI) of the University of Copenhagen for the opportunity to visit their centre as a visiting researcher and for the fruitful discussions that contributed to this article.

 

 



[1] E.g. S Kulk, ‘Internet Intermediaries and Copyright Law. Towards a Future-proof EU Legal Framework’ (PhD-thesis, Utrecht University 2018) 56; K Erickson and M Kretschmer, ‘Empirical approaches to Intermediary Liability’ in: G Frosio (ed), The Oxford Handbook of Intermediary Liability Online (OUP 2020), also accessible < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3400230 > accessed 5 February 2023, 2.

[2] Vladimir Kharitonov v. Russia App no. 10795/14 (ECtHR, 23 June 2020) [33], as laid down in Article 10 ECHR.

[3] Definition partly derived from the definition given in the Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act) (“DSA”) article 3 (h).

[4] Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final 1.

[5] The European legislature wanted to create a technology-neutral regulatory environment: Commission, ‘Online Platforms and the Digital Single Market. Opportunities and Challenges for Europe (Communication)’ COM (2016) 288 final, 7-8; MRF Senftleben and C Angelopoulos, ‘The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market’ (Study for IViR & CIPIL 2020) 2.

[6] M Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ available at https://ssrn.com/abstract=3784149 accessed on 5 February 2023, 7.

[7] Erickson and Kretschmer (n 1) 2; JP Quintais and SF Schwemer, ‘The Interplay between the Digital Services Act and Sector Regulation: How Special Is Copyright?’ (2022) 13 European Journal of Risk Regulation 191, 192.

[8] See already in recital 59 of Directive 2001/29/EC of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society [2001] OJ L 167/10 (“Copyright Directive”); Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final [2] and [3].

[9] E.g.: Case C-314/12 UPC Telekabel Wien [2014] ECLI:EU:C:2014:192; Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771.

[10] Directive (EU) 2019/790 of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC OJ L 130/92 (“DSM-Directive”).

[11] During the adoption procedure, the article was known as the “meme ban”. See for example: M. Reynolds, ‘What is article 13? The EU’s divisive new copyright plan explained’, WIRED 24 May 2019, <wired.co.uk/article/what-is-article-13-article-11-european-directive-on-copyright-explainedmeme-ban> accessed on 21 June 2022. And see further on this controversial article: A Metzger and M Senftleben, ‘Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society’ (2020) 2 JIPITEC 115, 115; JP Quintais ea, ‘Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations from European Academics’ (2019) 3 JIPITEC 277, 277; Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 6-7.

[12] It can also refer to ex ante safeguards: meaning that safeguards have to be implemented before content is uploaded.

[13] Case C-401/19 Poland v Parliament and Council [2022], Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613, para 142; L Fiala and M Husovec, ‘Using experimental evidence to improve delegated enforcement’ (2022) 71 International Review of Law & Economics, 1; C Geiger and BJ Jütte, ‘Platform liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match’ (2021) 70 GRUR International 517, 546.

[14] This freedom is protected by Article 10 ECHR and Article 11 Charter of Fundamental Rights of the European Union 2012/C 362/02 (“Charter”).

[15] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 54.

[16] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 85-86, 90-92.

[17] Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act) (“DSA”).

[18] F Reda, ‘Wieviel Automatisierung verträgt die Meinungsfreiheit?’ 2 May 2022, https://verfassungsblog.de/wieviel-automatisierung-vertragt-die-meinungsfreiheit/ accessed 5 February 2023; A Peukert ea, ‘European Copyright Society – Comment on Copyright and the Digital Services Act Proposal’ (2022) 53 IIC 370; JP Quintais, ‘Between filters and fundamental rights. How the Court of Justice saved Article 17 in C-401/19 – Poland v. Parliament’ 16 May 2022, <https://verfassungsblog.de/filters-poland/> accessed 5 February 2023; JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 126.

[19] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 126.

[20] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 126ff; Peukert ea (n 18) 358. See also recitals 9-11 DSA.

[21] The definition “information society service providers” was already introduced in Directive 98/34/EC and Directive 98/84/EC (Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (“E-Commerce Directive”), recital 17) and repeated in the E-Commerce Directive, recital 17 and Article 2(a) and (b).

[22] The definition is thus broader than ‘content-sharing platforms’; it also comprises e.g. ‘online marketplaces’: see J Barata ea, ‘Unravelling the Digital Services Act package’ (IRIS Special 2021-1, European Audiovisual Observatory 2021) 32.

[23] See also: Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final [14] and Article 4(b).

[24] Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final [1]-[2]; recital 12 DSA.

[25] Code of Conduct on Countering Illegal Hate Speech Online, < https://ec.europa.eu/info/files/code-conduct-countering-illegal-hate-speech-online_en > accessed 5 February 2023, 1.

[26] M Rojszczak, ‘Online content filtering in EU law – A coherent framework or jigsaw puzzle?’ (2022) 47 Computer Law & Security Review.

[27] Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final [2].

[28] E.g. R Gorwa, ‘What is platform governance’ (2019) 22 Information, Communication & Society 854, 856; CS Petersen, VG Ulfbeck and O Hansen, ‘Platforms as Private Governance Systems – The Example of Airbnb’ [2018] NJCL 38.

[29] Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 7. Mechanisms such as: Article 17 DSM-directive, Article 12-15 E-Commerce Directive, the Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final, DSA; Regulation (EU) 2021/784 of 29 April 2021 on addressing the dissemination of terrorist content online [2021]: Fiala and Husovec (n 13) 12.

[30] This definition is partly derived from Kulk’s definition of “liability of online intermediaries”, Kulk (n 1) 7.

[31] See on liability of online platforms: Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6).

[32] A Kuczerawy, ’From ‘Notice and Takedown’ to ‘Notice and Staydown’: Risks and Safeguards for Freedom of Expression’ in: G Frosio (ed), Online Intermediary Liability (OUP 1st edn 2020), 526.

[33] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 30.

[34] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 29. It refers to R Gorwa, ‘The Shifting Definition of Platform Governance’ (Centre for International Governance Innovation 23 October 2019) < https://www.cigionline.org/articles/shifting-definition-platform-governance/ > accessed 5 February 2023.

[35] See e.g. Quintais and Schwemer, (n 7).

[36] Kuczerawy (n 32) 524-543; Kulk (n 1) 115.

[37] I elaborate on the need to use automatic filtering technologies in section D.

[38] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 31; Gorwa, ‘What is platform governance’ (n 28) 859.

[39] Article 3(t) DSA: “‘content moderation’ means the activities, whether automated or not, undertaken by providers of intermediary services aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account”

[40] Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 3 and 7.

[41] Fiala and Husovec also recognise the main types of “players”: Fiala and Husovec (n 13) 5. About this from a U.S. perspective: K Klonick, ‘The New Governors: The People, Rules, and Processes Governing Online Speech’, (2018) 131 Harv L Rev 1598, 1662.

[42] For the rest of the article, I use “freedom of expression” to refer to the “right to freedom of expression and to receive and impart information and ideas in an open democratic society” of users of online content-sharing platforms. This right is protected by both Article 11 Charter and Article 10 ECHR, but since I focus on the EU law context, Article 11 Charter is mentioned.

[43] Article 1 ECHR makes that clear for the rights and freedoms enshrined in the ECHR.

[44] Article 51(1) Charter. See on the broad interpretation of ‘implementing EU law’: J-P Jacqué, ‘The Charter of Fundamental Rights and the Court of Justice of the European Union: A First Assessment of the Interpretation of the Charter’s Horizontal Provisions’ in: LS Rossi and F Casolari (eds), The EU after Lisbon (Springer International Publishing 2014) 139.

[45] This is done because voluntary content moderation concerns the horizontal relation between a platform and its users. The Charter could potentially be of importance for this relation too, but that falls outside the scope of this article. See on such ‘horizontal application’ of the Charter: Jacqué (n 44) 149.

[46] M Husovec, Injunctions against Intermediaries in the European Union (Cambridge University Press 2017) 50.

[47] Commission, ‘Online Platforms and the Digital Single Market Opportunities and Challenges for Europe (Communication)’ COM (2016) 288 final 9.

[48] Commission, ‘Online Platforms and the Digital Single Market Opportunities and Challenges for Europe (Communication)’ COM (2016) 288 final 8-9. See also: G Frosio and C Geiger, ‘Taking Fundamental Rights Seriously in the Digital Services Act’s Platform Liability Regime’ [2022] European Law Journal (forthcoming), https://www.researchgate.net/publication/350320059_Taking_Fundamental_Rights_Seriously_in_the_Digital_Services_Act%27s_Platform_Liability_Regime_European_Law_Journal_2022_forthcoming?enrichId=rgreq-e145523290ca326851f87a780b0646f8-XXX&enrichSource=Y292ZXJQYWdlOzM1MDMyMDA1OTtBUzoxMDA3MTYwODI5OTQzODEwQDE2MTcxMzcyNjI3MTA%3D&el=1_x_2&_esc=publicationCoverPdf > accessed 3 February 2023.

[49] Frosio and Geiger (n 48) 11

[50] E-Commerce Directive, recital 5; Commission, ‘Proposal for a directive on certain legal aspects of electronic commerce in the internal market’ COM(1998) 586 final (Explanatory Memorandum) 6; Kulk (n 1) 104. The uncertainty about the legal position was seen as one of the obstacles to a “genuine single market for electronic commerce” for cross-border online services: Commission, ‘Proposal for a directive on certain legal aspects of electronic commerce in the internal market’ COM(1998) 586 final (Explanatory Memorandum) 12.

[51] Senftleben and Angelopoulos (n 5) 6. Creating grounds for intermediary liability is therefore mostly left to the Member States themselves. Article 8(3) Copyright Directive and Article 11 Enforcement Directive are exceptions, see further Section C.II.

[52] See Kulk (n 1) 104 and Barata ea (n 22) 21.

[53] This section discusses the E-Commerce Directive. These rules still apply until the DSA has fully entered into force, which will be on 17 February 2024 (Article 93 DSA).

[54] Kulk (n 1) 105.

[55] Kulk (n 1) 111; Barata ea (n 22) 5; Senftleben and Angelopoulos (n 5) 6; Husovec, Injunctions against Intermediaries in the European Union (n 46) 52; and case law: Case C-236/08-238/08 Google France and Google v Louis Vuitton [2010] ECLI:EU:C:2010:159, para 114; Case C-324/09 L’Oréal v eBay [2011] ECLI:EU:C:2011:474, paras 109-110 and Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, para 27.

[56] Article 14(1)(a) and (b) E-Commerce Directive.

[57] E-Commerce Directive, recital 42; Case C-324/09 L’Oréal v eBay [2011] ECLI:EU:C:2011:474, para 113.

[58] Husovec, Injunctions against Intermediaries in the European Union (n 46) 53.

[59] Husovec, Injunctions against Intermediaries in the European Union (n 46) 53.

[60] Kulk (n 1) 115. For example: Finland has a statutory notice-and-takedown regime for copyright infringements; the Netherlands has adopted a code of conduct.

[61] Senftleben and Angelopoulos (n 5) 7.

[62] Senftleben and Angelopoulos (n 5) 7.

[63] Kuczerawy (n 32) 524-543; Senftleben and Angelopoulos (n 5) 7-8.

[64] Case C-324/09 L’Oréal v eBay [2011] ECLI:EU:C:2011:474, para 139.

[65] Case C-324/09 L’Oréal v eBay [2011] ECLI:EU:C:2011:474, para 141.

[66] Barata ea (n 22) 9.

[67] Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, para 34; Case C-360/10 SABAM v. Netlog [2012] ECLI:EU:C:2012:85, paras 29-30.

[68] Case C-360/10 SABAM v. Netlog [2012] ECLI:EU:C:2012:85, para 26; Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, para 29.

[69] Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, para 38.

[70] Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, paras 39-40; Case C-360/10 SABAM v. Netlog [2012] ECLI:EU:C:2012:85, para 38.

[71] Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, paras 41-45; Case C-360/10 SABAM v. Netlog [2012] ECLI:EU:C:2012:85, para 39.

[72] Case C-70/10 Scarlet Extended / SABAM [2011] ECLI:EU:C:2011:771, para 52; Case C-360/10 SABAM v. Netlog [2012] ECLI:EU:C:2012:85, paras 50-51.

[73] Case C-484/14 McFadden v. Sony [2016] ECLI:EU:C:2016:689.

[74] Case C-484/14 McFadden v. Sony [2016] ECLI:EU:C:2016:689, para 87.

[75] Case C-484/14 McFadden v. Sony [2016] ECLI:EU:C:2016:689, para 93.

[76] Senftleben and Angelopoulos (n 5) 12.

[77] Senftleben and Angelopoulos (n 5) 12 and footnote 34.

[78] Senftleben and Angelopoulos (n 5) 13; JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 80; Kuczerawy (n 32) 540.

[79] Case C-484/14 McFadden v. Sony [2016] ECLI:EU:C:2016:689, para 87.

[80] C Rauchegger and A Kuczerawy, ‘Court of Justice Injunctions to remove illegal online content under the eCommerce Directive: Glawischnig-Piesczek’ (2020) 57 Common Market Law Review 1496.

[81] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, para 12.

[82] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, para 34

[83] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, para 35.

[84] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, paras 36-38.

[85] Rauchegger and Kuczerawy (n 80) 1504.

[86] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, paras 43-44.

[87] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, paras 45-46.

[88] See e.g.: JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 81; F Reda, ‘Wieviel Automatisierung verträgt die Meinungsfreiheit?‘ 2 May 2022, https://verfassungsblog.de/wieviel-automatisierung-vertragt-die-meinungsfreiheit/ accessed 5 February 2023.

[89] Senftleben and Angelopoulos (n 5) 14.

[90] See e.g. D Keller, ‘Facebook Filters, Fundamental Rights, and the CJEU’s Glawischnig-Piesczek Ruling’ (2020) 69 GRUR International 616, 620. She suggests that this ruling prohibits the obligation on online platforms to employ human reviewers.

[91] Senftleben and Angelopoulos (n 5) 15.

[92] Quintais and Schwemer (n 7) 191.

[93] The Enforcement Directive also applies, but will be left outside the scope of this analysis.

[94] Directive 2001/29/EC of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society [2001] OJ L 167/10 (“Copyright Directive”).

[95] See e.g. its most recent case: Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503.

[96] Copyright Directive, recital 23-24.

[97] Copyright Directive, recital 5 together with recital 23. This broad interpretation has been confirmed by the CJEU in its case law: Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 63. Some constitutive case law of the CJEU on the concept “communication to the public”: Joined Cases C-403/08 and C-429/08 Football Association Premier League and Others [2011] ECLI:EU:C:2011:631; Case C-466/12 Svensson [2014] ECLI:EU:C:2014:76; Case C-160/15 GS Media ECLI:EU:C:2016:644, paras 35 and 37 (concepts ‘communication’ and ‘public’); Case C-527/15 Brein/Filmspeler [2017] ECLI:EU:C:2017:300.

[98] Case C-610/15 Stichting Brein [2017] ECLI:EU:C:2017:456, para 34. See i.a. about this DJG Visser, ‘YouTube and Cyando. Auteursrecht en platformaansprakelijkheid’ [2021] AA 1022, 1023.

[99] Recital 12 DSA.

[100] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 60.

[101] Case C-610/15 Stichting Brein [2017] ECLI:EU:C:2017:456; Case C-160/15 GS Media [2016] ECLI:EU:C:2016:644.

[102] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 68; Case C-610/15 Stichting Brein [2017] ECLI:EU:C:2017:456, paras 36-37.

[103] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 68.

[104] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 81.

[105] “inter alia, the circumstance that such an operator, despite the fact that it knows or ought to know, in a general sense, that users of its platform are making protected content available to the public illegally via its platform, refrains from putting in place the appropriate technological measures that can be expected from a reasonably diligent operator in its situation in order to counter credibly and effectively copyright infringements on that platform, and the circumstance that that operator participates in selecting protected content illegally communicated to the public, that it provides tools on its platform specifically intended for the illegal sharing of such content or that it knowingly promotes such sharing, which may be attested by the fact that that operator has adopted a financial model that encourages users of its platform illegally to communicate protected content to the public via that platform.”

[106] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 83; DJG Visser, ‘YouTube and Cyando. Auteursrecht en platformaansprakelijkheid’ [2021] AA 1022, 1026.

[107] Kulk (n 1) 103.

[108] C Angelopoulos, ‘European Intermediary Liability in Copyright. A Tort-Based Analysis’(PhD-thesis, University of Amsterdam 2016) 61.

[109] Kulk (n 1) 103. Article 11 Enforcement Directive repeats this obligation for Member States and expands it to all intellectual property rights infringements.

[110] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 79-80; Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 134.

[111] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 131.

[112] C Angelopoulos, ‘YouTube and Cyando, Injunctions against Intermediaries and General Monitoring Obligations: Any Movement?’ Kluwer Copyright Blog 9 August 2021, http://copyrightblog.kluweriplaw.com/2021/08/09/youtube-and-cyando-injunctions-against-intermediaries-and-general-monitoring-obligations-any-movement/ > accessed 3 February 2023.

[113] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 135.

[114] Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 138.

[115] C Angelopoulos, ‘YouTube and Cyando, Injunctions against Intermediaries and General Monitoring Obligations: Any Movement?’ Kluwer Copyright Blog 9 August 2021, http://copyrightblog.kluweriplaw.com/2021/08/09/youtube-and-cyando-injunctions-against-intermediaries-and-general-monitoring-obligations-any-movement/ > accessed 3 February 2023; JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278> accessed 31 January 2023.

[116] Commission, ‘Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market (Communication)’ COM(2021) 288 final (“Guidance”).

[117] See DSM-directive, recital 61.

[118] E Rosati, ‘Five considerations for the transposition and application of Article 17 of the DSM Directive’, (2021) 16 Journal of Intellectual Property Law & Practice 265 < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3793056 > accessed 3 February 2023; M Husovec and JP Quintais, ‘How to license Article 17? Exploring the Implementation Options for the New EU Rules on Content-Sharing Platforms’ (2021) 4 GRUR International 325, 327.

[119] Emphasis added.

[120] Read in connection with recitals 62 and 63. Article 17(6) determines that this liability framework does not apply to new OCSSPs with an annual turnover below EUR 10 million.

[121] A lot has been written about Article 17, previously known as Article 13, both before and after its entry into force. The discussion continues today, because Member States are struggling with the implementation of the article into their national laws. See e.g.: Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6); F Reda, ‘Filtered Futures Conference: Exploring The Fundamental Rights Constraints of Automated Filtering After the CJEU Ruling on Article 17’ Kluwer Copyright Blog 17 June 2022 < http://copyrightblog.kluweriplaw.com/2022/06/17/filtered-futures-conference-exploring-the-fundamental-rights-constraints-of-automated-filtering-after-the-cjeu-ruling-on-article-17/ > accessed 5 February 2023.

[122] See e.g.: MRF Senftleben ea, ‘The Recommendation on Measures to Safeguard Fundamental Rights and the Open Internet in the Framework of the EU Copyright Reform’ (2018) 40 EIPR 149, 151 and 159; Metzger and Senftleben (n 11) 120.

[123] See e.g. F Reda, J Selinger and M Servatius, ‘Article 17 of the Directive on Copyright in the Digital Single Market: a Fundamental Rights Assessment’ (Study Gesellschaft für die Freiheitsrechte 2020), < https://ssrn.com/abstract=3732223 > accessed 5 February 2023, 4 and 13ff.

[124] Commission, ‘Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market (Communication)’ COM(2021) 288 final (“Guidance”), 2 and 20. See also DSM-directive, recital 66.

[125] See e.g. Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6).

[126] DSM-directive, recital 70.

[127] Barata ea (n 22) 24ff; I Buri and J van Hoboken, ‘The Digital Services Act (DSA) proposal: a critical overview’ (Discussion paper IViR/DSA Observatory 28 October 2021), 5.

[128] See for an overview of the different regulatory measures: Barata (n 22) 30-31.

[129] Directive 2011/92 of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography [2011]; Regulation (EU) 2021/784 of 29 April 2021 on addressing the dissemination of terrorist content online [2021], Article 21; Commission, ‘Proposal for a Regulation laying down rules to prevent and combat child sexual abuse’ COM(2022) 209 final (“CSA Proposal”).

[130] Directive 2010/13 of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive), as amended by Directive 2018/1808 [2018].

[131] Commission, ‘Tackling Illegal Content Online. Towards an enhanced responsibility for online platforms’(Communication)’ COM (2017) 555 final 13; Commission, ‘Recommendation on measures to effectively tackle illegal content online’, C(2018) 1177 final.

[132] Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act).

[133] Commission, ‘Shaping Europe’s Digital Future’ COM (2020) < https://ec.europa.eu/commission/presscorner/detail/en/fs_20_278 > accessed 5 February 2023, 6; recitals 3-5 DSA; Buri and Van Hoboken (n 127) 4.

[134] Barata ea (n 22) 33; Buri and Van Hoboken (n 127) 8.

[135] Barata ea (n 22) 11.

[136] Recital 41 DSA.

[137] Barata ea (n 22) 32.

[138] The figure is based on the different provisions and recitals of the applicable legislative instruments.

[139] Article 16(3) determines that such a notice gives rise to the ‘actual knowledge or awareness’ meant in Article 6 DSA, which means that the online platform is no longer exempted from liability if it does not act expeditiously to remove or disable access to the content. This concerns an obligation to act ex post (after content is already uploaded) and will thus remain outside the scope of this article.

[140] Recital 22 DSA. It is not immediately clear whether this provision is a ‘due-diligence’-obligation or complements the liability rules. This issue however remains outside the scope of this article, see on this Quintais and Schwemer (n 7) 211.

[141] Proposal for a DSA (Explanatory Memorandum DSA) page 3-4.

[142] Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, para 46.

[143] Buri and Van Hoboken (n 127); EDRI, ‘Delete first, think later’ 24 March 2021 < https://edri.org/our-work/delete-first-think-later-dsa/ > accessed 5 February 2023.

[144] Regulation (EU) 2021/784 of 29 April 2021 on addressing the dissemination of terrorist content online [2021].

[145] See more extensively on these obligations: A de Streel and M Ledger, ‘Regulating the moderation of illegal online content’ in: J Barata ea, ‘Unravelling the Digital Services Act package’ (IRIS Special 2021-1, European Audiovisual Observatory 2021) 26.

[146] Commission, ‘Proposal for a Regulation laying down rules to prevent and combat child sexual abuse’ COM(2022) 209 final (Explanatory Memorandum) 2-3. It builds on the DSA-framework by setting out more specific requirements on detection (Article 7 and 10) and removal (Article 14) of this specific type of illegal activity.

[147] Article 5(2) Terrorist Regulation gives examples of these measures. See also: Rojszcak (n 26) 14.

[148] J Barata, ‘Terrorist content online and threats to freedom of expression. From legal restrictions to choreographed content moderation’ 14 March 2022, https://verfassungsblog.de/os4-content-threats/ accessed 3 February 2023.

[149] Article 5(3) Terrorist Regulation.

[150] Terrorist Regulation, recital 5: “while taking into account the fundamental importance of the freedom of expression, including the freedom to receive and impart information and ideas in an open and democratic society”.

[151] Quintais and Schwemer (n 7) 204; Peukert ea (n 18); E Rosati, ‘The Digital Services Act and copyright enforcement: The case of Article 17 of the DSM Directive’ in: J Barata ea, ‘Unravelling the Digital Services Act package’ (IRIS Special 2021-1, European Audiovisual Observatory 2021) 67. See also recital 11 DSA: the Copyright Directive and the DSM-directive establish specific rules to the DSA and should remain unaffected.

[152] E Rosati, ‘The Digital Services Act and copyright enforcement: The case of Article 17 of the DSM Directive’ (n 151) 71.

[153] Rauchegger and Kuczerawy (n 80) 1523.

[154] I align with the definition used by the CJEU in Case C-401/19 Poland v. Parliament and Council [2022] ECLI:EU:C:2022:297, para 53. Other definitions are: “obligations to proactively monitor content” (Keller (n 90) 616); the A-G in its opinion to Case C-401/19 Poland v. Parliament and Council [2022], Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613, refers to “preventive measures” and “prior restraints”, para 77.

[155] See also recital 66 DSM-directive and recital 30 DSA. These recitals contain further information that seem to weaken the prohibition on general monitoring obligations to some extent.

[156] This is supported by the Commission’s Guidance, 22. The Commission has expressed the same for its proposed Regulation on Child Sexual Abuse: it wishes these requirements to “comply with the underlying requirement of fairly balancing the various conflicting fundamental rights at stake that underlies [the prohibition on general obligations to monitor]”: Commission, ‘Proposal for a Regulation laying down rules to prevent and combat child sexual abuse’ COM(2022) 209 final (Explanatory Memorandum) 5.

[157] See for example Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 113.

[158] See also the figure in Section C.IV.

[159] See the discussion of these technologies in the opinion of the AG to Case C-401/19 Poland v. Parliament and Council [2022], Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613, paras 57ff.

[160] Frosio and Geiger (n 48) 13; see on the relation between liability and content moderation also: S Kulk and T Snijders, ‘Casestudy Content Moderation door online platformen’ in: S Kulk and S van Deursen, Juridische aspecten van algoritmen die besluiten nemen. Een verkennend onderzoek, (WODC 2020) 49.

[161] Keller (n 90) 618; R Gorwa, R Binns and C Katzenbach, ‘Algorithmic content moderation: Technical and political challenges in the automation of platform governance’ [2020] Big Data & Society 1, 2; J van Hoboken ea, ‘Hosting intermediary services and illegal content online: An analysis of the scope of article 14 ECD in light of developments in the online service landscape’ (Study for the European Commission by IViR 2019) 26.

[162] Commission, ‘Impact Assessment Report Annexes accompanying the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (Staff Working Document) SWD(2020) 248 final Part 2/2, 158 (Annex 9): “Usually, online platforms are well-placed to proactively reduce the amount of illegal content stored by them. Measures range from various filtering technologies (…)”; Commission, ‘Tackling Illegal Content Online. Towards an enhanced responsibility for online platforms’(Communication)’ COM (2017) 555 final.

[163] Frosio and Geiger (n 48) 40-41.

[164] Van Hoboken ea (n 161) 46; Frosio and Geiger (n 48) 41; Keller (n 90) 618.

[165] EUIPO, ‘Automated Content Recognition: Discussion Paper – Phase 1 ‘Existing technologies and their impact on IP’ (Discussion Paper 2020).

[166] See amongst others: EUIPO, ‘Automated Content Recognition: Discussion Paper – Phase 1 ‘Existing technologies and their impact on IP’ (Discussion Paper 2020); Commission, ‘Impact Assessment Report Annexes accompanying the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (Staff Working Document) SWD(2020) 248 final Part 2/2 (Annex 11); Gorwa, Binns and Katzenbach (n 161) 3; E Engstrom and N Feamster, ‘The Limits of Filtering: A Look at the Functionality & Shortcomings of Content Detection Tools’ (Engine 2017) < https://www.engine.is/the-limits-of-filtering > accessed 5 February 2023; EUIPO, ‘Automated Content Recognition. Discussion Paper – Phase 2 ‘IP Enforcement and management use cases’ (Discussion Paper 2022) < https://euipo.europa.eu/tunnel-web/secure/webdav/guest/document_library/observatory/documents/reports/2022_Automated_Content_Recognition_Phase_2_Discussion_Paper/2022_Automated_Content_Recognition_Phase_2_Discussion_Paper_FullR_en.pdf- > accessed 3 February 2023. See further: JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 259ff; M Senftleben, ‘Institutionalized Algorithmic Enforcement—The Pros And Cons Of The Eu Approach To UGC Platform Liability’ (2020) 14 FIU Law Review 299; Engstrom and Feamster (n 166); see also Kulk and Snijders (n 160) 49ff.

[167] Commission, ‘Impact Assessment Report Annexes accompanying the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (Staff Working Document) SWD(2020) 248 final Part 2/2 (Annex 11): matching tools (hashing) and classification tools (machine learning); Gorwa, Binns and Katzenbach (n 161) 3: systems that aim to match content and systems that aim to classify or predict content as belonging to one of several categories; EUIPO, ‘Automated Content Recognition: Discussion Paper – Phase 1 ‘Existing technologies and their impact on IP’ (Discussion Paper 2020): hashing, watermarking and fingerprinting (matching technologies) and AI-based content recognition (predicting technology).

[168] Gorwa, Binns and Katzenbach (n 161) 3.

[169] Gorwa, Binns and Katzenbach (n 161) 3. Online content-sharing platforms increasingly use these machine learning algorithms to detect illegal content, such as hate speech: Kulk and Snijders (n 160) 55.

[170] Gorwa, Binns and Katzenbach (n 161) 3.

[171] Google, ‘Hoe werkt Content ID?’(video), < https://support.google.com/youtube/answer/2797370> accessed 5 February 2023.

[172] Engstrom and Feamster (n 166) 12-14.

[173] Gorwa, Binns and Katzenbach (n 161) 7.

[174] Engstrom and Feamster (n 166) 14; EUIPO, ‘Automated Content Recognition: Discussion Paper – Phase 1 ‘Existing technologies and their impact on IP’ (Discussion Paper 2020) 15ff.

[175] Gorwa, Binns and Katzenbach (n 161); Engstrom and Feamster (n 166) 14.

[176] Engstrom and Feamster (n 166) 12.

[177] Engstrom and Feamster (n 166) 12-14; EUIPO, ‘Automated Content Recognition: Discussion Paper – Phase 1 ‘Existing technologies and their impact on IP’ (Discussion Paper 2020) 7.

[178] For more extensive discussion of the technicalities of these technologies and their limitations, for example: Engstrom and Feamster (n 166) 17ff; Gorwa, Binns and Katzenbach (n 161).

[179] Engstrom and Feamster (n 166) 18.

[180] Kulk and Snijders (n 160) 49; T Gröndahl ea, All You Need is “Love”: Evading Hate Speech Detection (Proceedings of the 11th ACM Workshop on Artificial Intelligence and Security), 2018, arXiv:1808.09115.

[181] Engstrom and Feamster (n 166) 18; Gorwa, Binns and Katzenbach (n 161) 8.

[182] See on the problem of “less sophisticated notice senders”: JM Urban, J Karaganis and BL Schofield, ‘Notice and Takedown in Everyday Practice’ (UC Berkeley Public Law Research Paper No 2755628) < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2755628> accessed 5 February 2023, 116 (Study 3).

[183] Commission, ‘Impact Assessment Report Annexes accompanying the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (Staff Working Document) SWD(2020) 248 final Part 2/2 (Annex 11), 193.

[184] See e.g. J-P Mochon ea, ‘Content Recognition Tools on Digital Sharing Platforms: proposals for the implementation of article 17 of the EU Copyright Directive’ (CSPLA’s Mission Report 2020) 24.

[185] For empirical evidence of liability risk leading to an incentive to over-block: Urban, Karaganis and Schofield (n 182) 42-44; S Bar-Ziv and N Elkin-Koren, ‘Behind the Scenes of Online Copyright Enforcement: Empirical Evidence on Notice & Takedown’ (2018) 50 CONN L REV 339, 377.

[186] M Senftleben, ‘Bermuda Triangle – Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market’ (2019) < https://ssrn.com/abstract=3367219> accessed 5 February 2023, 8; Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 3.

[187] Urban, Karaganis and Schofield (n 182) 42-44; Bar-Ziv and Elkin-Koren (n 185) 377.

[188] Senftleben in the context of copyright content moderation and Article 17 DSM: Senftleben, ‘Bermuda Triangle – Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market’ (n 186) 8.

[189] For the relation between higher liability risks and higher costs for advanced technologies: M Senftleben, ‘Bermuda Triangle – Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market’ (n 186) 8; Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 3; Kulk and Snijders (n 160) 58. For the risk of over-blocking: Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (n 6) 3; also acknowledged by the AG in Case C-401/19 Poland v Parliament and Council [2022], Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613, paras 142 and 146; Senftleben, ‘Institutionalized Algorithmic Enforcement—The Pros And Cons Of The Eu Approach To Ugc Platform Liability’ (n 166) 312. For empirical evidence of liability risk leading to an incentive to over-block: Urban, Karaganis and Schofield (n 182) 42-44; Bar-Ziv and Elkin-Koren (n 185) 377. See for the risk of over-blocking in relation to terrorist content: J Barata, ‘Terrorist content online and threats to freedom of expression. From legal restrictions to choreographed content moderation’ 14 March 2022, https://verfassungsblog.de/os4-content-threats/ accessed 3 February 2023.

[190] See for example: Reda, Selinger and Servatius (n 123) 4 and 13ff.

[191] I defined over-blocking in Section A.I. Over-blocking means that not only illegal, but also legal content is blocked.

[192] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 23-24.

[193] BJ Jütte, ‘Poland’s challenge to Article 17 CDSM Directive fails before the CJEU, but Member States must implement fundamental rights safeguards’ (2022) 17 JIPLP 693; C Geiger and BJ Jütte, ‘Constitutional Safeguards in the “Freedom of Expression Triangle” – Online Content Moderation and User Rights after the CJEU’s judgment on Article 17 Copyright DSM-Directive’ Kluwer Copyright Blog 6 June 2022 http://copyrightblog.kluweriplaw.com/2022/06/06/constitutional-safeguards-in-the-freedom-of-expression-triangle-online-content-moderation-and-user-rights-after-the-cjeus-judgement-on-article-17-copyright-dsm-directive/ accessed 5 February 2023; G Frosio, ‘Freedom to Share’ (2022) 53 IIC 1145.

[194] JP Quintais ea, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (ReCreating Europe 2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4210278 accessed 31 January 2023, 126.

[195] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 63.

[196] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 21.

[197] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 46.

[198] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 46.

[199] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 53-54; see Case C-401/19 Poland v Parliament and Council [2022], Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613, paras 57-69.

[200] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 55.

[201] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 56.

[202] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 72 read in conjunction with Delfi v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015) paras 121ff.

[203] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 76ff.

[204] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 77-80.

[205] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 80; Case C-314/12 UPC Telekabel Wien [2014] ECLI:EU:C:2014:192, paras 55-56.

[206] Or: ‘genuinely meet objectives of general interest recognised by the Union’ (Article 52(1) Charter), but this is not mentioned by the CJEU, so left outside the scope of this article.

[207] P Craig and G de Búrca, EU Law: Text, Cases, and Materials (Oxford University Press 2020) 431.

[208] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 82: ‘to ensure that intellectual property rights are protected in such a way as to contribute to the achievement of a well-functioning and fair marketplace for copyright (…) copyright protection must necessarily be accompanied to a certain extent, by a limitation on the exercise of the right of users to freedom of expression and information’.

[209] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 83.

[210] JP Quintais, ‘Between filters and fundamental rights. How the Court of Justice saved Article 17 in C-401/19 – Poland v. Parliament’ 16 May 2022, ‘https://verfassungsblog.de/filters-poland/’ accessed 5 February 2023; M Senftleben, ‘The Meaning of “Additional” in the Poland ruling of the Court of Justice: Double Safeguards – Ex Ante Flagging and Ex Post Complaint Systems – are Indispensable’ Kluwer Copyright Blog 1 June 2022, http://copyrightblog.kluweriplaw.com/2022/06/01/the-meaning-of-additional-in-the-poland-ruling-of-the-court-of-justice-double-safeguards-ex-ante-flagging-and-ex-post-complaint-systems-are-indispensable/ accessed 5 February 2023.

[211] The other two arguments concern the specific characteristics of Article 17 DSM-directive as a copyright provision and are therefore less relevant outside copyright. The first is given in paragraph 87 and concerns the protection of copyright exceptions and limitations in Article 17(7) DSM-directive. The second is given in paragraph 96 and concerns Article 17(10), which requires the Commission to organise stakeholder dialogues.

[212] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 85; it refers to its own considerations in Case C-311/18 Facebook Ireland v. Schrems [2020] ECLI:EU:C:2020:559, para 176, which concerned the right to the protection of personal data.

[213] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 85.

[214] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 89.

[215] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 90.

[216] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 90.

[217] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 90; Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821, paras 41-46.

[218] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 91; Joined cases C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, para 116.

[219] JP Quintais, ‘Between filters and fundamental rights. How the Court of Justice saved Article 17 in C-401/19 – Poland v. Parliament’ 16 May 2022, <https://verfassungsblog.de/filters-poland/> accessed 5 February 2023.

[220] M Senftleben, ‘The Meaning of “Additional” in the Poland ruling of the Court of Justice: Double Safeguards – Ex Ante Flagging and Ex Post Complaint Systems – are Indispensable’ Kluwer Copyright Blog 1 June 2022, http://copyrightblog.kluweriplaw.com/2022/06/01/the-meaning-of-additional-in-the-poland-ruling-of-the-court-of-justice-double-safeguards-ex-ante-flagging-and-ex-post-complaint-systems-are-indispensable/ accessed 5 February 2023.

[221] Since this article focuses on content moderation by platforms resulting from regulation, the question whether C-401/19 has implications for voluntary automatic prior filtering by online content-sharing platforms is left outside the scope of this article.

[222] See further Section C.III.

[223] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 60ff; the CJEU demonstrates that in interpreting the measure or rule at issue, preference should be given to an interpretation in accordance with the Charter: para 70: “(…) in accordance with a general principle of interpretation, an EU measure must be interpreted, as far as possible, in such a way as not to affect its validity (…) preference should be given to the interpretation which renders the provision consistent with primary law (…)”.

[224] Section B.III.

[225] C Geiger and BJ Jütte, ‘Constitutional Safeguards in the “Freedom of Expression Triangle” – Online Content Moderation and User Rights after the CJEU’s judgment on Article 17 Copyright DSM-Directive’ Kluwer Copyright Blog 6 June 2022 http://copyrightblog.kluweriplaw.com/2022/06/06/constitutional-safeguards-in-the-freedom-of-expression-triangle-online-content-moderation-and-user-rights-after-the-cjeus-judgement-on-article-17-copyright-dsm-directive/ accessed 5 February 2023.

[226] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 60ff, and explicitly para 67: ‘(…) The need for such safeguards is all the greater where the interference stems from an automated process (…)’.

[227] As follows from Article 52(1) Charter. See Craig and de Búrca (n 207) 431.

[228] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 82.

[229] Article 52(1) Charter. Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 64: “the requirement that any limitation (…) must be provided for by law implies that the act (…) must itself define the scope of the limitation (…)”. Nevertheless, it may be necessary to leave it to the platforms to decide on the specific measures taken: para 75.

[230] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 77ff: “(…) shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright (…)”. It prescribes a “specific result to be achieved”.

[231] In line with Case C-314/12 UPC Telekabel Wien [2014] EU:C:2014:192 paras 55-56.

[232] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 90.

[233] In the literature, some authors have argued in favour of limiting filtering to ‘manifestly infringing content’, but the CJEU does not use these words. See e.g. C Geiger and BJ Jütte, ‘Constitutional Safeguards in the “Freedom of Expression Triangle” – Online Content Moderation and User Rights after the CJEU’s judgment on Article 17 Copyright DSM-Directive’ Kluwer Copyright Blog 6 June 2022 http://copyrightblog.kluweriplaw.com/2022/06/06/constitutional-safeguards-in-the-freedom-of-expression-triangle-online-content-moderation-and-user-rights-after-the-cjeus-judgement-on-article-17-copyright-dsm-directive/ accessed 5 February 2023.

[234] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, para 85.

[235] Case C-401/19 Poland v Parliament and Council [2022] ECLI:EU:C:2022:297, paras 91-94.

License

Any party may pass on this Work by electronic means and make it available for download under the terms and conditions of the Digital Peer Publishing License. The text of the license may be accessed and retrieved at http://www.dipp.nrw.de/lizenzen/dppl/dppl/DPPL_v2_en_06-2004.html.
