
Guardians of the UGC Galaxy – Human Rights Obligations of Online Platforms, Copyright Holders, Member States and the European Commission Under the CDSM Directive and the Digital Services Act

Martin Senftleben

Abstract

With the shift from the traditional safe harbour for hosting to statutory content filtering and licensing obligations in Article 17 of the CDSM Directive, EU copyright law imperils the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, may prove to be mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design may thus conceal human rights violations instead of bringing them to light. The Digital Services Act rests on a similar – equally problematic – approach. Against this backdrop, the analysis addresses the risk of human rights interference, which is exacerbated by the fact that the Court of Justice, in its Poland decision, upheld the regulatory approach underlying Article 17, rather than exposing and discussing the corrosive effect of human rights outsourcing. Luckily, the new rules in the CDSM Directive and the Digital Services Act also contain several safeguards that allow EU Member States and the European Commission to actively take measures against the erosion of human rights.

1. Introduction*

1

User-generated content (“UGC”)  [1] is a core element of many internet platforms. With the opportunity to upload photos, films, music and texts, formerly passive users have become active contributors to (audio-)visual content portals, wikis, online marketplaces, discussion and news fora, social networking sites, virtual worlds and academic paper repositories. Today’s internet users upload a myriad of literary and artistic works every day.  [2] A delicate question arising from this user involvement concerns copyright infringement. UGC may consist of self-created works and public domain material. However, it may also include unauthorized takings of third-party material that enjoys copyright protection. As UGC has become a mass phenomenon and a key factor in the evolution of the modern, participative web,  [3] this problem raises complex issues and requires the reconciliation of fundamental rights ranging from the right to (intellectual) property  [4] to freedom of expression and information, and freedom to conduct a business.  [5] Users, platform providers and copyright holders are central stakeholders.  [6]

2

With the adoption of Article 17 of the Directive on Copyright in the Digital Single Market (“CDSMD” or “CDSM Directive”),  [7] specific EU legislation seeking to regulate the UGC galaxy has become a reality. Article 17 puts an end to the traditional notice-and-takedown system and the corresponding liability privilege for providers of hosting services.  [8] Under Article 17(1) CDSMD, online content-sharing service providers (“OCSSPs”)  [9] are directly liable for infringing user uploads. To avoid liability risks, they must enter into agreements with copyright owners. In practice, this regulatory approach leads to the application of an amalgam of licensing and filtering obligations.  [10] If an OCSSP does not manage to conclude sufficiently broad licensing agreements with rightholders in line with Article 17(1) and (4)(a) CDSMD, Article 17(4)(b) and (c) CDSMD offers the prospect of a reduction of the liability risk in exchange for content filtering. The OCSSP can avoid liability for unauthorized acts of communication to the public or making available to the public when it manages to demonstrate that it:

made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information,...  [11]

3

Although the provision contains neutral terms to describe this scenario, there can be little doubt in which way the “unavailability of specific works and other subject matter” can be achieved: the use of algorithmic filtering tools seems inescapable.  [12]

4

In the legislative process leading to this remarkable climate change in the EU, the human rights impact of the departure from the traditional notice-and-takedown model has not gone unnoticed. The wording of Article 17 CDSMD itself shows that the new legislative design gave rise to concerns about encroachments upon human rights and, in particular, freedom of expression and information. Article 17(10) CDSMD stipulates that, in stakeholder dialogues seeking to identify best practices for the application of content moderation measures, “special account shall be taken, among other things, of the need to balance fundamental rights and of the use of exceptions and limitations.”  [13] After the adoption of the CDSM Directive, the preparation of the Digital Services Act (“DSA”)  [14] offered further opportunities for the EU legislature to refine and stabilize its strategy for safeguarding human rights that may be affected by algorithmic content filtering tools. Article 14 DSA – regulating terms and conditions of intermediary services ranging from mere conduit and caching to hosting services  [15] – reflects central features of the EU strategy. Article 14(1) DSA requires that providers of hosting services – the category covering UGC platforms – inform users about:

any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system.  [16]

5

This information duty already indicates that users are expected to play an active role in the preservation of their freedom of expression and information. Article 14(4) DSA complements this transparency measure with a fundamental rule that goes far beyond sufficiently clear and accessible information in the terms and conditions. Providers of intermediary services, including platforms hosting UGC:

shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions [that they impose in relation to the use of their service in respect of information provided by the recipients of the service], with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.  [17]

6

In other words: in the case of upload and content sharing restrictions following from the employment of content moderation tools, the UGC platform is bound to safeguard the fundamental rights of users, including freedom of expression and information. As a guiding principle, Article 14(4) DSA refers to the principle of proportionality (“proportionate manner”)  [18] that plays a central role in the reconciliation of competing fundamental rights under Article 52(1) of the EU Charter of Fundamental Rights (“Charter” or “CFR”).  [19]

7

At first glance, it makes sense to impose the obligation to safeguard fundamental rights of users on UGC platforms. In UPC Telekabel Wien, the CJEU already laid groundwork for this approach. Discussing website blocking orders, the Court stated that, when an internet service provider was subject to an injunction requiring the blocking of a website whose users notoriously infringed copyright, it had to ensure compliance with the fundamental right of internet users to freedom of information.  [20] More specifically, the measures adopted by the internet service provider had to be strictly targeted, in the sense that they had to serve to bring an end to a third party’s infringement of copyright “but without thereby affecting internet users who are using the provider’s services in order to lawfully access information.”  [21] The Court added that, failing the implementation of a sufficiently targeted blocking mechanism, the provider’s interference in the freedom of information would be unjustified in the light of the objective pursued.  [22] Considering this earlier case law, the task of safeguarding fundamental rights of users is thus neither new nor surprising for internet service providers.

8

The crux of the approach chosen in Article 14(4) DSA, however, clearly comes to the fore when raising the question whether the possibility of imposing obligations to safeguard human rights on internet service providers, such as UGC hosting platforms,  [23] exempts the state power itself from the noble task of ensuring the observance of fundamental rights. Can the legislator legitimately outsource the obligation to safeguard fundamental rights, such as freedom of expression and information, to private parties? And can the legislator – when passing on that responsibility – confidently leave the task of defending the public interest in this sensitive area in the hands of companies belonging to the platform and creative industries? Arguably, an outsourcing strategy, such as the strategy reflected in Article 17(4)(b) and (c) CDSMD and Article 14(1) and (4) DSA, is highly problematic if it is not accompanied by robust and reliable control mechanisms that allow public authorities to verify the effectiveness of the measures taken by the private party concerned (content-sharing platforms in the case of UGC) and the alignment of these measures with the broader public interest (following section 2). Instead of focusing on control by public authorities, however, EU legislation leaves measures against excessive content blocking primarily to users (section 3). The Member State obligation to safeguard quotations, parodies and pastiches etc. in Article 17(7) CDSMD and the audit system established in Article 37 DSA are welcome exceptions to this rule (section 4).

2. Outsourcing of Human Rights Obligations

9

As already indicated, legislation that applies outsourcing strategies refrains from providing concrete solutions for human rights tensions in the law itself. Instead, the legislator imposes the burden on private entities to safeguard human rights that may be affected by the legislative measure at issue, such as the statutory content filtering obligation in Article 17(4)(b) and (c) CDSMD. In the case of UGC, the addressees of this type of outsourcing legislation are online platforms – OCSSPs – that offer users a forum for uploading and sharing their creations. Discussing the increasing tendency to take refuge in human rights outsourcing, Tuomas Mylly has observed that “gradually, intermediaries and other key private entities become more independent regulators.”  [24] He describes central characteristics of this process as follows:

Courts are starting to rely increasingly on private entities to balance and adjust rights on technological domains but seek to secure formal appeal rights for users. Similarly, when legislatures shift decision-making power to intermediaries, they try to maintain some of the safeguards of traditional law and write wish-lists for private regulators. The executive pushes private regulation further to compensate for its policy failures and enters – at the request of the legislature – into regulatory conversations with private regulators to issue “guidance” in the spirit of co-regulation, thus establishing an enduring link to private regulators.  [25]

10

Arguably, Article 17 CDSMD and Article 14 DSA offer prime examples of provisions that outsource human rights obligations to private entities – with the features Mylly describes. As explained above, Article 14(4) DSA places an obligation on intermediaries to apply content moderation systems in “a diligent, objective and proportionate manner.”  [26] In addition to this reference to the principle of proportionality, the provision emphasizes that online platforms are bound to carry out content filtering with due regard to the fundamental rights of users, such as freedom of expression.  [27] With regard to copyright limitations that support freedom of expression,  [28] more specific rules follow from specific copyright legislation. According to Article 17(7) CDSMD, the cooperation between OCSSPs and the creative industry in the area of content moderation  [29] must not result in the blocking of non-infringing UGC, including situations where UGC falls within the scope of a copyright limitation. Confirming Mylly’s prediction that the executive power will enter into regulatory conversations with private entities to establish best practices and guiding principles, Article 17(10) CDSMD adds that the European Commission shall organize stakeholder dialogues to discuss best practices for the content filtering cooperation:

The Commission shall, in consultation with online content-sharing service providers, rightholders, users’ organisations and other relevant stakeholders, and taking into account the results of the stakeholder dialogues, issue guidance on the application of this Article, in particular regarding the [content moderation] cooperation referred to in paragraph 4.  [30]

11

In the quest for best practices, the stakeholder dialogues shall take “special account”  [31] of the need to balance fundamental rights and the use of copyright limitations. As in Article 14(4) DSA, reference is thus made to human rights tensions. The private entities involved – copyright holders and OCSSPs – are expected to resolve these tensions in the light of the guidance evolving from the co-regulatory efforts of the European Commission.

12

Evidently, industry “cooperation” is the kingpin of this outsourcing scheme for human rights obligations. To fully understand risks that may arise from this regulatory approach, it is important to analyse Article 17 CDSMD in more detail. At the core of the obligation to filter UGC – and industry cooperation that is necessary to implement this obligation in practice – lies the grant of a specific exclusive right in Article 17(1) CDSMD that leads to strict, primary liability of OCSSPs for infringing content that is uploaded by users:

Member States shall provide that an online content sharing service provider performs an act of communication to the public or an act of making available to the public when it gives the public access to copyright protected works or other protected subject matter uploaded by its users.  [32]

13

By clarifying that the activities of UGC platform providers amount to communication to the public or making available to the public, the new legislation collapses the traditional distinction between primary liability of users who upload infringing content, and secondary liability of online platforms that encourage or contribute to infringing activities. Under Article 17(1) CDSMD, it no longer matters whether the provider of a UGC platform had knowledge of infringement, encouraged infringing uploads or failed to promptly remove infringing content after receiving a notification. Instead, the platform provider is directly and primarily liable for infringing content that arrives at the platform.

14

In this way, EU legislation incentivizes rights clearance initiatives. To reduce the liability risk, the platform provider will have to obtain a license for UGC uploads. Evidently, this is an enormous task. Even though it is unforeseeable which content users will upload, the license should ideally encompass the whole spectrum of potential posts. While this dimension of the licensing obligation may be good news for users (whose activities would fall within the scope of the license and, therefore, no longer amount to infringement),  [33] it creates a rights clearance task which platform providers can hardly ever accomplish in respect of all conceivable user contributions.  [34]

15

Inevitably, the licensing imperative chosen in Article 17(1) CDSMD culminates in the introduction of filtering tools. As copyright holders and collecting societies are unlikely to offer all-embracing umbrella licenses,  [35] OCSSPs must rely on algorithmic tools to ensure that content uploads do not overstep the limits of the use permissions they managed to obtain.  [36] From the perspective of freedom of expression and information, this amalgam of licensing and filtering is highly problematic.  [37] Outside the licensing deals which UGC platforms have concluded, algorithmic enforcement measures will curtail the freedom of users to participate actively in the creation of online content.

16

The more specific regulation of content moderation in Article 17 CDSMD confirms that the EU legislator has willingly accepted inroads into freedom of expression and information to achieve the goal of subordinating UGC to the control of copyright holders. As explained, the law does not shy away from imposing institutionalized – statutory – content filtering obligations.  [38] In the absence of licensing arrangements, Article 17(4)(b) and (c) CDSMD offers OCSSPs the prospect of a reduction of the liability risk in exchange for content filtering. The fundamental rights tension caused by this regulatory approach is evident. In decisions rendered prior to the adoption of Article 17 CDSMD, the CJEU has stated explicitly that in transposing EU directives and implementing transposing measures:

Member States must […] take care to rely on an interpretation of the directives which allows a fair balance to be struck between the various fundamental rights protected by the Community legal order.  [39]

17

Interestingly, the application of filtering technology to a social media platform hosting UGC already occupied centre stage in Sabam/Netlog. The case concerned Netlog’s social networking platform, which offered every subscriber the opportunity to acquire a globally available “profile” space that could be filled with photos, texts, video clips etc.  [40] Claiming that users made unauthorized use of music and films belonging to its repertoire, the collecting society Sabam sought an injunction obliging Netlog to install a system for filtering the information uploaded to Netlog’s servers. As a preventive measure and at Netlog’s expense, this system would have applied indiscriminately to all users for an unlimited period and would have been capable of identifying electronic files containing music and films from the Sabam repertoire. In case of a match, the system would have prevented the relevant files from being made available to the public.  [41] Given these underlying facts, the Sabam/Netlog case offered the CJEU the chance to provide guidance on a filtering system that has become a standard measure with the adoption of Article 17(4)(b) CDSMD.  [42]

18

However, the CJEU did not arrive at the conclusion that such a filtering system could be deemed permissible. Instead, the Court saw a serious infringement of fundamental rights. It took as a starting point the explicit recognition of intellectual property as a fundamental right in Article 17(2) CFR. At the same time, the Court recognized that intellectual property must be balanced against the protection of other fundamental rights and freedoms.  [43] Weighing the right to intellectual property asserted by Sabam against competing fundamental rights of Netlog’s users, namely their right to the protection of their personal data and their freedom to receive or impart information,  [44] the Court recalled that the use of protected material in online communications may be lawful under statutory limitations of copyright in the Member States, and that some works may have already entered the public domain, or been made available for free by the authors concerned.  [45] Given this corrosive effect on fundamental rights, the Court concluded:

Consequently, it must be held that, in adopting the injunction requiring the hosting service provider to install the contested filtering system, the national court concerned would not be respecting the requirement that a fair balance be struck between the right to intellectual property, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information, on the other (see, by analogy, Scarlet Extended, paragraph 53).  [46]

19

This case law confirms that the filtering obligation arising from Article 17(4)(b) CDSMD is highly problematic. As a way out of the dilemma, the EU legislature walks the fine line of distinguishing between monitoring all UGC in search of a whole repertoire of works,  [47] and monitoring all UGC in search of specific, pre-identified works.  [48] Sabam/Netlog concerned a filtering obligation targeting all types of UGC containing traces of works falling within the Sabam rights portfolio.  [49] The drafters of Article 17(4)(b) CDSMD seem to make an attempt to avoid this prohibited general monitoring obligation (and escape the verdict of a violation of fundamental rights) by establishing the obligation to filter “specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information.”  [50]

20

At this point, the above-described element of industry cooperation enters the picture. The content filtering system established in Article 17 CDSMD relies on a joint effort of the creative industry and the online platform industry. To set the filtering machinery in motion, copyright holders in the creative industry must first notify “relevant and necessary information”  [51] with regard to those works which they want to ban from user uploads. Once relevant and necessary information on protected works is received, the OCSSP is obliged to include that information in the content moderation process and ensure the filtering – “unavailability”  [52] – of content uploads that contain traces of the protected works. It is this cooperation which, according to Article 17(7) CDSMD, must not result in the blocking of UGC that does not infringe copyright, including situations where UGC is covered by a copyright limitation. The same cooperation constitutes the central item on the agenda of the stakeholder dialogues which the Commission must initiate under Article 17(10) CDSMD to identify best practices.

21

The problem of the whole cooperation concept, however, lies in the fact that, unlike public bodies and the judiciary, the central players in the cooperation scheme – the creative industry and the online platform industry – are private entities that are not intrinsically motivated to safeguard the public interest in the exercise and furtherance of fundamental rights and freedoms. Despite all invocations of diligence and proportionality – “high industry standards of professional diligence” in Article 17(4)(b) CDSMD; “diligent, objective and proportionate” application in Article 14(4) DSA – the decision-making in the context of content filtering is most probably much more down to earth: the moment the balancing of competing human rights positions is confidently left to industry cooperation, economic cost and efficiency considerations are likely to occupy centre stage. Arguably, they will often prevail over more abstract societal objectives, such as flourishing freedom of expression and information.

22

A closer look at the different stages of industry cooperation resulting from the regulatory model of Article 17 CDSMD confirms that concerns about human rights deficits are not unfounded. As explained, the first step in the content moderation process is the notification of relevant and necessary information relating to “specific works and other subject matter”  [53] by copyright holders. In the light of case law precedents, in particular Sabam/Netlog,  [54] use of the word “specific” can be understood to reflect the legislator’s hope that copyright holders will only notify individually selected works. For instance, a copyright holder could limit use of the notification system to those works that constitute cornerstones of the current exploitation strategy. The principle of proportionality and high standards of professional diligence also point in the direction of a cautious approach that confines work notifications to those repertoire elements that are “specific” in the sense that they generate the lion’s share of a copyright holder’s revenue.  [55] In line with this approach, other elements of the work catalogue could be kept available for creative remix activities of users. This, in turn, would reduce the risk of overbroad inroads into freedom of expression and information.

23

In practice, however, rightholders are highly unlikely to adopt this cautious approach. The only legal basis for requiring a focus on individually selected works is the legislator’s use of the expression “specific works and other subject matter”  [56] in Article 17(4)(b) CDSMD. Proportionality and diligence considerations only form the broader context in which the specificity requirement is embedded. Strictly speaking, the requirement of “high industry standards of professional diligence” in Article 17(4)(b) CDSMD concerns the subsequent filtering step taken by an OCSSP to ensure the unavailability of notified works – not the primary notification sent by copyright holders.

24

Like the requirement of “high industry standards of professional diligence”,  [57] the imperative of “diligent, objective and proportionate” application in Article 14(4) DSA relates to platform content moderation measures that restrict user freedoms – not the rightholder notification system that sets the filtering process in motion. The success of the risk reduction strategy surrounding the word “specific” in Article 17(4)(b) CDSMD is thus doubtful. In the cooperation with OCSSPs, nothing seems to prevent the creative industry from sending copyright notifications that cover each and every element of long and impressive work catalogues. UGC platforms may thus receive long lists of all works which copyright holders have in their repertoire. Adding up all “specific works and other subject matter” included in these notifications, the conclusion seems inescapable that Article 17(4)(b) CDSMD may culminate in a filtering obligation that is very similar to the filtering measures which the CJEU prohibited in Sabam/Netlog. The risk of encroachments upon human rights is evident.

25

Turning to the second step in the content moderation process – the act of filtering carried out by OCSSPs to prevent the availability of notified works on UGC platforms – it is noteworthy that proportionality and diligence obligations are directly applicable. As explained, the requirements of “high industry standards of professional diligence”  [58] and “diligent, objective and proportionate”  [59] application only form the broader context surrounding the notification of specific works by rightholders. When it comes to the content moderation process as such, however, these diligence and proportionality rules impact the activities of OCSSPs directly: the UGC filtering process must be implemented in a way that complies with these diligence and proportionality requirements.

26

As to the practical outcome of UGC filtering in the light of diligence and proportionality requirements, however, it is to be recalled that OCSSPs will most probably align the concrete implementation of content moderation systems with cost and efficiency considerations. Abstract commandments, such as the instruction to act in accordance with “high standards of professional diligence”  [60] and in a “proportionate manner in applying and enforcing [UGC upload] restrictions”,  [61] can hardly be deemed capable of superseding concrete commercial cost and efficiency necessities. Tuomas Mylly accurately characterizes litanies of diligence and proportionality requirements as “wish-lists for private regulators.”  [62] In essence, the legislator whitewashes statutory content filtering obligations by adding a diligence and proportionality gloss to reassure itself that the drastic measure will be implemented with sufficient care and caution to avoid the erosion of human rights. The success of this ingredient of the outsourcing recipe is doubtful. In reality, the subordination of industry decisions to diligence and proportionality imperatives – the acceptance of higher costs and lower profits to reduce the corrosive effect on freedom of expression and information – would come as a surprise. Instead, OCSSPs can be expected to be rational in the sense that they seek to achieve content filtering at minimal cost.

27

Hence, there is no guarantee that industry cooperation in the field of UGC will lead to the adoption of the most sophisticated filtering systems with the highest potential to avoid unjustified removals of content mash-ups and remixes. A test of proportionality is unlikely to occupy centre stage unless the least intrusive measure also constitutes the least costly measure. A test of professional diligence is unlikely to lead to the adoption of a more costly and less intrusive content moderation system unless additional revenues accruing from enhanced popularity among users offset the extra investment.

28

In addition, EU legislation itself sends mixed signals. Article 17(5) CDSMD provides guidelines for the assessment of the proportionality of filtering obligations. The relevant factors listed in the provision, however, focus on “the type, the audience and the size of the service,” “the type of works or other subject matter” and “the availability of suitable and effective means and their cost for service providers.”  [63] Hence, cost and efficiency factors have made their way into the proportionality assessment scheme. Paradoxically, it is conceivable that these factors encourage the adoption of cheap and unsophisticated filtering tools that lead to excessive content blocking.

29

An assessment of liability questions also confirms that excessive filtering risks must be taken seriously. A UGC platform seeking to minimize the risk of liability is likely to succumb to the temptation of overblocking.  [64] Filtering more than necessary is less risky than filtering only clear-cut cases of infringement. After all, the primary, direct liability for infringing user uploads following from Article 17(1) CDSMD hangs over OCSSPs like the sword of Damocles. The second step of the industry cooperation concept underlying Article 17 CDSMD is thus at least as problematic as comprehensive notifications of entire work catalogues. The OCSSP obligation to embark on content filtering to police the borders of use permissions and prevent content availability in the absence of licenses raises serious concerns about interferences with human rights, in particular freedom of expression and information.

30

Surveying the described human rights risks that arise from the industry cooperation scheme in Article 17 CDSMD, the conclusion is inescapable that, despite all invocations of diligence and proportionality as mitigating factors, the outsourcing strategy underlying the EU regulation of content moderation in the CDSM Directive and the DSA is highly problematic. Instead of safeguarding human rights, the regulatory approach is likely to culminate in human rights violations. Against this background, it is of particular importance to analyse mechanisms that could bring human rights deficits to light and remedy shortcomings.

3. Concealing Human Rights Deficits Caused by Reliance on Industry Cooperation

31

The question of mechanisms that allow the detection and correction of human rights deficits in content moderation leads back to the information duty laid down in Article 14(1) DSA.  [65] Under this provision, UGC platforms are obliged to make information on content moderation “policies, procedures, measures and tools”  [66] available to users. This must be done in “clear, plain, intelligible, user-friendly and unambiguous language.”  [67] Moreover, the information must be publicly available in an easily accessible and machine-readable format.  [68] These information and transparency obligations can be regarded as exponents of a broader human rights preservation strategy.  [69] The broader pattern comes to the fore when the information flow generated in Article 14(1) DSA is placed in the context of the complaint and redress mechanism for unjustified content filtering that forms a building block of Article 17 CDSMD. Article 17(9) CDSMD requires that OCSSPs put in place:

an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.  [70]

32

With regard to the role of users in the human rights arena, the complementary character  [71] of Article 17(9) CDSMD and Article 14(1) DSA yields important insights: the legislator confidently leaves the identification and correction of excessive content blocking to users. A relatively low number of user complaints, however, may be misinterpreted as an indication that content filtering hardly ever encroaches upon freedom of expression and information, even though limited user activism may be due to overly slow and cumbersome procedures (following section 3.1). Instead of addressing this problematic concealment mechanism, the CJEU has confirmed the validity of the content moderation rules in Article 17 CDSMD. In this context, the Court has qualified elements of the problematic outsourcing and concealment strategy as valid safeguards against the erosion of freedom of expression and information. Instead of uncovering human rights risks, the Court, thus, preferred to condone and stabilize the system (section 3.2). Under these circumstances, only legislative countermeasures taken by EU Member States (section 3.3) and content moderation assessments in audit reports for the European Commission (section 3.4) give some hope that violations of human rights may finally be prevented despite the corrosive outsourcing and concealment scheme underlying the regulation of content moderation in the EU.

3.1. Reliance on User Complaints as Part of a Concealment Strategy

33

As explained, Article 17(9) CDSMD and Article 14(1) DSA both make users the primary addressees of information about content moderation systems and potential countermeasures. Article 17(9) CDSMD stipulates that OCSSPs shall inform their users “in their terms and conditions that they can use works and other subject matter under exceptions or limitations to copyright and related rights provided for in Union law.”  [72] In addition to this specific rule dealing with copyright limitations, Article 14(1) DSA applies: users shall receive information on upload and content sharing restrictions arising from the employment of content moderation tools.  [73] If they want to take measures against content restrictions, Article 17(9) CDSMD ensures that complaint and redress mechanisms are available to users of OCSSP services “in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.”  [74]

34

Again, this regulatory model is not new. In UPC Telekabel Wien, the CJEU sought to ensure that, in the case of website blocking measures, the national courts in EU Member States would be able to carry out a judicial review. This, however, was only conceivable if a challenge was brought against the blocking measure implemented by an internet service provider:

Accordingly, in order to prevent the fundamental rights recognised by EU law from precluding the adoption of an injunction such as that at issue in the main proceedings, the national procedural rules must provide a possibility for internet users to assert their rights before the court once the implementing measures taken by the internet service provider are known.  [75]

35

Therefore, the rights assertion option for users served the ultimate purpose of paving the way for judicial review. In Article 17(9) CDSMD, this pattern reappears. Users can avail themselves of the option to instigate complaint and redress procedures at platform level and, ultimately, go to court. The DSA also contains specific user complaint and redress rights. Complementing Article 17(9) CDSMD,  [76] Article 20 DSA sets forth detailed rules for internal complaint handling on UGC platforms. Article 54 DSA confirms that users are entitled to compensation for any damage or loss they suffered due to an infringement of DSA obligations. As pointed out above, one of these obligations follows from Article 14(4) DSA. This provision obliges UGC platforms to apply content moderation measures in a proportionate manner – with due regard to freedom of expression and information. In addition, Article 86(1) DSA affords users the opportunity to mandate a non-profit body, organization or association to exercise their complaint, redress and compensation rights on their behalf. According to their statutes, these non-profit institutions must have a legitimate interest in safeguarding DSA rights and obligations.

36

However, the broad reliance placed on user activism – ranging from complaints to damage claims and work with non-profit bodies – is surprising. Evidence from the application of the DMCA counter-notice system in the U.S. shows quite clearly that users are unlikely to file complaints in the first place.  [77] Data from recent transparency reports covering the largest UGC platforms confirm the assumption of user inactivism.  [78] If users have to wait relatively long for a final result, it is foreseeable that a complaint and redress mechanism that depends on user initiatives is incapable of safeguarding freedom of expression and information. Moreover, an overly cumbersome complaint and redress mechanism may thwart user initiatives from the outset. The hope that users will bring damage claims and collaborate with non-profit institutions to assert their rights, thus, finds little support in the real world. While it cannot be ruled out that some users will exhaust the full arsenal of complaint, redress and compensation options, it seems unrealistic to assume that user complaint mechanisms have the potential of revealing the full spectrum and impact of free expression restrictions that result from automated content moderation systems.

37

In the context of UGC, it must also be considered that it is often crucial to react quickly to current news and film, book and music releases. If the complaint and redress mechanism finally yields the insight that a lawful content remix or mash-up has been blocked, the decisive moment for the affected quotation or parody may already have passed.  [79] From this perspective, the elastic timeframe for complaint handling – “shall be processed without undue delay”  [80] – also gives rise to concerns. This standard differs markedly from an obligation to let blocked content reappear promptly. As Article 17(9) CDSMD also requires human review, it may take quite a while until a decision on the infringing nature of content is taken. Considering these features, the complaint and redress option may appear unattractive to users.  [81]

38

Instead of dispelling concerns about human rights deficits, the reliance on user complaints, thus, constitutes a further risk factor. Apart from being ineffective as a remedy for human rights violations, the complaint and redress mechanism in Article 17(9) CDSMD may allow authorities to hide behind a lack of user activism. It may be that users refrain from complaining because they consider the mechanism too cumbersome and/or too slow. However, when taking the number of user complaints as a yardstick for assessing human rights risks, a relatively low number of user complaints may be misinterpreted as evidence that content moderation does not lead to excessive content blocking. As long as users refrain from taking action, human rights deficits stay under the radar. The oversimplified equation “no user complaint = no human rights problem” offers the opportunity of praising an overly restrictive content moderation system as a success. Instead of shedding light on human rights deficits, the complaint and redress mechanism can be used strategically to disguise encroachments upon freedom of expression and information.

39

The outsourcing problem described in the preceding section (inappropriate reliance on OCSSPs and copyright holders as human rights guardians) is thus aggravated by heavy reliance on complaint and redress mechanisms which users are unlikely to embrace. Leaving measures against the erosion of freedom of expression and information to users, the legislator cultivates a culture of concealing human rights deficits. Reliance on user complaints as indicators of human rights violations is simply inadequate. Even if users lodge a complaint, any redress, moreover, remains an ex post measure: a remedy that reinstates freedom of expression and information only after initial harm – in the form of unjustified UGC impoverishment – has occurred. The EU approach is thus wanting for at least two reasons: the outsourcing of human rights obligations to private entities and the expectation that users will take countermeasures against human rights violations.

3.2. Confirmation of the Outsourcing and Concealment Strategy in CJEU Jurisprudence

40

This outcome of the risk assessment raises the additional question whether other institutions in the platform governance arena could fulfil the role of human rights guardians more reliably. The judiciary seems a logical candidate. Interestingly, the CJEU already had the opportunity to discuss violations of freedom of expression and information that may arise from content moderation under Article 17(4)(b) and (c) CDSMD. In Poland/Parliament and Council, the Republic of Poland brought an annulment action against the content filtering branch of Article 17 CDSMD.  [82] More specifically, Poland argued that OCSSPs were bound under Article 17(4)(b) and (c) CDSMD to carry out preventive – ex ante – monitoring of all user uploads. To fulfil this Herculean task, they had to employ automatic filtering tools. In Poland’s view, EU legislation imposed this preventive monitoring obligation on OCSSPs “without providing safeguards to ensure that the right to freedom of expression and information is respected.”  [83] The contested provisions, thus, constituted a limitation on the exercise of the fundamental right to freedom of expression and information, which respected neither the essence of that right nor the principle of proportionality. Hence, the filtering obligations arising from Article 17(4)(b) and (c) CDSMD could not be regarded as justified under Article 52(1) CFR.  [84]

41

Discussing these annulment arguments, the CJEU pointed out that prior review and filtering of user uploads, indeed, created the risk of limiting a central avenue for the online dissemination of UGC. The filtering regime in Article 17(4)(b) and (c) CDSMD imposed a restriction on the ability of users to exercise their right to freedom of expression and information which was guaranteed by Article 11 CFR and Article 10 of the European Convention on Human Rights (“ECHR”).  [85] However, the Court considered that such a limitation met the requirements set forth in Article 52(1) CFR – mandating that any limitation on the exercise of the right to freedom of expression and information had to be legally established and had to preserve the essence of those freedoms.  [86] The Court was satisfied that the limitation arising from the filtering obligations in Article 17(4)(b) and (c) CDSMD could be deemed justified in the light of the legitimate objective to ensure a high level of copyright protection to safeguard the right to intellectual property enshrined in Article 17(2) CFR.  [87]

42

More specifically, the Court identified no fewer than six freedom of expression safeguards in the regulatory design of Article 17 CDSMD – safeguards which, in the Court’s view, gave sufficient reassurance that freedom of expression and information would not be unduly curtailed. A key aspect in this assessment of Article 17 CDSMD is the first point. The Court assumed that the introduction of automated content filtering tools would not prevent users from uploading lawful content, including UGC containing traces of protected third-party material that was permissible under statutory exceptions to copyright.  [88] In this context, the Court recalled its earlier ruling in Sabam/Netlog from which it followed that:

a filtering system which might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications, would be incompatible with the right to freedom of expression and information, guaranteed in Article 11 of the Charter, and would not respect the fair balance between that right and the right to intellectual property.  [89]

43

Hence, the Court was confident that, in the light of its case law, OCSSPs would refrain from introducing content filtering measures unless these systems could reliably distinguish between lawful parody and infringing piracy – unless they were capable of leaving all kinds of lawful uploads unaffected.  [90]

44

The second point made by the Court addresses statutory exceptions to copyright more directly. In line with earlier decisions, the CJEU confirmed that copyright limitations supporting freedom of expression, such as the right of quotation and the exemption of parody, constituted “user rights.”  [91] To avoid the dismantling of these free expression strongholds, EU Member States had to ensure that automated filtering measures did not deprive users of their freedom to upload content created for the purposes of quotation, criticism, review, caricature, parody, or pastiche.  [92] On this point, the judgment endorsed, by reference, the Advocate General’s Opinion, which stated that filters “must not have the objective or the effect of preventing such legitimate uses,” and that providers must “consider the collateral effect of the filtering measures they implement” as well as “take into account, ex ante, respect for users’ rights.”  [93]

45

As a third aspect that mitigated the corrosive effect of Article 17(4)(b) and (c) CDSMD on freedom of expression and information, the Court pointed out that the filtering machinery was only set in motion on condition that rightholders provided OCSSPs with the “relevant and necessary information”  [94] concerning protected works that should not become available on the UGC platform. In the absence of such information, OCSSPs would not be led to make content unavailable.  [95] The fourth point highlighted by the Court was the clarification in Article 17(8) CDSMD that no general monitoring obligation was intended.  [96] The fifth point was the complaint and redress mechanism allowing users to bring unjustified content blocking to the attention of the platform provider.  [97] Finally, the Court recalled that Article 17(10) CDSMD tasked the European Commission with organizing stakeholder dialogues to ensure a uniform mode of OCSSP/rightholder cooperation across Member States and establish best filtering practices in the light of industry standards of professional diligence.  [98]

46

Qualifying all six aspects as valid safeguards against an erosion of freedom of expression and information, the Court concluded that the regulatory design of Article 17 CDSMD included appropriate countermeasures to survive Poland’s annulment action.  [99] Still, the Court cautioned EU Member States, as well as their authorities and courts, that, when transposing and applying Article 17 CDSMD, they had to follow a fundamental rights-compliant path.  [100]

47

Undoubtedly, the Poland decision is a milestone that contains several important clarifications. With regard to the above-described human rights risks arising from the outsourcing and concealment strategy underlying Article 17 CDSMD, however, it is disappointing. A critical assessment of the regulatory scheme is missing. The Court did not seize the opportunity to unmask human rights risks that, as explained in the preceding sections, are inherent in the heavy reliance on industry cooperation. The Court also refrained from reflecting on human rights risks that could arise from the ineffectiveness of complaint and redress mechanisms for users. Instead of exposing the outsourcing and concealment strategy and addressing human rights deficits, the Court rubberstamped not only the broader regulatory design but also its individual elements. Singling out no fewer than six aspects of Article 17 CDSMD and declaring them valid safeguards against violations of freedom of expression and information, the Court readily accepted the very ingredients of the Article 17 recipe that create the outsourcing and concealment risks discussed above. In the Poland ruling, the Court went far beyond condoning the approach chosen in Article 17 CDSMD. The CJEU expressly confirmed its validity – and the positive, mitigating effect of all its elements.

48

This central problem of uncritical rubberstamping in the Poland decision clearly comes to the fore when the six free expression safeguards are re-evaluated in the light of the above-described outsourcing and concealment risks. With regard to the necessity of distinguishing between lawful and unlawful content uploads (first point highlighted by the Court),  [101] a reality check is sought in vain in the judgment. From a legal-theoretical perspective, the CJEU assumption – namely that filtering systems must not be applied as long as they cannot reliably distinguish permitted parody from infringing piracy – may well be correct. The lack of incentives to refrain from the employment of such overblocking systems in practice, however, does not enter the picture. The Court does not even mention that, instead of discouraging the use of unsophisticated filtering machines, Article 17(1) CDSMD, quite clearly, gives a very strong impulse to implement automated filtering systems regardless of their capacity to distinguish between lawful and unlawful content. As pointed out above, the risk of direct liability for infringing UGC uploads hangs over OCSSPs like the sword of Damocles. Overblocking allows OCSSPs to avert this risk, escape direct liability under Article 17(1) CDSMD and avoid lengthy and costly lawsuits. Adopting an excessive filtering approach, they only have to deal with user complaints, which are unlikely to come in large numbers. Practically speaking, the implementation of an underblocking approach to safeguard freedom of expression and information is thus unlikely. In its imaginary and pure universe of legal-theoretical assumptions, the Court may assume that content filtering will only occur when automated systems are capable of separating the wheat from the chaff. To whitewash the Article 17 approach on the basis of such unrealistic assumptions, however, creates a human rights risk of its own.

49

The same can be said about the inclusion of rightholder notifications in the list of effective free expression safeguards (third point made by the Court).  [102] As pointed out above, nothing in Article 17 CDSMD prevents copyright owners from notifying long lists – entire catalogues – of protected works. Adding up all repertoire notifications arriving at OCSSPs, it seems naïve to assume that the notification mechanism laid down in Article 17(4)(b) CDSMD will never lead to a filtering volume that is comparable to the general filtering obligation which the Court prohibited in Sabam/Netlog.  [103] From this perspective, the ban on general filtering obligations in Article 17(8) CDSMD (fourth safeguard identified by the Court)  [104] can also be unmasked as mere cosmetics. The fifth safeguard which the Court accepted  [105] is the complaint and redress mechanism that causes the corrosive concealment risk described above. The sixth and final safeguard – stakeholder dialogues seeking to establish best practices  [106] – has also been analysed above. It is a toothless tiger. Article 17(10) CDSMD is silent on measures which the Commission could take to enforce the best practices guidelines following from meetings with stakeholders. It remains unclear why the Court is willing to accept this type of fig-leaf measure as a valid free expression safeguard.

50

On balance, the Court has not only missed an important opportunity to reveal and address human rights risks that arise from the outsourcing and concealment strategy underlying Article 17 CDSMD. By choosing the most favourable interpretation of Article 17 features as a reference point for its assessment of human rights risks, and by refusing to consider the practical reality of industry cooperation and the practical impact of the overblocking incentive resulting from the risk of direct liability for infringing UGC, the Court has also made itself an accomplice in the outsourcing and concealment strategy that puts freedom of expression and information at risk.

3.3. Member State Legislation Seeking to Safeguard Transformative UGC

51

The foregoing critique of the six free expression safeguards which the CJEU identified in its Poland decision did not address the second point made by the Court: the obligation placed on EU Member States to ensure that transformative UGC – consisting of quotations, parodies, pastiches etc. – survives the implementation of automated content filtering systems.  [107] The reason for this omission is simple: in contrast to other Article 17 aspects, this element appears as a valid safety valve that could effectively safeguard freedom of expression and information in practice. This insight does not change the critical assessment of the Poland judgment. With regard to outsourcing and concealment risks, the decision remains a missed opportunity to address and minimize human rights risks.

52

As to the valid second point in the Poland phalanx of free expression safeguards – the need to preserve copyright limitations for creative remix activities, in particular use for the purposes of “quotation, criticism and review,” and “caricature, parody and pastiche”  [108] – Article 17(7) CDSMD plays a central role. The provision leaves no doubt that EU Member States are expected to ensure that automated content filtering does not submerge areas of freedom that support the creation and dissemination of transformative user productions that are uploaded to UGC platforms. The second paragraph of Article 17(7) reads as follows:

Member States shall ensure that users in each Member State are able to rely on any of the following existing exceptions or limitations when uploading and making available content generated by users on online content-sharing services:

(a) quotation, criticism, review;

(b) use for the purpose of caricature, parody or pastiche.  [109]

53

Use of the formulations “shall not result in the prevention” and “shall ensure that users […] are able” gives the copyright limitations for “quotation, criticism, review” and “caricature, parody or pastiche” an elevated status. In Article 5(3)(d) and (k) of the Information Society Directive 2001/29/EC (“ISD”),  [110] these use privileges were only listed as limitation prototypes which EU Member States are free to introduce (or maintain) at the national level. The adoption of a quotation right  [111] and an exemption of caricature, parody or pastiche  [112] remained optional. Article 17(7) CDSMD, however, transforms these use privileges into mandatory breathing space for transformative UGC – at least in the specific context of OCSSP content moderation.  [113] This metamorphosis makes copyright limitations in this category particularly robust: they “shall” survive the application of automated filtering tools.

54

Under Article 17(7) CDSMD, EU Member States are the guardians of these user rights.  [114] This regulatory decision comes as a welcome surprise. In contrast to the prevailing preference for solutions based on outsourcing (passing on human rights responsibilities to private entities) and concealment (relying on user complaints to remedy human rights deficits), Article 17(7) CDSMD entrusts the Member States – the state power itself – with the important task of guaranteeing (“shall ensure”) that, despite content filtering on OCSSP platforms, users can share creations made for the purposes of “quotation, criticism, review” and “caricature, parody or pastiche.” In this regard, the Poland decision adds an important nuance. In its discussion of safeguards against an erosion of freedom of expression and information, the CJEU qualified the complaint and redress mechanisms mandated by Article 17(9) CDSMD as additional safeguards against content overblocking:

the first and second subparagraphs of Article 17(9) of Directive 2019/790 introduce several procedural safeguards, which are additional to those provided for in Article 17(7) and (8) of that directive, and which protect the right to freedom of expression and information of users of online content-sharing services in cases where, notwithstanding the safeguards laid down in those latter provisions, the providers of those services nonetheless erroneously or unjustifiably block lawful content.  [115]

55

Hence, user complaint mechanisms evolving from Article 17(9) CDSMD only constitute additional ex post measures. As they allow corrections of wrong filtering decisions only after the harm has occurred, they can hardly be considered sufficient. First and foremost, it is necessary to have ex ante mechanisms in place that allow permissible content uploads – quotations, parodies, pastiches etc. – to survive automated content scrutiny. This is an important guideline for EU Member States. When implementing Article 17 CDSMD, they must ensure that UGC containing quotations, criticism, review, caricatures, parodies or pastiches  [116] appears directly on the platform.

56

In practice, this goal can be achieved by introducing mandatory flagging options for users. To ensure ex ante content availability – without exposure to content filtering tools – domestic legislation in EU Member States can enable users to mark quotations, parodies, pastiches etc. as permissible content uploads and oblige OCSSPs to make these uploads directly available on the UGC platform. An example of national legislation following this approach can be found in Germany.  [117] Alarmingly, however, the central importance of the state responsibility arising from Article 17(7) CDSMD seems to have escaped the attention of many other EU Member States. The German implementation model has not become widespread. Instead, the majority of Member States opted for a national transposition that does not offer users specific legal tools, such as statutory flagging options, to benefit from the exemption of quotations, parodies, pastiches etc.  [118] The Netherlands, for instance, gave preference to a literal implementation of Article 17 CDSMD. Effective ex ante mechanisms – capable of placing quotations, parodies, pastiches etc. beyond the reach of content filtering systems from the outset – are sought in vain. Instead, the Dutch legislator places reliance on complaint and redress mechanisms even though this legal instrument only allows users to take measures ex post: after quotations, parodies, pastiches etc. have been filtered out and the UGC spectrum has been impoverished.  [119] In the light of the Poland decision, it is doubtful that this implementation is adequate. As explained, the CJEU characterized ex post complaint and redress mechanisms as additional safeguards that supplement – but cannot replace – ex ante safeguards, such as the statutory flagging options in Germany.  [120]
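By way of illustration only, the decisive design choice of such an ex ante flagging mechanism can be rendered in a minimal sketch. The Python snippet below does not reproduce the German statute or any existing platform system; all names (such as matches_rightholder_reference and flagged_as_permitted_use) are hypothetical. It merely contrasts the ex ante route, under which flagged uploads go online immediately and can only be challenged afterwards, with the ex post route, under which unflagged matches are blocked and the user is left with the complaint procedure.

```python
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    PUBLISH = "publish directly"                        # ex ante availability
    PUBLISH_PENDING_REVIEW = "publish now, ex post review possible"
    BLOCK = "block, ex post complaint only"


@dataclass
class Upload:
    content_id: str
    matches_rightholder_reference: bool   # hit produced by the filtering system
    flagged_as_permitted_use: bool        # user marks quotation, parody, pastiche etc.


def moderate(upload: Upload) -> Decision:
    """Illustrative ex ante flagging logic (hypothetical, not the German statute)."""
    if not upload.matches_rightholder_reference:
        return Decision.PUBLISH
    if upload.flagged_as_permitted_use:
        # Flagged quotations, parodies, pastiches etc. stay online from the outset;
        # rightholders and the platform can only challenge them afterwards.
        return Decision.PUBLISH_PENDING_REVIEW
    # Unflagged matches are filtered out; the user is left with the
    # ex post complaint and redress mechanism of Article 17(9) CDSMD.
    return Decision.BLOCK


if __name__ == "__main__":
    parody = Upload("ugc-42", matches_rightholder_reference=True,
                    flagged_as_permitted_use=True)
    print(moderate(parody))   # Decision.PUBLISH_PENDING_REVIEW
```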

3.4. European Commission Taking Action on the Basis of Audit Reports

57

As many EU Member States seem reluctant to translate their human rights responsibility under Article 17(7) CDSMD into statutory ex ante mechanisms that immunize quotations, parodies, pastiches etc. against content filtering measures, it is important to look beyond the regulatory framework of the CDSM Directive. An analysis of Article 17 CDSMD does not exhaust the full spectrum of legal tools that could contribute to the preservation of freedom of expression and information in content moderation contexts. In line with the interplay between the CDSM Directive and the DSA configured in Article 2(4)(b) and Recital 11 DSA, DSA provisions can be factored into the equation where the CDSM Directive does not contain more specific rules.

58

A legal tool that does not appear in the CDSM Directive is the possibility for the executive to exercise control over content moderation systems on the basis of audit reports. In the DSA, this avenue for public authorities seeking to fulfil a watchdog function ex officio has been developed in Article 37. With respect to very large online platforms (“VLOPs”)  [121] and very large online search engines (“VLOSEs”),  [122] Article 37(1) DSA requires annual independent audits to assess compliance with, among other things, the obligations set forth in Chapter III of the DSA. Interestingly, one of the obligations laid down in Chapter III concerns the “diligent, objective and proportionate”  [123] application of content moderation systems in line with Article 14(4) DSA.

59

Supplementing the complaint and redress system of Article 17(9) CDSMD, which depends on user initiatives, Article 37 DSA may thus offer an important alternative basis that allows the executive power to prevent human rights violations. Article 37(3) DSA ensures that the auditors drawing up the report are independent of the VLOPs and VLOSEs under examination. In particular, it prevents organizations from performing an audit when they have a conflict of interest with the VLOP or VLOSE concerned, or with a legal person connected to that service provider. The audit report must contain an opinion – in the categories “positive,” “positive with comments,” and “negative” – on whether the VLOP or VLOSE has complied with the obligations and commitments under Chapter III DSA, including the above-described human rights and proportionality obligations laid down in Article 14(1) and (4) DSA.  [124] If the audit opinion is not “positive,” auditors are bound to include operational recommendations and specify the measures necessary to achieve compliance. They must also recommend a timeframe for achieving compliance.  [125] In such a case, the VLOP or VLOSE concerned must adopt, within one month of receiving the recommendations, an audit implementation report. If the VLOP or VLOSE does not intend to implement the operational recommendations, it must give reasons for not doing so and set out alternative measures that it has taken to address the instances of non-compliance identified in the audit report.  [126]
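The follow-up obligations triggered by the audit opinion can be summarized in a short illustrative sketch. This is a simplified reading of Article 37(4) and (6) DSA, not an official workflow; the names AuditOpinion and required_follow_up are invented for the purpose of the example.

```python
from enum import Enum


class AuditOpinion(Enum):
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"


def required_follow_up(opinion: AuditOpinion, will_implement: bool = True) -> list[str]:
    """Simplified, illustrative reading of the steps described in Art. 37(4) and (6) DSA."""
    if opinion is AuditOpinion.POSITIVE:
        return ["no operational recommendations required"]
    steps = [
        "auditor: add operational recommendations and the measures needed for compliance",
        "auditor: recommend a timeframe for achieving compliance",
        "provider: adopt an audit implementation report within one month",
    ]
    if not will_implement:
        steps += [
            "provider: state the reasons for not implementing the recommendations",
            "provider: set out alternative measures addressing the identified non-compliance",
        ]
    return steps


if __name__ == "__main__":
    for step in required_follow_up(AuditOpinion.NEGATIVE, will_implement=False):
        print("-", step)
```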

60

As to the role of the European Commission, Article 42(4) DSA is of particular importance. This provision obliges VLOPs and VLOSEs to transmit audit reports and audit implementation reports to the Commission without undue delay. If, based on this information, the Commission suspects a VLOP or VLOSE of infringing Article 14 DSA, it can initiate proceedings pursuant to Article 66(1) DSA. It may request further information, conduct interviews and inspect premises to learn more about the suspected infringement.  [127] In case of a “risk of serious damage for the recipients of the service,” Article 70(1) DSA entitles the Commission to order interim measures on the basis of a prima facie finding of infringement. If the Commission finally establishes non-compliance with “the relevant provisions of this Regulation” – including the human rights safeguards in Article 14(4) DSA – in a decision pursuant to Article 73(1) DSA, it may impose fines of up to six percent of the VLOP’s or VLOSE’s total worldwide annual turnover in the preceding financial year.  [128] For the imposition of fines, Article 74(1) DSA requires a finding that the service provider under examination has infringed Article 14(4) DSA intentionally or negligently.
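For illustration, the ceiling of Article 74(1) DSA translates into a simple calculation. The turnover figure used below is purely hypothetical, and the snippet computes only the statutory cap, not the fine that would actually be imposed in a given case.

```python
def maximum_fine(worldwide_annual_turnover_eur: float, cap_rate: float = 0.06) -> float:
    """Statutory ceiling of a fine under Article 74(1) DSA: six percent of the total
    worldwide annual turnover in the preceding financial year (a cap, not the amount
    that would actually be imposed)."""
    return cap_rate * worldwide_annual_turnover_eur


# Hypothetical example: a provider with EUR 10 billion turnover faces a cap of EUR 600 million.
print(f"EUR {maximum_fine(10_000_000_000):,.0f}")  # EUR 600,000,000
```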

61

Considering this cascade of possible Commission actions, the potential of the audit mechanism in Article 37 DSA must not be underestimated. The audit system may be an important addition to the canon of norms in the CDSM Directive and, in particular, a promising counterbalance to the outsourcing and concealment risks arising from the regulatory design of Article 17 CDSMD. Like the Member State legislation discussed in the preceding section, Commission interventions evolving from the problem analysis in an audit report are welcome departures from the strategy of passing on human rights responsibilities to private companies or users: the state power itself – in this case the Commission as the executive body of an international intergovernmental organization – remains directly responsible for detecting and remedying human rights deficits.

62

A potential blind spot of the described audit cascade leading to investigations, however, is this: in order to offer sufficient starting points for Commission action, audit reports addressing content moderation systems must go beyond a general problem analysis. The audit opinion must convincingly discuss a platform’s failure to satisfy the human rights obligations evolving from Article 14(4) DSA. It must contain a concrete assessment of the risk of human rights violations and a sufficient substantiation of that risk. Hence, auditors should be bound to devote sufficient attention to the human rights implications of content moderation. They must insist on detailed information on the practical implementation of content filtering tools that allows a proper assessment of the actual impact on users. An audit opinion that merely scratches the surface – remaining at the superficial level of general platform policies and procedures in order to tick the box of freedom of expression risks – is not enough.

63

Luckily, the DSA itself points in this direction. The general transparency obligation set forth in Article 15(1) DSA already obliges UGC platforms to publish, at least once a year, clear and easily comprehensible content moderation reports. These reports must include information on the number of illegal content notices that have been submitted,  [129] categorized by the type of alleged illegal content concerned and the number of notices submitted by trusted flaggers,  [130] as well as information on any action taken pursuant to the notices, differentiating whether the action was taken on the basis of the law or the provider’s terms and conditions. The reports must also specify the number of notices processed by using automated means and the median time needed for taking the action.  [131] If automated content moderation tools have been deployed, the reports must include a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.  [132]
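Read together, Article 15(1)(b) and (e) DSA thus prescribe a fairly structured set of reporting items. The sketch below models them as a simple data structure; it is merely an illustrative reading of the provision, not an official reporting format, and all field names are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class NoticeStatistics:
    """Items loosely mirroring Article 15(1)(b) DSA (field names are illustrative)."""
    notices_by_alleged_illegal_content_type: dict[str, int] = field(default_factory=dict)
    notices_from_trusted_flaggers: int = 0
    actions_based_on_law: int = 0
    actions_based_on_terms_and_conditions: int = 0
    notices_processed_by_automated_means: int = 0
    median_time_to_action_hours: float = 0.0


@dataclass
class AutomatedModerationDisclosure:
    """Items loosely mirroring Article 15(1)(e) DSA (field names are illustrative)."""
    qualitative_description: str = ""
    precise_purposes: list[str] = field(default_factory=list)
    accuracy_indicators: dict[str, float] = field(default_factory=dict)
    possible_error_rate: float = 0.0
    safeguards_applied: list[str] = field(default_factory=list)


@dataclass
class TransparencyReport:
    reporting_year: int
    notice_statistics: NoticeStatistics
    automated_moderation: AutomatedModerationDisclosure


if __name__ == "__main__":
    report = TransparencyReport(
        reporting_year=2024,
        notice_statistics=NoticeStatistics(notices_from_trusted_flaggers=12),
        automated_moderation=AutomatedModerationDisclosure(possible_error_rate=0.05),
    )
    print(report.notice_statistics.notices_from_trusted_flaggers)
```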

64

Arguably, the source material for audit reports within the meaning of Article 37(1) DSA must be richer than this standard information, which UGC platforms must make available under Article 15(1) DSA anyway. Article 37(2) DSA requires VLOPs and VLOSEs to afford auditors the cooperation and assistance necessary to conduct the audit in an effective, efficient and timely manner. This includes the obligation to provide access to all relevant data and to answer oral or written questions. It would thus come as a surprise if audit opinions only reflected the generally available information flowing from Article 15(1) DSA. If necessary, the Commission can also ensure sufficient focus on the examination of human rights deficits by adopting a delegated act on the basis of Article 37(7) DSA that clarifies the need to devote particular attention to human rights questions in audit reports and to seek all information necessary for this purpose.

4. Conclusion

65

On balance, the closer inspection of content moderation rules in the CDSM Directive and the DSA confirms a worrying tendency to rely on industry cooperation and user activism to safeguard human rights. Instead of putting responsibility for detecting and remedying human rights deficits in the hands of the state, the EU legislature prefers to outsource this responsibility to private entities, such as OCSSPs, and to conceal potential violations by leaving countermeasures to users. Considering the pattern of regulatory outsourcing and concealment decisions in the CDSM Directive and the DSA, it is justified to speak of a broader outsourcing and concealment strategy that endangers the fundamental rights of users. The risk of human rights encroachments is compounded by the fact that, instead of exposing and discussing the corrosive effect of human rights outsourcing, the CJEU has rubber-stamped the regulatory approach in Article 17 CDSMD. In its Poland decision, the Court has even qualified problematic features of the outsourcing and concealment strategy as valid safeguards against the erosion of freedom of expression and information.

66

As a welcome departure from the Court-approved outsourcing and concealment scheme, Article 17(7) CDSMD obliges Member States to ensure that transformative UGC, containing quotations, parodies, pastiches etc., survives content filtering and appears on online platforms. In addition, audit reports evolving from Article 37 DSA can offer important information for the European Commission to identify and eliminate human rights violations. Both exceptions to the rule of outsourcing to private entities, however, are currently underdeveloped. Many EU Member States have refrained from taking specific legislative action to protect transformative UGC from content filtering measures. The success of the DSA cascade of European Commission interventions – from audit reports to non-compliance decisions and fines that ensure human rights compliance  [133] – remains unclear. Therefore, it would be premature to sound the all-clear. To safeguard human rights in the UGC galaxy, the state power itself must become much more active. Litanies of due diligence and proportionality obligations for private entities and reliance on user activism are not enough.

* Martin Senftleben, Ph.D.; Professor of Intellectual Property Law and Director, Institute for Information Law (IViR), University of Amsterdam; Of Counsel, Bird & Bird, The Hague, The Netherlands.



[1] For a definition and description of central UGC features, see OECD, 12 April 2007, “Participative Web: User-Created Content”, Doc. DSTI/ICCP/IE(2006)7/Final, available at https://www.oecd.org/sti/38393115.pdf (last visited on 12 August 2023), 8-12.

[2] For example, statistics relating to the online platform YouTube report more than one billion users and 500 hours of video content uploaded every minute. Cf. https://www.statista.com/statistics/259477/hours-of-video-uploaded-to-youtube-every-minute/#:~:text=This%20equates%20to%20approximately%2030%2C000,for%20online%20video%20has%20grown (last visited on 12 August 2023).

[3] OECD, supra note 1, 8-22.

[4] Article 17(2) CFR.

[5] Articles 11 and 16 CFR. Cf. CJEU, 16 February 2012, case C-360/10, Sabam/Netlog, para. 51.

[6] As to the debate on user-generated content and the need for the reconciliation of divergent interests in this area, see M.R.F. Senftleben, “Breathing Space for Cloud-Based Business Models – Exploring the Matrix of Copyright Limitations, Safe Harbours and Injunctions”, Journal of Intellectual Property, Information Technology and E-Commerce Law 4 (2013), 87 (87-90); M.W.S. Wong, “Transformative User-Generated Content in Copyright Law: Infringing Derivative Works or Fair Use?”, Vanderbilt Journal of Entertainment and Technology Law 11 (2009), 1075; E. Lee, “Warming Up to User-Generated Content”, University of Illinois Law Review 2008, 1459; B. Buckley, “SueTube: Web 2.0 and Copyright Infringement”, Columbia Journal of Law and the Arts 31 (2008), 235; T.W. Bell, “The Specter of Copyism v. Blockheaded Authors: How User-Generated Content Affects Copyright Policy”, Vanderbilt Journal of Entertainment and Technology Law 10 (2008), 841.

[7] Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on Copyright and Related Rights in the Digital Single Market and Amending Directives 96/9/EC and 2001/29/EC, Official Journal of the European Communities 2019 L 130, 92.

[8] Article 17(3) CDSMD. For a discussion of this regulatory approach, see M.R.F. Senftleben, “Institutionalized Algorithmic Enforcement – The Pros and Cons of the EU Approach to Online Platform Liability”, Florida International University Law Review 14 (2020), 299 (308-312); N. Elkin-Koren, “Fair Use by Design”, UCLA Law Review 64 (2017), 1082 (1093); M. Husovec, “The Promises of Algorithmic Copyright Enforcement: Takedown or Staydown? Which Is Superior? And Why?”, Columbia Journal of Law and the Arts 42 (2018), 53 (76-84).

[9] Article 2(6) and Recitals 62, 63 CDSMD. Cf. A. Metzger/M.R.F. Senftleben, “Understanding Article 17 of the EU Directive on Copyright in the Digital Single Market – Central Features of the New Regulatory Approach to Online Content-Sharing Platforms”, Journal of the Copyright Society of the U.S.A. 67 (2020), 279 (284-286).

[10] M.R.F. Senftleben, “Bermuda Triangle: Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market”, European Intellectual Property Review 41 (2019), 480 (481-485); M. Husovec/J.P. Quintais, “How to License Article 17? Exploring the Implementation Options for the New EU Rules on Content-Sharing Platforms under the Copyright in the Digital Single Market Directive”, Gewerblicher Rechtsschutz und Urheberrecht – International 70 (2021), 325; M. Leistner, “European Copyright Licensing and Infringement Liability Under Art. 17 DSM-Directive Compared to Secondary Liability of Content Platforms in the U.S. – Can We Make the New European System a Global Opportunity Instead of a Local Challenge?”, Zeitschrift für Geistiges Eigentum/Intellectual Property Journal 12 (2020), 123 (123-214); C. Geiger/B.J. Jütte, “Towards a Virtuous Legal Framework for Content Moderation by Digital Platforms in the EU? The Commission’s Guidance on Article 17 CDSM Directive in the light of the YouTube/Cyando judgement and the AG’s Opinion in C-401/19”, European International Property Review 43 (2021), 625 (625-635).

[11] Article 17(4)(b) CDSMD.

[12] See CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 53, where this assumption has been confirmed.

[13] Article 17(10) CDSMD.

[14] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), Official Journal of the European Union 2022 L 277, 1.

[15] See the definition of “intermediary services” in Article 3(g) DSA.

[16] Article 14(1) DSA.

[17] Article 14(4) DSA.

[18] Article 14(4) DSA.

[19] Charter of Fundamental Rights of the European Union, Official Journal of the European Communities 2000 C 364, 1. Article 52(1) CFR reads as follows: “Any limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.”

[20] CJEU, 27 March 2014, case C-314/12, UPC Telekabel Wien, para. 55.

[21] CJEU, id., para. 56.

[22] CJEU, id., para. 56.

[23] See the concept of hosting services in Article 3(g)(iii) DSA.

[24] T. Mylly, “The New Constitutional Architecture of Intellectual Property”, in: J. Griffiths/T. Mylly (eds.), Global Intellectual Property Protection and New Constitutionalism – Hedging Exclusive Rights, Oxford: Oxford University Press 2021, 50 (71).

[25] Mylly, supra note 24, 71.

[26] Article 14(4) DSA. Article 14(1) DSA explicitly refers to content moderation measures.

[27] Article 14(4) DSA.

[28] CJEU, 1 December 2011, case C-145/10, Painer, para. 132; CJEU, 3 September 2014, case C-201/13, Deckmyn, para. 26. See also CJEU, 29 July 2019, case C-476/17, Pelham, para. 32, 37 and 59.

[29] See the interplay of creative industry notifications and filtering measures applied by the platform industry that results from Article 17(4)(b) and (c) CDSMD.

[30] Article 17(10) CDSMD.

[31] Article 17(10) CDSMD.

[32] Article 17(1) CDSMD.

[33] Article 17(2) CDSMD.

[34] Cf. M.R.F. Senftleben, “Content Censorship and Council Carelessness – Why the Parliament Must Safeguard the Open, Participative Web 2.0”, Tijdschrift voor Auteurs-, Media- & Informatierecht 2018, 139 (141-142).

[35] Cf. Senftleben, supra note 8, 305-307.

[36] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 53.

[37] For a more candid statement, see M.R.F. Senftleben, “The Original Sin – Content ‘Moderation’ (Censorship) in the EU”, Gewerblicher Rechtsschutz und Urheberrecht – International 69 (2020), 339-340.

[38] For a more detailed discussion of this development, see Senftleben, supra note 8, 299-328; Elkin-Koren, supra note 8, 1093.

[39] CJEU, 29 January 2008, case C-275/06, Productores de Música de España (Promusicae)/Telefónica de España SAU, para. 68.

[40] CJEU, 16 February 2012, case C-360/10, Sabam/Netlog, para. 16-18.

[41] CJEU, ibid., para. 26 and 36-37.

[42] As to the different levels of content monitoring that can be derived from CJEU jurisprudence, see M.R.F. Senftleben/C. Angelopoulos, The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market, Amsterdam: Institute for Information Law/Cambridge: Centre for Intellectual Property and Information Law 2020, 7-16.

[43] CJEU, ibid., para. 41-44.

[44] Articles 8 and 11 CFR. See CJEU, ibid., para. 48-50.

[45] CJEU, ibid., para. 50.

[46] CJEU, ibid., para. 51.

[47] CJEU, ibid., para. 26 and 36-37.

[48] Cf. Senftleben/Angelopoulos, supra note 42, 8-9.

[49] CJEU, ibid., para. 26.

[50] Article 17(4)(b) CDSMD. The intention to obviate the impression of a prohibited general monitoring obligation also lies at the core of Article 17(8) CDSMD. This provision declares that UGC licensing and filtering “shall not lead to any general monitoring obligation.”

[51] Article 17(4)(b) CDSMD.

[52] Article 17(4)(b) CDSMD.

[53] Article 17(4)(b) CDSMD.

[54] CJEU, 16 February 2012, case C-360/10, Sabam/Netlog, para. 51.

[55] For a corresponding concept of “normal exploitation” in the sense of the three-step test in copyright law, see M.R.F. Senftleben, Copyright, Limitations and the Three-Step Test – An Analysis of the Three-Step Test in International and EC Copyright Law, The Hague/London/New York: Kluwer Law International 2004, 189-194.

[56] Article 17(4)(b) CDSMD (emphasis added).

[57] Article 17(4)(b) CDSMD.

[58] Article 17(4)(b) CDSMD.

[59] Article 14(4) DSA.

[60] Article 17(4)(b) CDSMD.

[61] Article 14(4) DSA.

[62] Mylly, supra note 24, 71.

[63] Article 17(5) CDSMD.

[64] Cf. M. Perel/N. Elkin-Koren, “Accountability in Algorithmic Copyright Enforcement”, Stanford Technology Law Review 19 (2016), 473 (490-491). For empirical studies pointing towards overblocking, see Sharon Bar-Ziv/Niva Elkin-Koren, “Behind the Scenes of Online Copyright Enforcement: Empirical Evidence on Notice & Takedown”, Connecticut Law Review 50 (2017), 3 (37); Jennifer M. Urban/Joe Karaganis/Brianna L. Schofield, “Notice and Takedown in Everyday Practice”, UC Berkeley Public Law and Legal Theory Research Paper Series, Version 2, March 2017, available at https://ssrn.com/abstract=2755628, 2.

[65] Further information and transparency obligations are found elsewhere in Article 14, namely in paras (2), (3), (5) and (6).

[66] Article 14(1) DSA.

[67] Article 14(1) DSA.

[68] Article 14(1) DSA.

[69] Examples can be found in the GDPR and the Terrorist Content Regulation.

[70] Article 17(9) CDSMD.

[71] Article 2(4)(b) and Recital 11 DSA. For an extensive analysis of this topic, see J.P. Quintais/S.F. Schwemer, The Interplay between the Digital Services Act and Sector Regulation: How Special Is Copyright?, European Journal of Risk Regulation 13 (2022), 191; Alexander Peukert et al., European Copyright Society – Comment on Copyright and the Digital Services Act Proposal, International Review of Intellectual Property and Competition Law 53 (2022), 358.

[72] Article 17(9) CDSMD.

[73] For more general transparency obligations, see Article 15(1) DSA and the discussion of these more general obligations in section 3.4.

[74] Article 17(9) CDSMD.

[75] CJEU, 27 March 2014, case C-314/12, UPC Telekabel Wien, para. 57.

[76] As to the complementary character of Article 20 DSA, see Article 2(4)(b) and Recital 11 DSA. Cf. Quintais/Schwemer, supra note 71, 358.

[77] See the study conducted by J.M. Urban/L. Quilter, “Efficient Process or “Chilling Effects”? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act”, Santa Clara Computer and High Technology Law Journal 22 (2006), 621, showing, among other things, that 30% of DMCA takedown notices were legally dubious, and that 57% of DMCA notices were filed against competitors. While the DMCA offers the opportunity to file counter-notices and rebut unjustified takedown requests, Urban and Quilter find that instances in which this mechanism is used are relatively rare. However, cf. also the critical comments on the methodology used for the study and a potential self-selection bias arising from the way in which the analyzed notices have been collected by F.W. Mostert/M.B. Schwimmer, “Notice and Takedown for Trademarks”, Trademark Reporter 101 (2011), 249 (259-260).

[78] See the analysis conducted by M.R.F. Senftleben/J.P. Quintais/A. Meiring, “Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetization Under the CDSMD and the DSA”, Berkeley Technology Law Journal 38 (2023), III.B.1 (forthcoming).

[79] Apart from the time aspect, complaint systems may also be implemented in a way that discourages widespread use. Cf. Perel/Elkin-Koren, supra note 64, 507-508 and 514. In addition, the question arises whether users filing complaints are exposed to copyright infringement claims in case the user-generated quotation, parody or pastiche at issue (which the user believes to be legitimate) finally proves to amount to copyright infringement. Cf. N. Elkin-Koren, supra note 8, 1092.

[80] Article 17(9) CDSMD.

[81] Cf. Senftleben, supra note 10, 484.

[82] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 24. For a more detailed discussion of the decision, see J.P. Quintais, Between Filters and Fundamental Rights: How the Court of Justice saved Article 17 in C-401/19 - Poland v. Parliament and Council, Verfassungsblog (2022), https://verfassungsblog.de/filters-poland/ (last visited 5 April 2023); M. Husovec, “Mandatory filtering does not always violate freedom of expression: Important lessons from Poland v. Council and European Parliament”, Common Market Law Review 60 (2023), 173.

[83] CJEU, id., para. 24.

[84] CJEU, id., para. 24.

[85] CJEU, id., para. 55, 58, 82.

[86] CJEU, id., para. 63 et seq., referring to the principle of proportionality.

[87] CJEU, id., para. 69.

[88] CJEU, id., para. 86.

[89] CJEU, id., para. 86. Cf. CJEU, 16 February 2012, case C-360/10, Sabam/Netlog, para. 50-51.

[90] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 86.

[91] CJEU, id., para. 87-88; CJEU, 29 July 2019, case C-516/17, Spiegel Online, para. 50-54; CJEU, 29 July 2019, case C‑469/17, Funke Medien NRW, para. 65-70. Cf. Tanya Aplin and Lionel Bently, Global Mandatory Fair Use: The Nature and Scope of the Right to Quote Copyright Works, Cambridge: Cambridge University Press 2020, 75-84; C. Geiger/E. Izyumenko, “The Constitutionalization of Intellectual Property Law in the EU and the Funke Medien, Pelham and Spiegel Online Decisions of the CJEU: Progress, but Still Some Way to Go!”, International Review of Intellectual Property and Competition Law 51 (2020), 282 (292-298).

[92] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 87. With regard to the particular importance of the inclusion of the open-ended concept of “pastiche,” see M.R.F. Senftleben, “User-Generated Content – Towards a New Use Privilege in EU Copyright Law”, in: T. Aplin (ed.), Research Handbook on IP and Digital Technologies, Cheltenham: Edward Elgar 2020, 136 (145-162); Senftleben, supra note 8, 320-327; E. Hudson, “The pastiche exception in copyright law: a case of mashed-up drafting?”, Intellectual Property Quarterly 2017, 346 (348-352 and 362-364); F. Pötzlberger, “Pastiche 2.0: Remixing im Lichte des Unionsrechts”, Gewerblicher Rechtsschutz und Urheberrecht 2018, 675 (681); J.P. Quintais, Copyright in the Age of Online Access – Alternative Compensation Systems in EU Law, Alphen aan den Rijn: Kluwer Law International 2017, 235-237.

[93] Opinion of Advocate General Saugmandsgaard Øe, 15 July 2021, case C-401/19, Poland/Parliament and Council, para. 193.

[94] Article 17(4)(b) CDSMD.

[95] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 89.

[96] CJEU, id., para. 90. See Article 17(8) CDSMD; Article 8 and Recital 30 DSA. Cf. Senftleben/Angelopoulos, supra note 42, for a more detailed discussion on the prohibition of general monitoring obligations.

[97] CJEU, id., para. 94. See Article 17(9) CDSMD.

[98] CJEU, id., para. 96-97. As to existing best practices guidelines, see Communication from the Commission to the European Parliament and the Council, Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market, COM/2021/288 final.

[99] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 98.

[100] CJEU, id., para. 99.

[101] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 86.

[102] CJEU, id., para. 89.

[103] CJEU, id., para. 86; CJEU, 16 February 2012, case C-360/10, Sabam/Netlog, para. 50-51.

[104] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 90.

[105] CJEU, id., para. 93.

[106] CJEU, id., para. 96.

[107] CJEU, id., para. 87-88.

[108] Article 17(7) CDSMD. Cf. Senftleben, supra note 10, 485-490; P.B. Hugenholtz/M. Senftleben, Fair Use in Europe. In Search of Flexibilities, Amsterdam: Institute for Information Law/VU Centre for Law and Governance 2011, 29-30.

[109] Article 17(7) CDSMD.

[110] Article 5(3)(d) and (k) of Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the Harmonisation of Certain Aspects of Copyright and Related Rights in the Information Society, Official Journal of the European Communities 2001 L 167, 10.

[111] Article 5(3)(d) ISD.

[112] Article 5(3)(k) ISD.

[113] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 87. Cf. J.P. Quintais/G. Frosio et al., “Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations From European Academics”, Journal of Intellectual Property, Information Technology and Electronic Commerce Law 10 (2020), 277 (278-279).

[114] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 87-88.

[115] CJEU, id., para. 93.

[116] Article 17(7) CDSMD.

[117] See Sections 11(1), no. 1 and 3, 9(1) and (2), and 5(1) of the German Act on the Copyright Liability of Online Content Sharing Service Providers, available in official English translation at: https://www.gesetze-im-internet.de/englisch_urhdag/index.html .

[118] For studies of national implementations of Article 17, see J.P. Quintais/P. Mezei et al., Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis, reCreating Europe Report 2022, https://papers.ssrn.com/abstract=4210278 (last visited 12 August 2023); C. Angelopoulos, Articles 15 & 17 of the Directive on Copyright in the Digital Single Market Comparative National Implementation Report, Cambridge: Centre for Intellectual Property and Information Law 2022, available at: https://informationlabs.org/copyright/ (last visited on 12 August 2023).

[119] Article 29c(7) of the Dutch Copyright Act (Auteurswet).

[120] CJEU, 26 April 2022, case C-401/19, Poland/Parliament and Council, para. 93. As to the German legislation, see the description above and German Act on the Copyright Liability of Online Content Sharing Service Providers, id., Sections 11(1), no. 1 and 3, 9(1) and (2), 5(1).

[121] In accordance with Article 33(1) DSA, an online platform is qualified as a VLOP when it has a number of average monthly active service recipients in the EU that is equal to, or higher than, 45 million, and has been designated as a VLOP by the European Commission pursuant to Article 33(4) DSA.

[122] In accordance with Article 33(1) DSA, a search engine is qualified as a VLOSE when it has a number of average monthly active service recipients in the EU that is equal to, or higher than, 45 million, and has been designated as a VLOSE by the European Commission pursuant to Article 33(4) DSA.

[123] Article 14(4) DSA.

[124] Article 37(4)(g) DSA.

[125] Article 37(4)(h) DSA.

[126] Article 37(6) DSA.

[127] Articles 67 to 69 DSA.

[128] Article 74(1) DSA.

[129] Article 16 DSA.

[130] Article 15(1)(b) DSA.

[131] Article 15(1)(b) DSA.

[132] Article 15(1)(e) DSA.

[133] Articles 66 to 74 DSA.

License

Any party may pass on this Work by electronic means and make it available for download under the terms and conditions of the Digital Peer Publishing License. The text of the license may be accessed and retrieved at http://www.dipp.nrw.de/lizenzen/dppl/dppl/DPPL_v2_en_06-2004.html.
