HYBRID SPEECH GOVERNANCE

Prof. Wolfgang Schulz
Christian Ollig, LL.M.

Abstract

The normative development of communication rules on online platforms challenges traditional notions of rulemaking and rule application. The overlap, interdependence, and inseparability of private and public communication rules on social media platforms should therefore be analyzed under the lens of a specific category: hybrid speech governance. This perspective can help to find appropriate approaches to contain private power without simply transferring state-centric concepts unchanged to platform operators. This applies to questions of the basis for validity of communication rules, rule of law requirements, and fundamental rights obligations. The EU’s Digital Services Act (DSA) adopts this perspective of hybrid speech governance and thus finds initial legislative answers to the questions raised. Article 14 DSA is noteworthy in that regard, but it is only the beginning of the story. Academia, practice, and jurisprudence will have to flesh out the DSA’s approaches to hybrid speech governance in detail. In particular, the current parallel debate in the U.S. on the constitutional obligations of social media platforms could benefit from this European approach as a source of inspiration—it does not seem out of the question that the Supreme Court will add a balancing model to the current dichotomy of the state action doctrine. Only such a balancing model can do justice to the phenomenon of hybrid speech governance, for platform governance and beyond.

1. Introduction [1] [2]

Yesterday was my last day at Twitter: the entire Human Rights team has been cut from the company. I am enormously proud of the work we did to implement the UN Guiding Principles on Business & Human Rights (...).

1

With these words, Shannon Raj Singh, former Human Rights Counsel at Twitter, now “X”, commented on her departure after Elon Musk’s takeover of the platform. [3] Her comment sheds some light on how platform operators structure the rules in their communication spaces. On the one hand, they shape online communication according to their own private rules. On the other, these private communication rules are increasingly influenced by standards set by public actors—such as the UN Guiding Principles on Business and Human Rights. In short, limits to free speech on online platforms result from two different sets of rules, namely private rules and state-set rules, interacting with each other.

2

The fact that not only entrepreneurial autonomy but also government rules shape a product is nothing new. State actors regularly set rules to prevent the dangers posed by the production, distribution or consumption of a product—and this influences our product experience. [4] If coffee shops are required by government regulation to warn us on our coffee mug that its contents are potentially hot, then corporate compliance impacts product design. The same is fundamentally true of the product offered by social media platforms: How we can communicate publicly with each other online is the result of private and public rules. Here, however, a special feature arises. Private rules and public rules are inextricably linked. The latter are not mere external requirements; rather, the resulting body of communication rules becomes the product itself. [5] In that sense, private and public rules are interconnected, they overlap, and they are mutually dependent.

3

This phenomenon of overlapping and intertwining public and private rules poses challenges to traditional legal concepts. In particular, legal thinking in liberal democracies such as the EU and the U.S. has been characterized by the binary distinction between private and public rules. [6] Such a binary perspective is actor-centric. It focuses primarily on who sets rules. If the state sets law, different requirements have to be considered than if private actors set rules. Yet such binary legal thinking fails to adequately map—and understand—what the regulatory structure of social media platforms actually is. We argue in this paper that, with a view to ordering in digital communication spaces, the focus should move away from the actors setting rules to the actual emergence of communication rules and the corresponding regulatory structures. That does not mean that concepts like the state action doctrine become obsolete, but that legal thinking can profit from changing perspectives. This new perspective consequently looks at the overlapping and coalescing of private and public communication rules on social media platforms. This is what we will call “hybrid speech governance”.

4

In the non-legal analysis of norms, switching between these perspectives is already common. A phase that looked at the possibilities and limits of state control was followed by one that examined the normative structures that result from the setting of norms by different actors. [7] This change of perspective can easily be addressed by theories of law, such as Teubner’s “reflexive law”. [8] For legal practice, however, the actor-centric perspective is essential, as will be seen, for example, in the link to fundamental rights. But legal analysis can profit from looking at the structural level and is forced to do so to adequately deal with phenomena such as hybrid speech governance.

5

All of this raises the fundamental question of how hybrid speech governance can fit into a legal system based on the binary distinction between state and private rules. The DSA recently passed by the EU [9] appears to provide an initial legislative answer to this question. Indeed, the DSA takes up the phenomenon of hybrid speech governance in its Article 14. This prompts us to take a closer look at the hybrid normative field on social media platforms. We will first trace the origins of hybrid speech governance (2). Second, we will classify hybrid speech governance as a category of analysis (3). Third, we will specify what constitutional challenges hybrid speech governance raises (4). Fourth, we will shed light on how the DSA addresses these challenges (5). Fifth, against the background of the insights thus gained, we will take a look at parallel issues in the U.S. legal system (6).

6

The main goal of this paper is to argue for research specifically on structures of hybrid speech governance. While initial approaches to the general phenomenon of the hybridization of Internet governance can already be found, [10] our work is intended to provide a concrete prelude to further research on hybrid speech governance. This paper does not, however, claim to offer conclusive answers. Our essential aim here is to delineate the phenomenon and the specific problem areas of hybrid speech governance. In particular, we would like to highlight EU law as a starting point so as to stimulate the transatlantic dialogue on platform regulation.

2. The Origins of Hybrid Speech Governance

7

Legal regulation is an essential component of ordering on social media platforms. [11] If we look at the actors involved in speech regulation, this can be depicted as a pluralistic triangle of free speech regulation, as Balkin has graphically elaborated [12] (2.1). If, on the other hand, the focus is to be on the actual regulatory structures on social media platforms, the picture is different (2.2).

2.1. From an Actor-Centered Perspective…

8

In his triangular model, Balkin vividly systematizes the relationships between free speech actors in the digital world. On the first side of the triangle are the private communication rules of digital infrastructures (e.g., social media companies), which they impose on speakers. [13] When private parties provide other private parties with a space to communicate, they can autonomously regulate the freedom of speech exercised there as part of their freedom of contract. Thus, digital companies determine what speech is permitted on their platform by way of private ordering. [14] Platform operators regularly lay down such rules in their Terms and Conditions (T&C). [15] In these terms, they decide what may and may not be said on their platforms. In many cases, these rules restrict more speech than state law does. In other words, this is “governance by platforms” [16].

9

On the second side of the triangle are rules by actors with public authority defining limits to free speech (“old-school speech regulation”): [17] Legislators shape the communication space through public rules, courts interpret these rules, and administrative authorities enforce them. Of course, even before the age of digital communication, this public body of rules consisted of criminal statutes, civil liability laws, and administrative orders. [18] Such rules directly define the permitted scope of free speech.

10

On the third side of the triangle, in the age of digitization old-school speech regulation is increasingly being joined by “new-school speech regulation”. [19] This means that state actors are addressing the operators of digital infrastructures. Accordingly, the state sets rules that impose obligations on intermediaries. [20] Consequently, the state, in cooperation with private parties, only indirectly regulates free speech in these private infrastructures. One example is the German Network Enforcement Act, [21] which will soon be replaced by the DSA. In other words, this is “governance of platforms”. [22]

11

The three sides of the triangle may be presented as follows: [23]

[Figure: Balkin’s pluralistic triangle of free speech regulation, with public authorities, digital infrastructure companies, and speakers at its corners]

12

This model is able to systematize who sets communication rules in the digital space: public authorities vis-à-vis platforms and speakers, platforms vis-à-vis speakers. This is a helpful perspective. It illustrates the authors and legal relationships in digital communication spaces. In particular, it can be used to identify dangers within the individual legal relationships, [24] in order to design suitable government regulation. [25]

2.2. … to a Governance-Centered Perspective

13

We argue that in addition to this actor-centered perspective, a governance-centered approach is necessary. If we are interested in the rules that actually govern free speech on a platform, it is not enough to focus only on the authors of the rules. Rather, we need to consider the connections between private and state communication rules, which interact with one another. This can be illustrated by another focus on Balkin’s model:

[Figure: Balkin’s triangle, refocused on its bottom side, where private and public communication rules overlap into a hybrid pattern]

14

The focus of hybrid speech governance is not on the corners and sides of the triangle, but on its bottom. Here, the private and public ordering of free speech overlap in their outcome. To this extent, the actual outcome of the regulatory structure on social media platforms is a hybrid form of private and state rules, represented by the emerging pattern at the bottom of the illustration.

15

Those links between private and public communication rules have two central foci. On the one hand, private platforms are (increasingly) deciding voluntarily to comply with regulations that were originally directed only at public authorities. Private governance by platform operators is thus oriented towards public values. In other words, platforms are voluntarily integrating public ordering requirements into their private ordering (2.2.1). On the other hand, public authorities are (increasingly) obliging private platform operators to take into account public values that were originally only directed at public authorities. Therefore, traditional public ordering requirements are becoming mandatory for the private ordering of platform operators (2.2.2).

2.2.1. The Approximation of Private Ordering to Public Ordering Requirements

16

Influenced by the U.S. approach, [26] “digital liberalism” prevailed in the EU at the beginning of the 2000s. [27] European integration was primarily market-driven. Safe harbor regulations were intended to promote the free development of digital platform operators. [28] The entrepreneurial freedom of platform operators was paramount in the EU. This liberal environment provided the original impetus for the extensive private ordering of platforms. [29] However, it is becoming apparent that social media companies, in their rulemaking, are increasingly moving closer to the logic of classic public ordering of their own accord.

17

Indeed, the rules of social media companies have become more differentiated in recent years. While there were initially only a few general house rules, today there are various interlinked normative texts that change frequently. Not only do the norms cover more and more subject areas, but veritable hierarchies of norms have also emerged. These include abstract principles or values intended to guide all of a company’s actions as well as concrete regulations that implement these principles or values. [30] When it comes to content moderation decisions, norms usually refer to the interests of others, the community, or the platform provider that may be affected by content or certain behavior. [31]

18

Not only does the process of making and changing these rules correspond to public ordering in its complexity, [32] platform operators also decide to integrate public values from democratically legitimized standards into their content moderation. Admittedly, insights into the actual sources of inspiration in platform rulemaking are quite limited. [33] Nonetheless, initial examples make it clear that platform operators are using the standards of public authorities as a guide for their private communication rules. For example, the normative content of communication rules is shaped by the U.S. legal tradition of the First Amendment. [34] Regional characteristics of legal systems can also influence platform rules. [35] Platform operators are likewise implementing international human rights standards in their rules, [36] as shown not least by the tweet from the former Human Rights Counsel at Twitter that introduces this paper.

19

Social media platforms are increasingly basing their private ordering on standards that correspond to public ordering requirements. The more platform operators voluntarily implement such requirements, the more private ordering becomes interwoven with public values independent of state laws and public regulation. From the perspective of the platform users, this results in a hybrid set of rules consisting of private rules and public rules.

2.2.2. The Imposition of Public Ordering Requirements on Private Ordering

20

The second thrust in the emergence of hybrid speech governance stems from mandatory requirements by public authorities. In view of the growing importance of digital platforms, such requirements are increasingly superimposed on private ordering. [37] In this way, they shape the regulatory structure in private online communication spaces.

21

This shaping of private ordering initially took place in Europe only through fragmentary specifications. European lawmakers began to fill out their mosaic of platform regulation—in the tailwind of the European Court of Justice’s case law, which increasingly recognized the importance of online platforms for the exercise of users’ fundamental rights. [38] The EU set the first limits on private communication rules for platform operators in order to protect copyrights online [39] or to prevent the online dissemination of terrorist content. [40] [41] The EU also built up pressure on platform operators via numerous codes of conduct to shape their private communication rules in relation to certain dangers, such as hate speech or disinformation. [42] At the member state level, too, increasingly dense government regulation has loomed over the private ordering of social media companies. In Germany, for example, the Network Enforcement Act stipulates that social networking sites must delete illegal content following user complaints; [43] furthermore, under an amendment to German media law, intermediaries may not discriminate against journalistic-editorial content; [44] the German Federal Court of Justice has also ruled that those sites may face constitutional obligations in their private communication rules. [45] [46] In other EU member states, too, public ordering requirements have increasingly shaped the private ordering of platforms. [47]

22

Beyond these initially fragmentary requirements, the EU introduced comprehensive obligations for all intermediaries with its DSA in October 2022. The DSA aims to create a safe digital space: It intends to limit the distribution of illegal content; at the same time, the fundamental rights of users as enshrined in the Charter shall be protected (Article 1(1) DSA). Through detailed liability and due diligence provisions, private ordering by platform operators is shaped by this act. [48] In particular, the European legislature recognizes the platforms’ T&C and codes as the basis for private governance. [49] This is the regulatory gateway through which the EU integrates public ordering requirements into private ordering by platforms. [50]

3. Hybrid Speech Governance: An Analytical Category of its Own

23

The communication rule structure in the digital public sphere is thus developing into a hybrid of private and state rules. The question arises whether this normative field can simply be captured using the conventional categories of governance. We argue that none of these categories accurately captures the phenomenon of overlapping and intertwined rules (3.1). Against this background, we propose a definition of the distinct category of hybrid speech governance (3.2).

3.1. Hybrid Speech Governance as a Traditional Governance Mechanism?

24

The picture of platform governance is classically composed of three mechanisms, namely command-and-control regulation, self-regulation, and co-regulation. [51] The normative field of communication rules on social media platforms contains elements of all these forms of regulation. However, none of these classical forms accurately captures the actual set of overlapping private and public communication rules:

25

In command-and-control regulation, the state issues commands and prohibitions in order to achieve its regulatory objectives. The addressees of the rules must follow them, and the state monitors their compliance. In fact, public authorities increasingly make stipulations in communication spaces on social media platforms that influence the actual regulatory structure, such as rules stipulating that illegal content must be deleted. At the same time, social media companies, as fundamental rights bearers themselves, retain the leeway to autonomously determine their content moderation strategy on the market of digital platforms. Theoretically, of course, a legal system would be conceivable in which public authorities prescribe in detail what communication rules on these platforms must look like. [52] In this case, platform governance would be classic command-and-control regulation. However, such a paternalistic system is alien to liberal democracies.

26

In self-regulation, the state essentially stays out of the process of regulation. It is assumed that the governance goals can, as a rule, be achieved through voluntary social processes within the industry itself. As a decisive instrument of platform governance, self-regulation is, in many cases, the basis for shaping the communication rules of social media companies. As illustrated, platform operators voluntarily submit to standards and thus shape their communication rules. [53] At the same time, this is only one component of the normative inventory of communication rules. These self-regulatory measures are often supplemented by mandatory government requirements. Whether a platform implements public values voluntarily or in order to ensure compliance with public ordering requirements can often not be traced: the perspective of self-regulation therefore does not fully capture the interaction of private and public communication rules.

27

It might seem obvious to classify the increasing overlap of private and public rules on social media platforms as a form of co-regulation. Co-regulation can be understood as self-regulation that is fitted into a framework of state law or takes place on a legal basis. [54] Co-regulation exclusively pursues the purpose of the common good; self-regulation within the state framework is merely the means used to carry out this purpose. [55] In co-regulation, public ordering thus stands hierarchically above private ordering in two respects: Temporally, private ordering takes place only after the framework has been set by public ordering; normatively, private ordering takes place only to implement the public ordering framework. [56] Admittedly, the normative structure on social media platforms has parallels to co-regulation: Public authorities set a regulatory framework through new-school speech regulation; within this framework, social media platforms set their own law by way of private ordering. However, there is a key difference from co-regulation. In the case of communication rules on platforms, the state-set framework and the private communication rules inextricably commingle. In contrast to co-regulation, private communication rules do not necessarily serve the pure implementation of state interests. Rather, public welfare purposes of public ordering are intermingled with original corporate purposes of private ordering. To take a simple example: If we run a platform about dachshunds, then being able to prohibit cat content and to remove it when it is uploaded in violation of that rule is crucial to defining the identity of the platform. At the same time, we might have to comply with general state-set rules governing platforms dealing with animals (e.g., regarding the sale of protected species). [57]

28

In that sense, private ordering and public ordering do not exist in a hierarchical relationship; they overlap. They merge into a hybrid regulatory structure [58]—from a governance perspective, private product and public concerns become one.

3.2. A Definition of Hybrid Speech Governance

29

Accordingly, none of the typical forms of regulation can adequately capture the hybridity of communication rules on online platforms. Against this background, we propose a distinct category of hybrid speech governance. Hybrid speech governance is characterized by three elements.

  1. First, it refers to regulatory structures in which public and private communication rules overlap, i.e. govern the same behavior. [59] This applies regardless of whether platform operators apply public rules voluntarily or are obliged to do so.

  2. Second, public and private communication rules interact with each other: Government requirements shape the form of private platform rules; conversely, the economic interest in private platform rules [60] shapes the degree to which government requirements are expressed on the platform. [61]

  3. Third, this results in a regulatory structure in which private and public communication rules are inextricably linked. [62] The two levels of rules thus form the hybrid field of communication rules. In short, hybrid speech governance refers to the overlap, interdependence, and inseparability of private and public communication rules on social media platforms.

4. Constitutional Challenges of Hybrid Speech Governance

30

In liberal democracies, the requirements for rules differ depending on whether they are private or public rules. Applying this dichotomy between public and private rules to the category of hybrid speech governance encounters limitations. The question emerges as to how concepts based on this dichotomy can be applied to hybrid speech governance. In this respect, legal research is still in its infancy. [63] At its core, the question is how constitutional concepts originally developed to justify and limit state power may be applied when power is exercised through a hybrid rule structure. [64] We would like to focus here on three fundamental aspects. First, there is the question of the basis for validity (Geltungsgrund) of hybrid rule structures on platforms: Is it democratic legitimacy or private autonomy (4.1)? Second, linked to this is the question of whether and to what extent hybrid regulatory structures have to meet rule of law requirements that apply to state-set law (4.2). Third, there is the question of the extent to which fundamental rights obligations apply to hybrid rule structures (4.3). These questions may come up at the levels of rulemaking, rule implementation, rule interpretation, and rule enforcement. While we question whether and how state-oriented concepts apply to hybrid speech governance, we do not use our approach to address the question of whether platforms may fall under the concept of “state”. [65] We rather stick to the state/non-state distinction with platforms as non-state actors; yet this does not preclude testing normative concepts developed for state acts for application to platforms.

4.1. The Basis for Validity of Communication Rules: Democratic Legitimacy or Private Autonomy?

“Classifying an act as public or private determines what kind of legitimacy it requires”. [66]

31

State law has a unilateral effect on citizens and can be enforced by coercion. In view of this effect, the community in which the law applies must legitimize it democratically. Conversely, private rules are based on private autonomy. [67] Citizens are fundamentally free to enter into contractual ties with other citizens, and these ties can entail rules for future behavior. Such contractual obligations form the basis of private communication rules. Platform users, however, are confronted with rules whose basis for validity is blurred: In hybrid speech governance, public and private rules are intertwined. This complicates the question of the basis for validity of hybrid rule structures on platforms. [68] Are those rules based on the private autonomy of the platform operators and users, or on the democratic legitimacy of state laws?

32

With regard to the rule structure on platforms, some scholars implicitly address the mix of state legitimacy and private autonomy. They emphasize the legitimizing character of state law for platform rules. Indeed, the private rules of online platforms could be legitimized by their compliance with state law, in particular with legal requirements for T&C. On the one hand, T&C law provides for the approval of rules by users; on the other, it serves the common good, so that platform rules based on it would gain legitimacy. [69] From the perspective of political science, a democratic-legitimacy framework could be transferred to private platform rules. [70] For example, some believe that platform governance requires a holistic combination of input, throughput and output legitimacy. [71] In particular, the focus has so far been placed too strongly on legitimacy through processes in the sense of throughput legitimacy, [72] while input legitimacy, i.e. the question of who makes decisions on rules, has been underexposed. [73] In this respect, regulation by the democratic state should be the main source of legitimacy in platform governance. [74]

33

Overall, the question of the basis for validity of hybrid rule structures will heavily depend on the underlying legal theory. This applies both to rulemaking in general [75] and to platform governance in particular. [76] The search for such a basis is all the more challenging when the familiar dichotomous logic of private and public rulemaking is inadequate, as in the case of hybrid speech governance. In view of this phenomenon, new concepts will have to be devised.

4.2. The Requirements for Communication Rules: Rule of Law or Freedom of Contract?

34

The same holds true for the principle of the rule of law. This principle is originally state-centered. According to it, sovereign power is subject to obligations. On the one hand, these obligations can be formal: state power must be exercised in an orderly procedure. On the other hand, substantive requirements can arise from the rule of law principle. [77] These include, inter alia, the requirement of legal certainty, the guarantee of legal protection, the prohibition of arbitrariness, judicial independence, and the requirement of transparency. [78] The rule of law principle, however, is directed exclusively at public authorities in order to limit power in the vertical relationship between state and citizen. If, by contrast, private individuals are in conflict, neither of them is hierarchically superior from a legal point of view. Therefore, private persons cannot unilaterally issue orders to another private person. Rather, this horizontal relationship between private individuals is governed by the freedom of contract. According to this principle, everyone is free to choose whether to enter into a contract at all, with whom, and with what content. Consequently, the content requirements for private agreements do not arise from the principle of the rule of law, but from the free will of the parties.

35

In the context of hybrid speech governance, nonetheless, mechanisms of state law and private agreements largely overlap. This raises the question of whether, and if so how, requirements of the rule of law principle should be applied to hybrid rule structures. In actor-centered legal thinking, this question has two aspects. First, with regard to state actions that influence private regulation: How can it be ensured that the state’s obligations are not circumvented by indirectly affecting the actions of platforms? Second: Are the platforms themselves bound by the principle, or would they have to be bound by it through state regulation?

36

The emergence of hybrid speech governance will require understanding the rule of law principle from a non-state-centric perspective. Simply transferring familiar rule of law principles to hybrid communication rule structures will not be helpful in this regard. [79] Scholars have made initial suggestions to transfer rule of law principles to private communication rules. [80] The governance-centric perspective of hybrid speech governance should play a central role in the further development of such rule of law standards on platforms.

37

Certainly, the fact that concepts of the rule of law are transferred to situations of private power is not a novelty in itself. The legislature regularly responds with mandatory requirements to situations in which power is in fact unequally distributed among private parties, e.g., in labor law, consumer protection law, or private utilities law. But while legal traditions have had decades to develop with respect to these areas of law, the transfer of rule of law principles to private ordering by platform operators is still being tested by lawmakers, jurisprudence and academia.

4.3. Fundamental Rights and Communication Rules: Entitlement and/or Obligation?

38

Hybrid speech governance also poses challenges to the structure of fundamental rights. In liberal democracies, this structure is generally based on the private/public dichotomy: Public authorities are obliged to observe the fundamental rights of private individuals, but are not entitled to invoke fundamental rights. [81] Conversely, private individuals are entitled to invoke fundamental rights vis-à-vis public authorities, but are not obliged to observe the fundamental rights of other private individuals. The basic concept is, therefore, that public authorities are obliged to respect fundamental rights, while private individuals are entitled to invoke those rights. This conception poses challenges for hybrid speech governance, all of which are currently being debated in constitutional law, depending on the respective legal system.

39

A first challenge results from the positive obligation of state actors to protect the fundamental rights of citizens. Indeed, certain constitutional provisions oblige state actors to take legislative action. [82] This is rather alien to U.S. constitutional thinking, but quite common in Europe. This raises the question of the degree to which state actors must shape the hybrid communication order themselves by means of a minimum set of state communication rules on platforms, so as not to leave this field exclusively to private platform operators.

40

This points to a second challenge. When prescribing public rules that shape hybrid speech governance, state actors must respect the fundamental rights of platform operators. When state actors regulate the private moderation of content, they interfere with the fundamental rights of platforms. In particular, the setting and exercise of private rules may itself be protected by fundamental rights, such as the entrepreneurial freedom, professional freedom, or freedom of expression of platform operators. [83] In this respect, the question arises as to the extent to which state actors may shape the hybrid regulatory structure themselves without disregarding these fundamental rights interests. These rights have the same protective orientation as the users’ right to access information and the general information interest of the public, which we do not address separately here.

41

A third challenge of hybrid speech governance concerns the attribution of rules to fundamental rights. In view of the overlap between state and private rules, it is sometimes difficult to determine their authorship. This also makes it unclear whether it is the state, which is bound by fundamental rights, or the platform operator, which is not, that is acting. If a private platform merely implements a public rule, this could possibly be attributed to the state. In particular, if state actors exert pressure on platform operators to enact a certain communication rule, it may be natural to classify this as state action bound by fundamental rights, despite the outwardly private form of action. [84]

42

Finally, even if private action cannot be attributed to the state, the question arises as to whether platform operators can be obliged to respect the fundamental rights of their users—even though they are bearers of fundamental rights themselves. Concepts are increasingly being proposed to bind platform operators to fundamental rights with different binding effects when moderating content. For example, the German Federal Court of Justice has ruled that online platforms that are market-dominant and have an open communicative orientation must consider the fundamental rights of their users when moderating content. Essentially, this results in procedural obligations for platform operators when moderating content. [85] In other EU Member States, too, national courts are applying concepts to bind platform operators to fundamental rights in their content moderation. [86] Nevertheless, such concepts are still in their infancy; a profound understanding of the fundamental rights obligations of private platform operators is still lacking. In the future, it will be necessary to find suitable solutions in the area of hybrid speech governance in order to determine the scope of potential fundamental rights obligations.

5. The Digital Services Act Embracing Hybrid Speech Governance

43

The DSA addresses some of those challenges. Certainly, in principle, the DSA resorts to classic forms of regulation. Essentially, the European legislature relies on mechanisms of co-regulation. In particular, very large online platforms are to assess systemic risks themselves (Article 34) and take measures against such risks (Article 35). Similarly, the DSA provides for the development and implementation of voluntary standards and codes of conduct (Article 44 et seq.), in which the EU Commission is to play a supporting role. The regulation also codifies commitments by platforms (Article 71 et seq.), which can be declared binding by the EU Commission. In addition to these mechanisms of co-regulation, the DSA contains forms of command-and-control regulation. For example, it obligates platform operators to provide procedural safeguards to users (Article 16 et seq.), imposes transparency requirements (Articles 15, 24, 27, 39, 42), and provides for fines in the event of non-compliance (Article 74). Old-school speech regulation, i.e., direct regulation of what is permissible speech, is essentially left to the Member States by the EU.

44

In this respect, the approach of the DSA is first and foremost to impose procedural requirements on platform operators, not to establish an online communication order with regard to content. Nevertheless, the DSA creates the content-related framework of the communication order. Article 14(4) reads as follows:

“Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions (provided for in their terms and conditions), with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.”

45

In doing so, the DSA creates a new form of hybrid speech governance. Certainly, prior to the DSA, the EU’s Terrorist Content Online Regulation [87] had already taken up the phenomenon of hybrid speech governance. [88] Article 5(1) of that regulation stipulates that the T&C upon which hosting service providers moderate terrorist content have to respect the fundamental rights of users. Yet this provision only applies to the limited field of terrorist content disseminated online. The Terrorist Content Online Regulation is thus the silent pioneer of hybrid speech governance. The DSA, by contrast, now features prominently, because it applies regardless of the subject matter of online content. Consequently, the DSA is the world’s first comprehensive approach to addressing the challenges of hybrid speech governance. The DSA finds initial answers to the constitutional challenges mentioned above (5.1). At the same time, it also leaves a number of questions unanswered (5.2).

5.1. Answers to Constitutional Challenges Posed by the Digital Services Act

46

The DSA responds to the problem areas identified above by, on the one hand, recognizing the private law-making authority of the platform operators, but, on the other hand, integrating requirements into private ordering which are, in themselves, only imposed on state actors. This concerns the basis for validity of communication rules (5.1.1), rule of law requirements (5.1.2) and fundamental rights protection (5.1.3).

5.1.1. The Basis for Validity of Communication Rules: Private Autonomy in a Framework of Democratic Legitimacy

47

The DSA explicitly recognizes that the T&C of online platforms are the legal basis for content moderation. In this way, the European legislature stipulates that the basis for content moderation continues to be the private autonomy of the platforms and users: The rules of communication on social media platforms remain in private hands, as provided for in their T&C. Accordingly, it is primarily the users’ consent to the T&C that justifies content moderation by platform operators. In order for users to be able to give this consent as autonomously as possible, the DSA provides for numerous transparency requirements in their favor. For example, the T&C must contain information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of the internal complaint handling system; furthermore, the T&C must be set out in clear, plain, intelligible, user-friendly and unambiguous language, and be publicly available in an easily accessible and machine-readable format (Article 14(1) DSA).
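By way of illustration only: the DSA mandates a machine-readable format for this T&C information but does not prescribe any schema. The following sketch, whose structure and field names are entirely our own assumptions, indicates what such a disclosure under Article 14(1) DSA could look like:

```typescript
// Hypothetical sketch only: Article 14(1) DSA requires a "machine-readable
// format" for T&C but prescribes no concrete schema. All type and field
// names below are our own illustrative assumptions, not DSA terminology.

interface ModerationPolicy {
  id: string;                             // e.g. "hate-speech-policy-v3"
  summary: string;                        // plain-language summary of the restriction
  usesAlgorithmicDecisionMaking: boolean; // Art. 14(1): disclose algorithmic tools
  usesHumanReview: boolean;               // Art. 14(1): disclose human review
}

interface TermsAndConditionsDisclosure {
  provider: string;                   // name of the intermediary service
  version: string;                    // T&C version identifier
  publishedAt: string;                // ISO 8601 date of publication
  languageVersions: string[];         // e.g. ["en", "de", "fr"]
  moderationPolicies: ModerationPolicy[];
  complaintHandlingProcedure: string; // rules of procedure of the internal
                                      // complaint handling system
  publicUrl: string;                  // easily accessible location of the T&C
}
```

Such a structured publication would allow researchers and regulators to compare restrictions across services; whether providers converge on a shared schema remains to be seen.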

48

This model of consent is embedded in a framework of (weak) democratic legitimacy. The obligation to pursue general welfare objectives in T&C law can have a legitimizing effect. National T&C law regularly provides that the content of T&C can be reviewed on the basis of general interest purposes. [89] In particular, Article 14(4) of the DSA now explicitly states that platform operators must observe the fundamental rights of their users when checking the content of their T&C. The EU Charter of Fundamental Rights is in turn part of the democratically legitimized primary law of the EU. By effectively integrating the standards that follow from this into the T&C of the platforms, the platform rules can consequently gain democratic legitimacy, at least indirectly.

49

Overall, the response to hybrid speech governance by the EU is therefore a mixed legitimation model: In essence, the European legislature follows the logic of private ordering in the DSA, but ties this to public welfare purposes. The DSA does not provide for direct user participation in the development of communication rules. In the long term, greater democratic legitimacy in content moderation might be achieved through proposals such as social media councils. Such councils would be staffed by citizens or users to draft or enforce communication rules. [90] In Germany, for example, the government is planning to advance corresponding legitimacy models. [91]

5.1.2. The Requirements for Communication Rules: Rule of Law Guarantees in Content Moderation

50

It is true that, according to the language of the DSA, social media platforms remain free in principle to determine their communication rules within the framework of their T&C. However, the DSA obliges platform operators to comply with requirements that were originally imposed only on state actors as an expression of the principle of the rule of law.

51

First, content moderation must be proportionate under Article 14(4) DSA. According to its original understanding, the principle of proportionality is state-centric. It serves to limit public authority. [92] The principle states that state action may only pursue legitimate purposes in an appropriate, necessary and proportionate manner. [93] In this context, encroachments on fundamental rights by state actors should be as freedom-preserving as possible. With its Article 14(4) DSA, the European legislature is now explicitly transferring this state-centric concept to content moderation by private platform operators. [94] The concrete requirements for the application of the principle of proportionality in private content moderation have yet to be clarified. [95] Nevertheless, it is already clear that the EU is integrating requirements for public ordering into the private ordering of private platform operators. This is to be understood as recognition of hybrid speech governance.

52

Second, according to Article 14(4) DSA, moderation decisions must be “objective”. This requires that moderation decisions be non-discriminatory and non-arbitrary. [96] Objectivity is a prohibition of arbitrariness. Users must be treated equally; unequal treatment must be justified. Binding particularly powerful digital companies to the principle of equality is a constitutional trend in the area of digital communications. [97] The DSA is now responding to this by integrating the obligation of equal treatment as a public ordering requirement into private ordering by platform operators.

53

Requirements that in themselves stem from the rule of law principle in the vertical relationship between state actor and citizen can also be found outside of Article 14(4) DSA. For example, Article 14(1) DSA provides that the rules of communication must be written in clear, simple, understandable, user-friendly and unambiguous language and be publicly available in an easily accessible and machine-readable format. Very large platforms must also provide information on available remedies and redress mechanisms (Article 14(5) DSA). Simply put, the rules for private communication must be transparent and sufficiently specific in terms of the requirement of legal certainty. With these requirements, the European legislature is taking its cue from the rule of law principle, which otherwise only applies to state actors. The same applies to the introduction of legal protection mechanisms against moderation decisions. Here, the EU uses the logic of administrative law: Moderation decisions must be justified (Article 17), platforms must provide an internal appeal procedure (Article 20), and external legal protection must be available (Article 21).
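To make this administrative-law logic tangible, the following hypothetical sketch models a statement of reasons along the lines of Article 17(3) DSA, which specifies which information must be provided (the type of restriction, the facts relied on, any use of automated means, the legal or contractual ground, and available redress) but no data structure. All type and field names are our own illustrative assumptions:

```typescript
// Hypothetical sketch of a statement of reasons mirroring the information
// required by Article 17(3) DSA; the DSA mandates the content, not this
// structure. All names are our own illustrative assumptions.

type Restriction =
  | "content_removal"
  | "visibility_restriction"   // e.g. demotion of content
  | "monetary_suspension"
  | "service_suspension"
  | "account_termination";

interface StatementOfReasons {
  restriction: Restriction;      // Art. 17(3)(a): type of restriction imposed
  factsAndCircumstances: string; // Art. 17(3)(b): facts relied on
  automatedMeansUsed: boolean;   // Art. 17(3)(c): automated detection/decision
  legalGround?: string;          // Art. 17(3)(d): if the content is deemed illegal
  contractualGround?: string;    // Art. 17(3)(e): T&C clause relied on
  redress: string[];             // Art. 17(3)(f): e.g. internal complaint (Art. 20),
                                 // out-of-court settlement (Art. 21), judicial redress
}
```

The optional legal and contractual grounds reflect the two alternative bases for moderation that the provision distinguishes: illegality under state law or incompatibility with the platform’s own T&C.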

5.1.3. Fundamental Rights and Communication Rules: Entitlement and Obligation

54

The DSA addresses in particular the challenges that arise from the fact that private platform operators—unlike state actors—are in principle only entitled to invoke fundamental rights, not obliged to respect them vis-à-vis other private parties. On the one hand, the European legislature confirms in Article 14(4) DSA that platform operators may moderate content on the basis of their T&C. In this way, the EU respects the fact that platform operators are themselves protected by fundamental rights when moderating content. [98] This is also supported by the wording in Article 14(4) DSA “with due regard to the rights and legitimate interests of all parties involved” [99]: “All parties” includes platform operators and their respective fundamental rights. In this way, the lawmaker gives expression to the European legal tradition [100] of weighing fundamental rights among private parties. [101] In this respect, companies are afforded the flexibility to define their own communication rules by way of private ordering.

55

On the other hand, despite the recognition of a comprehensive balancing of fundamental rights, Article 14(4) DSA is clearly centered on the protection of fundamental rights by, not for platforms. [102] Article 14(4) DSA obliges private platform operators to respect users’ fundamental rights when moderating content. With regard to content moderation, the DSA explicitly mentions users’ fundamental rights only, namely their freedom of expression and the freedom and pluralism of the media. In recital 47, the DSA adds the freedom of information and refers to relevant international standards for the protection of human rights, such as the UN Guiding Principles on Business and Human Rights. With this obligation for digital companies to respect the fundamental rights of users, the European legislators are meeting the demands of the advocates of “digital constitutionalism”. [103]

56

With Article 14(4) DSA, the European legislature thus takes up the phenomenon of hybrid speech governance. It treats platform companies as bearers of fundamental rights and, at the same time, obliges them to respect the fundamental rights of users. Accordingly, the DSA recognizes that content moderation consists of an overlap of private ordering and public ordering. This goes beyond the constitutional doctrine of horizontal effect, under which courts in EU Member States already combine the private ordering of content moderation with public ordering requirements. The EU, however, no longer leaves the horizontal effect of fundamental rights to the courts, but codifies it explicitly in Article 14(4) DSA. This turns the judicial source of the horizontal effect of fundamental rights into a legislative one. In particular, the legislature provides courts with certain guidelines for the balancing process. Certainly, within this framework the courts retain leeway to develop dimensions of horizontal effect. [104]

5.2. Questions Left Open in the Digital Services Act

57

However, the DSA also leaves a number of questions unanswered with regard to those constitutional challenges of hybrid speech governance. In particular, the incorporation of horizontal fundamental rights as enshrined in the Charter into an ordinary legislative act is a novel construction; this gives rise to a colorful bouquet of questions, the comprehensive answers to which will be given elsewhere. We will address only two aspects here. First, the wording of Article 14(4) DSA does not refer to the making of communication rules (5.2.1). Second, the norm applies across the board to all platforms; the European legislature does not create a tiered system of obligations (5.2.2).

5.2.1. Rulemaking Covered by the Scope of Article 14 DSA?

58

Article 14(4) DSA, when strictly read, refers only to the application and enforcement of the communication rules stemming from the T&C. In contrast, the upstream level of rulemaking is not covered. It could be argued, therefore, that platforms are not bound by users’ fundamental rights at the rulemaking level. In this case, platform operators would be essentially free to design their rules. Only in the concrete application and enforcement of the rules could fundamental rights evaluations become relevant. Such a reading would reduce the fundamental rights obligation to a procedural moment: When moderating content, platform operators would have to be able to plausibly demonstrate that they have taken fundamental rights into consideration in some form. [105] It is nevertheless more likely that the European legislators intended to bind platform operators to fundamental rights already when drafting their communication rules. Indeed, recital 47 of the DSA goes beyond the wording of Article 14(4) DSA and covers the design, application and enforcement of the T&C. [106] The DSA does not provide a clear-cut response; a clearer formulation would have been desirable here. [107]

5.2.2. Graduation of Obligations for Platforms?

59

More serious than the above-mentioned ambiguity, however, is the fact that Article 14(4) DSA does not specify any criteria for weighing the scope of the intermediaries’ obligations. This applies both in personal and in substantive terms.

60

In personal terms, it is noteworthy that the provision is located in the general part of the DSA. Accordingly, the provision applies to all intermediaries, regardless of their mode of operation or size. Consequently, small platforms that have not yet established themselves in the market are covered by the provision, as are platforms that dominate the market. According to the current conception, the dominance of a platform has no impact on the content of the communication rules. [108] For smaller platforms in particular, the far-reaching obligations of Article 14(4) DSA can mean an unjustified restriction of their contractual freedom. [109] The undifferentiated scope of application of Article 14 DSA is surprising, especially in light of the fact that obligations in the regulatory system of the DSA are otherwise determined by the size and functioning of an intermediary. Recital 47 does, however, emphasize the fundamental rights of very large online platforms; this might serve as a weighing criterion.

61

Article 14(4) DSA also remains vague in substantive terms. The legislature does not provide any concrete criteria for determining the substantive scope of the obligations under Article 14(4) DSA, in particular regarding the fundamental rights obligation of platforms. [110] The formulation that platforms must have “due regard” to the fundamental rights of users when moderating content leaves open the extent to which platforms may moderate content. [111] At least it can be inferred from the wording of Article 14(4) DSA that platforms are not to be bound in the same way as the state, given the consideration of the platforms’ own fundamental rights. [112] The case law of the European Court of Justice and the European Court of Human Rights could provide guidance in the balancing process, although it has so far only concerned vertical cases between state actors and citizens, not horizontal cases between platforms and citizens. [113] The emphasis on individual fundamental rights positions in Article 14(4) DSA, namely freedom of expression and freedom and pluralism of the media, could also inform the balancing process. The same goes for the reference in recital 47 to the UN Guiding Principles on Business and Human Rights, which provide certain guidelines for internal company processes. [114] Furthermore, in line with rulings of Member States’ courts, the fundamental rights obligations could vary depending on the market dominance of the platform, the communicative orientation of the platform, or the degree of dependence of the users on the platform. [115] The pressure exerted by state actors on private content moderation could also be taken into account when determining the extent of a fundamental rights obligation.

62

Overall, the open wording of Article 14(4) DSA leaves room for academia, practice, and courts to develop concrete criteria to approach the scope of substantive obligations of private platform companies. In particular, the application of other provisions of the DSA could be helpful in operationalizing the vague scope of Article 14(4) DSA. On the one hand, the procedural guarantees of the DSA, namely the obligation to give reasons for moderation decisions, the internal complaint handling system, out-of-court dispute settlement bodies or the right of appeal to the Digital Services Coordinator could contribute to the concretization of the requirements of Article 14(4) DSA. [116] In addition, the systemic risk assessment and the risk mitigation measures based on it can concretize the requirements of Article 14(4) DSA. [117] Moreover, the use of codes of conduct is also likely to promote the legal certainty of the substantive requirements of Article 14(4) DSA. [118] But ultimately, only the European Court of Justice or legislators will be able to make binding statements on the extent of Article 14(4) DSA.

6. Hybrid Speech Governance and the U.S. Legal Framework

63

How this European approach relates to U.S. legal thinking, which significantly shapes the corporate compliance of large tech companies on the other side of the Atlantic, is an open question. Indeed, the European approach of Article 14(4) DSA of legally embedding hybrid speech governance runs counter to fundamental convictions in the U.S. legal system. While it is at least in principle compatible with the European understanding that requirements for state actors are transferred to private ordering—for example, through the horizontal application of fundamental rights—U.S. legal thinking is characterized by the idea of an autonomous sphere of private ordering and a strict limitation of the binding force of the constitution to state action. [119] In the U.S., private ordering by platforms is extensively protected by the First Amendment, which thus constitutes the limit of state regulation of private ordering. [120] This protection is underpinned by Section 230 of the Communications Decency Act and the Digital Millennium Copyright Act. [121] This provides the framework for a liberal system of private ordering by platform operators.

64

Admittedly, U.S. law, too, recognizes possibilities, through the state action doctrine, to draw private action into the scope of constitutional obligations by way of exception. [122] The public forum doctrine, in particular, may serve as a vehicle for tying private actors to fundamental rights. [123] Yet the Supreme Court traditionally applies a narrow understanding of the public forum. [124] Notably, against this backdrop, lower courts have thus far rejected the argument that private social media platforms constitute a public forum in which the First Amendment would have to be respected. [125]

65

The U.S. may, however, be facing a “moment when everything might change”: [126] It does not seem out of the question that the Supreme Court will draft a model to bind platforms to fundamental rights in a weakened form, along the lines of the EU model discussed above. Such a concept would add a balancing model to the current dichotomy of the state action doctrine. Indeed, there is currently a jurisprudential debate about the extent to which social media companies may moderate the content of their users. According to the traditional approach, this depends on whether these companies act as state actors; only in this case would the companies be bound by constitutional law. Lower courts in the U.S. are at odds on this issue. The U.S. Court of Appeals for the 5th Circuit rejects the idea that “corporations have a freewheeling First Amendment right to censor what people say”. [127] In September 2022, the court therefore upheld a Texas social media law, House Bill 20, which prohibits major platforms from deleting content based on a speaker’s viewpoint. On this view, social media platforms are subjected to obligations of the kind otherwise reserved for state actors: they must not discriminate among their users’ viewpoints. Notably, according to the court, the law does not regulate “the Platforms’ speech at all”; it rather protects “other people’s speech and regulates the Platforms’ conduct”. Conversely, in May 2022 the U.S. Court of Appeals for the 11th Circuit struck down key parts of a comparable Florida law, [128] holding it unconstitutional for the state to ban social media companies from content moderation. The court argued that content moderation activities by platforms are “free speech” within the meaning of the First Amendment.

66

It will now be up to the Supreme Court to decide to what extent social media companies may moderate content. [129] In any case, the Supreme Court has no shortage of nuanced suggestions from academia for answering this question. [130] For now, the constitutional framework for regulating hybrid speech governance in the U.S. remains subject to numerous ambiguities. Whatever legal path proves viable for regulating the rules of communication on social media platforms, an option should be chosen that takes into account both the fundamental rights of platform operators and the fundamental rights of users. The European legal tradition, in which “fundamental rights and freedoms interact with each other in a dialectic relationship of balancing” [131], may serve as a source of inspiration for the Supreme Court and/or lawmakers. Only such a balancing model can do justice to the phenomenon of hybrid speech governance. This is because, as shown, private and public interests on platforms intersect and do not exist in a dichotomous relationship. Article 14(4) DSA overcomes this dichotomy; the norm is a regulatory prime example. However, it is also clear that the approach of the DSA, which already breaks with traditional ideas in the EU, [132] would have to overcome even higher constitutional hurdles in the U.S. legal system.

7. Conclusion and Outlook

67

The normative development of communication rules on online platforms puts traditional notions of rulemaking and rule application in trouble. The overlap, interdependence, and inseparability of private and public communication rules on social media platforms should therefore be analyzed under the lens of a new category: hybrid speech governance. This perspective can help to find appropriate approaches to contain private power without simply transferring state-centric concepts unchanged to platform operators. This applies to questions of the basis for validity of communication rules, rule of law requirements, and fundamental rights obligations.

68

The EU’s DSA adopts this perspective of hybrid speech governance and thus finds initial legislative answers to the questions raised. However, this is only the beginning of the story. Academia, practice, and jurisprudence will have to flesh out the DSA’s approaches to hybrid speech governance in detail. Whether the Brussels Effect [133] can contribute here to the radiation of European standards into U.S. law remains questionable, given the constitutional structures on the western side of the Atlantic. [134]

69

Further challenges will arise in the (not so distant) future. Considering the increasing importance of online platforms for the exercise of fundamental rights, the issues discussed will not only affect free speech rights. Other fundamental rights of users will be affected by platform rules as well, such as academic freedom, artistic freedom, or entrepreneurial freedom. This is all the more true when platforms offer functions that go beyond mere communication, in the metaverse, for example. In this context, the hybrid rule structure will not only affect communication, but conduct more generally. Furthermore, orders within orders may emerge when third parties can create their own worlds with their own rules on platforms. The governance structure thus takes on yet another level: the state influences private rulemaking, which in turn influences the private orders of third parties. In other words, the complexity of platform governance continues to increase. Hybridization processes are also emerging outside the field of platform regulation, especially with regard to the question of how constitutional values can find their way into technical systems, a question that goes by the catchword of “Constitutional AI” [135]. [136] While the development of responsible AI systems oriented to constitutional standards is still based primarily on entrepreneurial initiative, i.e. moral rules that private companies set for AI systems to adhere to, regulators could increasingly incorporate constitutional requirements into AI legislation; the EU’s forthcoming AI Act already bears witness to the claim of “Constitutional AI” in its approach. All of this makes it necessary to delve into hybrid regulatory structures in order to find well-founded ways of dealing with them legally.

 



[1] This paper is based on a talk we gave at the Information Society Project, Yale Law School, on October 25, 2022. Thanks to Prof. Dr. Robert C. Post, Dr. Tobias Mast, and Prof. Dr. Matthias C. Kettemann, who provided great help in finalizing the article. Errors are entirely our own.

[2] Prof. Dr. Wolfgang Schulz is Professor of Law at the University of Hamburg, Director of the Leibniz Institute for Media Research | Hans Bredow Institute and of the Humboldt Institute for Internet and Society, and UNESCO Chair for Freedom of Information and Communication.

Christian Ollig, LL.M. (College of Europe), Maître en Droit (Cergy-Paris) is PhD Candidate at the University of Hamburg, Junior Researcher at the Leibniz Institute for Media Research | Hans Bredow Institute and Associated Researcher at the Humboldt Institute for Internet and Society.

[3] Shannon Raj Singh on Twitter: “Yesterday was my last day” [@ShannonRSingh] <https://twitter.com/ShannonRSingh/status/1588591603622772736> accessed 15 November 2022.

[4] Wolfgang Schulz, ‘Changing the Normative Order of Social Media from Within: Supervisory Bodies’ in Edoardo Celeste, Amélie Heldt and Clara Iglesias Keller (eds), Constitutionalising Social Media (Hart Publishing 2022) 238.

[5] Schulz (n 4) 239.

[6] Matthias Goldmann, ‘A Matter of Perspective: Global Governance and the Distinction between Public and Private Authority (and Not Law)’ (2016) 5 Global Constitutionalism 48; Giovanni De Gregorio, ‘Digital Constitutionalism across the Atlantic’ (2022) 11 Global Constitutionalism 297.

[7] Beate Kohler-Koch and Berthold Rittberger, ‘The Governance Turn in EU Studies Review Article’ (2006) 44 Journal of Common Market Studies 27; Stephen J Ball, ‘The Governance Turn!’ (2009) 24 Journal of Education Policy 537.

[8] Gunther Teubner, ‘Substantive and Reflexive Elements in Modern Law’ (1982) 17 Law & Society Review 239.

[9] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (‘Digital Services Act’; ‘DSA’).

[10] From a U.S. perspective, Daphne Keller, ‘Who Do You Sue? State and Platform Hybrid Power over Online Speech’ (Stanford PACS) <https://pacscenter.stanford.edu/publication/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech/> accessed 16 November 2022 has already addressed the intersection and interaction between private and public power on online platforms; Giovanni De Gregorio and Roxana Radu, ‘Digital Constitutionalism in the New Era of Internet Governance’ (2022) 30 International Journal of Law and Information Technology 68 view the takeover of public functions by private digital companies as a form of hybridization of Internet governance as well; Jean-Marie Chenou and Roxana Radu, ‘The “Right to Be Forgotten”: Negotiating Public and Private Ordering in the European Union’ (2019) 58 Business & Society 74 illustrate how the regulation of search engines is causing the private-public dichotomy to blur. Sometimes self-regulation that comes about with the cooperation of government agencies is also called ‘hybrid regulation’, Wolfgang Hoffmann-Riem, ‘Selbstregelung, Selbstregulierung und regulierte Selbstregulierung im digitalen Kontext’ in Michael Fehling and Utz Schliesky (eds), Neue Macht- und Verantwortungsstrukturen in der digitalen Welt (Nomos Verlagsgesellschaft mbH & Co KG 2016) 41 et seq.

Otherwise, previous research on platform regulation has focused less on substantive hybrid communication rule structures and more on hybrid institutions, Kate Klonick, ‘The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression’ (2020) Vol. 129 Yale Law Journal 2418; Martin Fertmann and others, ‘Hybrid Institutions for Disinformation Governance: Between Imaginative and Imaginary’ (Internet Policy Review) <https://policyreview.info/articles/news/hybrid-institutions-disinformation-governance-between-imaginative-and-imaginary/1669> accessed 16 November 2022; Thomas Kadri and Kate Klonick, ‘Facebook v. Sullivan: Public Figures and Newsworthiness in Online Speech’ (2019) 93 Southern California Law Review 37; Event: Surveying the Hybrid Speech Governance Landscape (Directed by R Street Institute, 2022) <https://www.youtube.com/watch?v=0O2SgDd3S3k> accessed 16 November 2022; Schulz (n 4).

[11] For a comprehensive analysis of the complex regulatory structures on the Internet, Matthias C Kettemann, The Normative Order of the Internet (Oxford University Press 2020).

[12] Jack Balkin, ‘Free Speech Is a Triangle’ (2018) Vol. 118 Columbia Law Review 2011; Jack Balkin, ‘Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation’ (2018) 51 UC Davis Law Review 1151, 1186 et seqq.

[13] Balkin, ‘Free Speech Is a Triangle’ (n 12) 2021 et seqq.

[14] See also Kate Klonick, ‘The New Governors: The People, Rules, and Processes Governing Online Speech’ (2018) 131 Harvard Law Review 1599; Heike Schweitzer, ‘Digitale Plattformen Als Private Gesetzgeber: Ein Perspektivwechsel Für Die Europäische „Plattform-Regulierung“’ (2019) Zeitschrift für Europäisches Privatrecht 1; Ulrich Dolata, ‘Plattform-Regulierung. Koordination von Märkten und Kuratierung von Sozialität im Internet’ (2019) 29 Berliner Journal für Soziologie 179; Matthias C Kettemann and Wolfgang Schulz, ‘Setting Rules for 2.7 Billion: A (First) Look into Facebook’s Norm-Making System; Results of a Pilot Study’ (2020) Working Papers of the Hans-Bredow-Institut | Works in Progress.

[15] Luca Belli and Jamila Venturini, ‘Private Ordering and the Rise of Terms of Service as Cyber-Regulation’ (2016) 5 Internet Policy Review <https://policyreview.info/node/441> accessed 16 November 2022; Edoardo Celeste, ‘Terms of Service and Bills of Rights: New Mechanisms of Constitutionalisation in the Social Media Environment?’ (2019) 33 International Review of Law, Computers & Technology 122.

[16] Tarleton Gillespie, ‘Regulation of and by Platforms’ in Jean Burgess, Alice Marwick and Thomas Poell, The SAGE Handbook of Social Media (SAGE Publications Ltd 2018); Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (Yale University Press 2018).

[17] Balkin, ‘Free Speech Is a Triangle’ (n 12) 2015; see also Jack Balkin, ‘Old-School/New-School Speech Regulation’ (2014) 127 Harvard Law Review 2296.

[18] Cf. Balkin, ‘Old-School/New-School Speech Regulation’ (n 17) 2298.

[19] Balkin, ‘Free Speech Is a Triangle’ (n 12) 2015 et seqq.; see also Balkin, ‘Old-School/New-School Speech Regulation’ (n 17).

[20] On the trend of delegation of law enforcement by public authorities to private entities see Jody Freeman and Martha Minow (eds), Government by Contract: Outsourcing and American Democracy (Harvard University Press 2009); De Gregorio and Radu (n 10) 78.

[21] Balkin, ‘Free Speech Is a Triangle’ (n 12) 2030 et seqq.

[22] Gillespie (n 16).

[23] Cf. Balkin, ‘Free Speech Is a Triangle’ (n 12) 2014. The presentation here is simplified and modified from the original model for presentation purposes. Our presentation focuses on constellations on social media platforms. In particular, voice, protest and exit by users vis-à-vis public authorities and digital infrastructures have been omitted here.

[24] Such as collateral censorship and digital prior restraint in relation to new-school speech regulation, and arbitrary private bureaucracy without due process and transparency in relation to private governance, Balkin, ‘Free Speech Is a Triangle’ (n 12) 2016 et seqq.

[25] Balkin, ‘Free Speech Is a Triangle’ (n 12) 2032 et seqq.; Jack Balkin, ‘How to Regulate (and Not Regulate) Social Media’ (2019) SSRN Electronic Journal <https://www.ssrn.com/abstract=3484114> accessed 16 November 2022.

[26] For an extensive analysis of this approach in the U.S. see Elettra Bietti, ‘A Genealogy of Digital Platform Regulation’ (2023) Georgetown Law Technology Review <https://papers.ssrn.com/abstract=3859487> accessed 17 November 2022; see also Klonick (n 14) 1603 et seqq.

[27] Giovanni De Gregorio, ‘The Rise of Digital Constitutionalism in the European Union’ (2021) 19 International Journal of Constitutional Law 41, 43 et seqq.; De Gregorio (n 6) 299; Amélie P Heldt, ‘EU Digital Services Act: The White Hope of Intermediary Regulation’ in Terry Flew and Fiona R Martin (eds), Digital Platform Regulation (Springer International Publishing 2022) 70.

[28] For online intermediaries, a general exemption from liability for user-generated content was introduced in Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’; ‘E-Commerce-Directive’). Accordingly, intermediaries are not liable if, after becoming aware of the illegality of the stored information or content, they immediately remove it or block access to the illegal information or content (Article 14). Furthermore, they have no general obligation to monitor content (Article 15). Section 230 of the U.S. Communications Decency Act served as a model in this regard.

[29] Robert Gorwa, ‘The Platform Governance Triangle: Conceptualising the Informal Regulation of Online Content’ (2019) 8 Internet Policy Review <https://policyreview.info/articles/analysis/platform-governance-triangle-conceptualising-informal-regulation-online-content> accessed 16 November 2022; Michael Denga, ‘Plattformregulierung Durch Europäische Werte: Zur Bindung von Meinungsplattformen an EU-Grundrechte’ (2021) Europarecht 569, 575 et seqq.; De Gregorio (n 6) 301 et seq. Others already saw limits to private rule-making by platforms in the E-Commerce-Directive, see Sophie Stalla-Bourdillon and Robert Thorburn, ‘The Scandal of Intermediary: Acknowledging the Both/and Dispensation for Regulating Hybrid Actors’ in Bilyana Petkova and Tuomas Ojanen, Fundamental Rights Protection Online (Edward Elgar Publishing 2020).

[30] Cf. Klonick (n 14) 1630 et seqq. who makes clear how general standards have evolved into increasingly concrete rules at YouTube and Facebook.

[31] Cf. Evelyn Douek, ‘Governing Online Speech: From “Posts-As-Trumps” to Proportionality and Probability’ (2020) Vol. 121 Columbia Law Review 759, 763: “Content moderation is a question of systemic balancing: Rules are written to encompass multiple interests, not just individual speech rights (…)”.

[32] Hannah Bloch-Wehba, ‘Global Platform Governance: Private Power in the Shadow of the State’ (2019) 72 SMU Law Review 27, 29; Schweitzer (n 14). On the proceduralization of private ordering at Facebook see also Kettemann and Schulz (n 14) 28 et seqq.; cf. also Thiago Dias Oliva, ‘Content Moderation Technologies: Applying Human Rights Standards to Protect Freedom of Expression’ (2020) 20 Human Rights Law Review 607, 157 et seq.

[33] Ruby O’Kane, ‘Meta’s Private Speech Governance and the Role of the Oversight Board: Lessons from the Board’s First Decisions’ (2021) 25 Stanford Technology Law Review 167, 180.

[34] Klonick (n 14) 1621; Kettemann and Schulz (n 14) 31.

[35] Concerning Facebook see Chinmayi Arun, ‘Facebook’s Faces’ (2021) Vol. 135 Harvard Law Review 236, 240; see also Alexandre De Streel and others, Online Platforms’ Moderation of Illegal Content Online: Laws, Practices and Options for Reform (Publications Office 2020).

[36] In the Global Network Initiative, for example, numerous platforms have committed themselves to observing principles based on international standards, in particular the International Covenant on Civil and Political Rights and the International Covenant on Economic, Social and Cultural Rights; see also Gorwa (n 29) 11 et seq.

For a comprehensive analysis of the voluntary implementation of International Human Rights Law in content moderation see Brenda Dvoskin, ‘Expert Governance of Online Speech’ <https://papers.ssrn.com/abstract=4175035> accessed 17 November 2022.

[37] De Gregorio and Radu (n 10) 79 summarize this as follows: “The hybridization trend stems from state ambitions of tight control via private intermediaries in illiberal regimes, on the one hand, and the transfer of public functions to unregulated digital platforms in liberal regimes, on the other”.

[38] See CJEU, Judgement of 24 Nov 2011, Case C-70/10, Scarlet; CJEU, Judgement of 16 Feb 2012, Case C‑360/10, Netlog. See De Gregorio (n 27) 51–53.

[39] Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

[40] Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online.

[41] De Gregorio (n 27) 59 et seqq.

[42] Although they are legally non-binding instruments, codes of conduct can have a de facto binding effect through pressure from governments, Keller (n 10) 6 et seq.; Fertmann and others (n 10); Gorwa (n 29) 13 et seq. This regulatory model is also taken up by the DSA in its Articles 45-47.

[43] Section 3(2) No. 2, 3 German Network Enforcement Act of 1 Sep, 2017 (BGBl. I p. 3352), as last amended by Article 3 of the Act of July 21, 2022 (BGBl. I p. 1182).

[44] Section 94 German Media State Treaty as amended by the Second Interstate Treaty Amending Interstate Treaties under Media Law (Second Interstate Treaty Amending Media Law) of 27 Dec, 2021 in force since Jun 30, 2022.

[45] German Federal Supreme Court, Judgements of 29 Jul 2021, III ZR 179/20 and III ZR 192/20; see also Matthias C Kettemann and Torben Klausa, ‘Regulating Online Speech: Ze German Way’ (Lawfare, 20 September 2021) <https://www.lawfareblog.com/regulating-online-speech-ze-german-way> accessed 18 November 2022; Matthias C Kettemann and Anna Sophia Tiedeke, ‘Back Up: Can Users Sue Platforms to Reinstate Deleted Content?’ (2020) 9 Internet Policy Review 1. For further examples of rulings in other countries, see Daphne Keller, ‘The EU’s new Digital Services Act and the Rest of the World’ (2022) Verfassungsblog <https://verfassungsblog.de/dsa-rest-of-world/> accessed 2 August 2023.

[46] In general, on the question of how T&C law is evolving into regulatory law in Germany Tobias Mast, ‘AGB-Recht Als Regulierungsrecht’ (2023) 78 JuristenZeitung (JZ) 287.

[47] For example, in Austria the Federal Act on Measures for the Protection of Users on Communication Platforms, Federal Law Gazette I No. 151/2020, or in France the Law No. 2020-766 of June 24, 2020 to fight against hateful content on the internet, which, however, was declared unconstitutional in large part.

[48] Cf. Giancarlo Frosio, ‘Platform Responsibility in the Digital Services Act: Constitutionalising, Regulating and Governing Private Ordering’ <https://papers.ssrn.com/abstract=4236510> accessed 18 November 2022.

[49] See Article 14 and Article 27 DSA.

[50] For the embedding of hybrid speech governance in the DSA, see section 5.

[51] Robert Gorwa, ‘What Is Platform Governance?’ (2019) 22 Information, Communication & Society 854, 861 et seqq.; Wolfgang Schulz and Thorsten Held, Regulierte Selbstregulierung als Form modernen Regierens: Endbericht (Hans-Bredow-Institut 2002); Gerald Spindler and Christian Thorun, ‘Die Rolle Der Ko-Regulierung in Der Informationsgesellschaft Handlungsempfehlung Für Eine Digitale Ordnungspolitik’ (2016) MultiMedia und Recht 1, 7 et seqq. who add market control to the picture. On different understandings of Internet governance, see Jeanette Hofmann, Christian Katzenbach and Kirsten Gollatz, ‘Between Coordination and Regulation: Finding the Governance in Internet Governance’ (2017) 19 New Media & Society 1406.

[52] Schulz (n 4) 239.

[53] Gorwa (n 51) 862 et seq.; Michael A Cusumano, Annabelle Gawer and David B Yoffie, ‘Can Self-Regulation Save Digital Platforms?’ (2021) 30 Industrial and Corporate Change 1259, 1271 et seqq. See also section 2.2.1.

[54] Schulz and Held (n 51); see also Christopher T Marsden, Internet Co-Regulation: European Law, Regulatory Governance and Legitimacy in Cyberspace (Cambridge University Press 2011); Michèle Finck, ‘Digital Co-Regulation: Designing a Supranational Legal Framework for the Platform Economy’ <https://papers.ssrn.com/abstract=2990043> accessed 22 November 2022.

[55] Hoffmann-Riem (n 10) 44.

[56] Cf. Schulz (n 4) 239 et seq.

[57] Schulz (n 4) 239.

[58] It is this overlap that leads to hybridity; thus, unlike David Levi-Faur, ‘Regulation and Regulatory Governance’ in David Levi-Faur, Handbook on the Politics of Regulation (Edward Elgar Publishing 2011) 13 et seqq. we do not want to call forms of regulation in which private and public ordering can be separated ‘hybrid regulation’.

[59] On the intersection of state and private power see also Keller (n 10).

[60] Kettemann and Schulz (n 14).

[61] On the complex interaction between corporate and government interests in relation to Facebook cf. Arun (n 35).

[62] On the inseparability of public and private ordering in the information society, see also De Gregorio and Radu (n 10) 82.

[63] Cf. Stephan Dreyer and others, ‘European Media Law in Times of Digitality’ in Matthias Kettemann, Alexander Peukert and Indra Spiecker gen. Döhmann, The Law of Global Digitality (1st edn, Routledge 2022) 183.

[64] On the necessity of new dimensions of constitutional law in the digital age cf. also Oreste Pollicino and Giovanni De Gregorio, ‘Constitutional Law in the Algorithmic Society’ in Hans-Wolfgang Micklitz and others (eds), Constitutional Challenges in the Algorithmic Society (1st edn, Cambridge University Press 2021) 14.

[65] On this matter cf., e.g., Anupam Chander, ‘Facebookistan’ (2012) 90 North Carolina Law Review 1807; Susan Benesch, ‘But Facebook’s Not a Country: How to Interpret Human Rights Law for Social Media Companies’ (2020) 38 Yale Journal on Regulation Bulletin 86; Anna Sophia Tiedeke, ‘Self-Statification of Corporate Actors?: Tracing Modes of Corporate Engagements with Public International Law’ (European University Institute 2022) Working Paper <https://cadmus.eui.eu/handle/1814/74562> accessed 14 August 2023.

[66] Goldmann (n 6) 48.

[67] At least in principle one may of course argue about the power imbalance between users and platform providers and its effect on private autonomy. Consequently, in Germany one way of binding private rules to constitutional rights builds on the laws for the control of general terms and conditions, cf. German Federal Supreme Court, Judgements of 29 Jul 2021, III ZR 179/20 and III ZR 192/20.

[68] Bloch-Wehba (n 32) 66; Gilad Abiri and Sebastian Guidi, ‘From a Network to a Dilemma: The Legitimacy of Social Media’ <https://papers.ssrn.com/abstract=4230635> accessed 24 November 2022 argue as follows: “Social networks are in a dilemma. The reason their legitimation crisis seems irresolvable is, simply, that they do not fit any existing cultural role. They are too public to be a corporation and too private to be a government”.

[69] On the legal situation in Germany see Louis Jakob Rolfes, ‘The Legitimacy of Rules of Virtual Communities’ (Humboldt-Universität zu Berlin 2022) <https://edoc.hu-berlin.de/handle/18452/24607> accessed 23 November 2022.

[70] For an overview of approaches see, e.g., Nicolas Suzor, Tess Van Geelen and Sarah Myers West, ‘Evaluating the Legitimacy of Platform Governance: A Review of Research and a Shared Research Agenda’ (2018) 80 International Communication Gazette 385.

[71] Blayne Haggart and Clara Iglesias Keller, ‘Democratic Legitimacy in Global Platform Governance’ (2021) 45 Telecommunications Policy 102152. They build on Fritz Wilhelm Scharpf, Governing in Europe: Effective and Democratic? (Oxford University Press 1999); Vivien A Schmidt, ‘Democracy and Legitimacy in the European Union Revisited: Input, Output and “Throughput”’ (2013) 61 Political Studies 2.

[72] Cf. also Kettemann and Schulz (n 14) 28 et seqq.

[73] Haggart and Keller (n 71) 14 et seq.

[74] Haggart and Keller (n 71) 15.

[75] Gregor Bachmann, Private Ordnung (Mohr Siebeck 2006) 179 et seqq.

[76] Haggart and Keller (n 71) 2.

[77] Jeremy Waldron, ‘The Rule of Law in Public Law’ in Mark Elliott and David Feldman (eds), The Cambridge Companion to Public Law (Cambridge University Press 2015); András Sajó, ‘The Rule of Law’ in Roger Masterman and Robert Schütze (eds), The Cambridge Companion to Comparative Constitutional Law (1st edn, Cambridge University Press 2019). For the example of European and German law see Calliess, ‘EU-Vertrag (Lissabon) Art. 2 [Die Werte der Union]’ in Calliess and Ruffert (eds), EUV/AEUV (6th edn, 2022); Huster and Rux, ‘GG Art. 20 [Bundesstaatliche Verfassung; Widerstandsrecht]’ in Epping and Hillgruber (eds), BeckOK Grundgesetz (52nd edn, 2022) 138 et seq.

[78] On specific requirements in German law see Jarass, ‘GG Art. 20 [Verfassungsrechtliche Grundprinzipien; Widerstand]’ in Jarass and Pieroth (eds), Grundgesetz für die Bundesrepublik Deutschland (17th edn, 2022) 37 et seqq. On EU law see Calliess (n 77) 27. See also Article 2 a) Regulation (EU, Euratom) 2020/2092 of the European Parliament and of the Council of 16 December 2020 on a general regime of conditionality for the protection of the Union budget.

[79] Especially since, from a state-centered perspective, the concept is already subject to national peculiarities, Geranne Lautenbach, The Concept of the Rule of Law and the European Court of Human Rights (Oxford University Press 2013).

[80] Nicolas Suzor, ‘The Role of the Rule of Law in Virtual Communities’ (2010) 25 Berkeley Technology Law Journal 1817; Nicolas Suzor, ‘Digital Constitutionalism: Using the Rule of Law to Evaluate the Legitimacy of Governance by Platforms’ (2018) 4 Social Media + Society; Niva Elkin-Koren and Maayan Perel, ‘Guarding the Guardians: Content Moderation by Online Intermediaries and the Rule of Law’ in Giancarlo Frosio (ed), Oxford Handbook of Online Intermediary Liability (Oxford University Press 2020); Stephan Koloßa, ‘Facebook and the Rule of Law’ (2020) Zeitschrift für ausländisches öffentliches Recht und Völkerrecht 509; cf. also Bloch-Wehba (n 32) 71 et seqq. arguing in favor of a transfer of administrative law principles (transparency, reasoned decision-making, user participation, and judicial review).

[81] The national and international fundamental rights systems differ in their construction of the “state actor” and also vary in terms of concepts to (indirectly) bind non-state actors to fundamental rights.

[82] ECtHR, Judgement of 6 May 2003, Application no. 44306/98, Appleby and others; Christian Starck, ‘State Duties of Protection and Fundamental Rights’ (2009) 3 Potchefstroom Electronic Law Journal/Potchefstroomse Elektroniese Regsblad; Kettemann and Tiedeke (n 45).

[83] Cf. Ralf Müller-Terpitz, ‘Soziale Netzwerke als Gegenstand des geltenden Rechts. Eine Rechtssystematische Einordnung’ in Martin Eifert and Tobias Gostomzyk (eds), Netzwerkrecht (Nomos Verlagsgesellschaft mbH & Co KG 2018); Anna Kellner, Die Regulierung der Meinungsmacht von Internetintermediären (1st edn, Nomos 2019) 91 et seqq.; Samira Tief, Kommunikation auf Facebook, Twitter & YouTube: Verfassungsrechtlicher Schutz der Informationsintermediäre und ihrer Nutzer durch die Medienfreiheiten (1st edn, Duncker & Humblot 2020) 93 et seqq.

[84] On this so-called ‘jawboning’ see Derek E Bambauer, ‘Against Jawboning’ (2015) 100 Minnesota Law Review 51; Keller (n 10) 5 et seqq.; David Greene, ‘When “Jawboning” Creates Private Liability’ (Electronic Frontier Foundation, 21 June 2022) <https://www.eff.org/deeplinks/2022/06/when-jawboning-creates-private-liability> accessed 30 November 2022.

[85] Indeed, in examining Facebook’s community standards, the Court interpreted Facebook’s rules in the light of fundamental rights. The court thus concluded that Facebook may in principle set up communication rules in its T&C, even when those T&C go beyond national libel laws. This is because of Facebook’s own fundamental rights. However, due to its users’ fundamental rights, there must always be an objective reason for removing content and blocking user accounts. In addition, the platform must provide for certain procedural requirements in its T&C, for example information, a statement of reasons, and the possibility of a counterstatement in the case of content deletions, see German Federal Supreme Court, Judgements of 29 Jul 2021, III ZR 179/20 and III ZR 192/20; cf. also German Federal Constitutional Court, Decision of 22 May 2019, 1 BvQ 42/19, III. Weg; Daniel Holznagel, ‘Overblocking Durch User Generated Content (UGC) – Plattformen: Ansprüche Der Nutzer Auf Wiederherstellung Oder Schadensersatz?’ (2018) 34 Computer und Recht 369, 371 et seq.; Lena Isabell Löber and Alexander Roßnagel, ‘Das Netzwerkdurchsetzungsgesetz in Der Umsetzung’ (2019) Multimedia und Recht 71, 75; Simon Jobst, ‘Konsequenzen Einer Unmittelbaren Grundrechtsbindung Privater’ (2020) Neue Juristische Wochenschrift 11; Judit Bayer, ‘Rights and Duties of Online Platforms’ in Judit Bayer and others (eds), Perspectives on Platform Regulation: Concepts and Models of Social Media Governance Across the Globe (Nomos Verlagsgesellschaft mbH & Co KG 2021) 38.

[86] For the Netherlands see District Court of Amsterdam, Judgement of 9 Sep 2020, C/13/687385; for Italy see Court of Rome, Decision of 29 Apr 2020, 80961/19.

[87] Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online.

[88] João Pedro Quintais, Naomi Appelman and Ronan Ó Fathaigh, ‘Using Terms and Conditions to Apply Fundamental Rights to Content Moderation’ (2023) 24 German Law Journal 881, 890 view this as the source of inspiration for Art. 14(4) DSA.

[89] Concerning German law see Rolfes (n 69) 115 et seqq.

[90] Matthias C Kettemann and Martin Fertmann, ‘Making Platforms Rules More Democratic: Are Social Media Councils the Way to Go?’; Matthias C Kettemann and Martin Fertmann, Platform-Proofing Democracy. Social Media Councils as Tools to Increase the Public Accountability of Online Platforms (Friedrich-Naumann-Stiftung für die Freiheit 2021).

[91] SPD (Social Democrats), Bündnis90/Die Grünen (Greens) and FDP (Liberals), Coalition Agreement of the German Government (2021-2025): Mehr Fortschritt Wagen - Bündnis Für Freiheit, Gerechtigkeit Und Nachhaltigkeit, p 14.

[92] Cf. Article 52(1) EU Charter of Fundamental Rights.

[93] Verica Trstenjak and Erwin Beysen, ‘Das Prinzip Der Verhältnismäßigkeit in Der Unionsrechtsordnung’ (2012) Europarecht 265, 274 et seqq.; Wolf Sauter, ‘Proportionality in EU Law: A Balancing Act?’ (2013) 15 Cambridge Yearbook of European Legal Studies 439.

[94] Systemic balancing by platforms based on the proportionality principle already corresponds to the reality of platform governance, see Douek (n 31).

[95] For initial clarifications see Tobias Mast and Christian Ollig, ‘The Lazy Legislature – Incorporating and Horizontalising the Charter of Fundamental Rights through Secondary Union Law’ (2023) European Constitutional Law Review Forthcoming.

[96] Recital 47 DSA.

[97] See Wolfgang Hoffmann-Riem, Recht im Sog der digitalen Transformation (Mohr Siebeck 2022) 103.

[98] See also the mentions of the freedom to conduct a business and the freedom of contract in recitals 3, 45, and 52 of the DSA.

[99] Emphasis added.

[100] De Gregorio (n 6) 316.

[101] Cf. Eva Skobel, Regulierung nutzergenerierter Inhalte auf sozialen Netzwerken (Universität Trier 2021) 310.

[102] Cf. on the ‘one-sided fundamental rights pathos’ of the DSA Denga (n 29) 580.

[103] Pollicino and De Gregorio (n 64) 16 et seqq.

[104] See section 5.2.2.

[105] Mast and Ollig (n 95).

[106] Quintais, Appelman and Fathaigh (n 88) 894; Benjamin Raue, ‘Art. 14 DSA’ in Franz Hofmann and Benjamin Raue (eds), Digital Services Act – Gesetz über digitale Dienste (Nomos 2023) paras 74 et seqq.

[107] Cf. also Evelyn Douek, ‘Content Moderation as Systems Thinking’ (2022) 136 Harvard Law Review 528 arguing that meaningful accountability of content moderation systems may only be achieved by design choices and tradeoffs at the upstream level of content moderation, as a form of systems thinking.

[108] Ilaria Buri and Joris van Hoboken, ‘The Digital Services Act (DSA) Proposal: A Critical Overview’ (Institute for Information Law, University of Amsterdam 2021).

[109] Andreas Peukert, ‘Zu Risiken Und Nebenwirkungen Des Gesetzes Über Digitale Dienste (Digital Services Act)’ (2022) Kritische Vierteljahresschrift für Gesetzgebung und Rechtswissenschaft 57, 61 et seqq.; Andreas Peukert, ‘Five Reasons to be Skeptical About the DSA’ (Verfassungsblog) <https://verfassungsblog.de/power-dsa-dma-04/> accessed 1 December 2022.

[110] Cf. Jürgen Kühling, ‘»Fake News« und »Hate Speech« – Die Verantwortung Der Medienintermediäre Zwischen Neuen NetzDG, MStV Und Digital Services Act’ (2021) Zeitschrift für Urheber- und Medienrecht 461, 470.

[111] German Bundesrat Decision, Document 96/21 of 26 Mar, 2021, 10.

[112] Cf. Quirin Weinzierl, ‘Institutionalizing Parallel Governance’ (Verfassungsblog: On Matters Constitutional, 18 December 2020) <https://verfassungsblog.de/institutionalizing-parallel-governance/> accessed 7 April 2022. Others see it as a state-like obligation, Denga (n 29).

[113] For more on these specifications see Quintais, Appelman and Fathaigh (n 88) 895 et seqq.

[114] Quintais, Appelman and Fathaigh (n 88) 896.

[115] See German Federal Constitutional Court, Decision of 22 May 2019, 1 BvQ 42/19, III. Weg.

[116] Quintais, Appelman and Fathaigh (n 88) 903 et seqq.

[117] Quintais, Appelman and Fathaigh (n 88) 905 et seqq.

[118] Quintais, Appelman and Fathaigh (n 88) 907.

[119] Pollicino and De Gregorio (n 64) 17; De Gregorio (n 6) 312.

[120] Keller (n 10) 17; De Gregorio (n 6) 312.

[121] Keller (n 10) 3 et seq.; Kettemann and Tiedeke (n 45) 6; Jacob Kosakowski, ‘Delete and Repeat: The Problem of Protecting Social Media Users’ Free Speech from the Moderation Machine Notes’ (2022) 55 Suffolk University Law Review 65, 71 et seqq.

[122] For example, in Marsh v. Alabama, 326 U.S. 501 (1946), the Supreme Court recognized that a private company’s operation of a town performed a public function. Under these circumstances, a private company could be subject to the First Amendment. This jurisprudence, however, has been qualified to the extent that state action is now presumed only when the private entity exercises power that traditionally belonged exclusively to the state, Jackson v. Metropolitan Edison Co., 419 U.S. 345, 352 (1974). For further restrictions imposed by case law, see Keller (n 10) 8.

[123] In PruneYard Shopping Center v. Robins, 447 U.S. 74 (1980) the justices recognized that a private shopping mall could constitute a public forum in which the distribution of leaflets would have to be tolerated.

[124] For example, in Manhattan Community Access Corp. v. Halleck, 587 U.S. ___ (2019), the Supreme Court rejected the existence of a public forum in the case of a private operator of a public access television station.

[125] Johnson v. Twitter, Inc., No. 18CECG00078 (Cal. Superior Ct. 2018); Prager University v. Google LLC, No. 17-CV-06064-LHK (N.D. Cal. 2018); Nyabwa v. Facebook, No. 2:17-CV-24 (S.D. Tex. 2018); Prager University v. Google LLC, No. 18-15712 (9th Cir. 2020). See also Jonathan Peters, ‘The “Sovereigns of Cyberspace” and State Action: The First Amendment’s Application (or Lack Thereof) to Third-Party Platforms’ (2017) 32 Berkeley Technology Law Journal 989; Jane Bambauer, James Rollins and Vincent Yesue, ‘Platforms: The First Amendment Misfits Symposium: Compelled Speech: The Cutting Edge of First Amendment Jurisprudence’ (2022) 97 Indiana Law Journal 1047. This may only be judged differently in the case of profiles of public officials on social media platforms, Knight First Amendment Institute at Columbia University v. Trump, No. 18-1691 (2d Cir. 2019); see also Jason Wiener, ‘Social Media and the Message: Facebook, Forums, and First Amendment Follies Notes’ (2020) 55 Wake Forest Law Review 217. See also Davison v. Randall, No. 17–2002 (4th Cir. 2019).

[126] Daphne Keller, cited in David McCabe, ‘Supreme Court Poised to Reconsider Key Tenets of Online Speech’ The New York Times (19 January 2023) <https://www.nytimes.com/2023/01/19/technology/supreme-court-online-free-speech-social-media.html> accessed 14 August 2023.

[127] NetChoice, LLC v. Paxton, No. 21-51178 (5th Cir. 2022).

[128] NetChoice, LLC v. Att’y Gen., Fla., No. 21-12355 (11th Cir. 2022).

[129] Ann E. Marimow and Cat Zakrzewski, ‘Landmark Texas, Florida social media cases added to Supreme Court term’, The Washington Post (29 September 2023) <https://www.washingtonpost.com/politics/2023/09/29/supreme-court-social-media-florida-texas-google-facebook/> accessed 24 November 2023.

[130] For example, some suggest that courts should adapt their public forum jurisprudence to the digital age. In particular, functional considerations could be given to whether a forum on a platform has public characteristics, see Peters (n 125) 1022 et seqq., who in this respect proposes a development of Marsh.

Others go in a similar direction when they propose a sub-type of the public forum, namely a “social public forum”: Platforms are bound by the First Amendment only if they offer a digital space for the general public that is essential to public discourse, see Amélie Heldt, ‘Merging the Social and the Public: How Social Media Platforms Could Be a New Public Forum’ (2020) 46 Mitchell Hamline Law Review 1032 et seqq.

Some propose that legislators should act. Indeed, platforms, as gatekeepers of public discourse, could possibly be legally bound to more or less extensive must-carry obligations; at the same time, in light of the First Amendment, platform operators’ discretion in moderation would have to be respected, see Keller (n 10) 18–27; see also Kosakowski (n 121) 89 et seq.

Still others suggest that because of the platform companies’ First Amendment rights, there should be no direct regulation of moderation practices. Rather, through the combined application of antitrust and competition law, privacy and consumer protection law, and intermediary liability, incentives should be provided for platform operators to create a healthy digital public sphere, see Balkin, ‘How to Regulate (and Not Regulate) Social Media’ (n 25).

Some also argue that content moderation by platform operators is not protected by the First Amendment, which in turn could open up new regulatory options, Pauline Trouillard, ‘Social Media Platforms Are Not Speakers’ (2022) Ohio State Technology Law Journal Forthcoming.

[131] De Gregorio (n 6) 316.

[132] Mast and Ollig (n 95).

[133] Anu Bradford, ‘The Brussels Effect’ (2012) Vol. 107 Northwestern University Law Review 1; Anu Bradford, The Brussels Effect: How the European Union Rules the World (Oxford University Press 2020).

[134] Cf. also Keller (n 45) who argues for a cautious export of the DSA to the U.S. and other jurisdictions based on initial experience with the practical application of the new European rules.

[135] Yuntao Bai and others, ‘Constitutional AI: Harmlessness from AI Feedback’ <http://arxiv.org/abs/2212.08073> accessed 14 August 2023; Kyle Wiggers, ‘Anthropic Thinks “constitutional AI” Is the Best Way to Train Models’ (TechCrunch, 9 May 2023) <https://techcrunch.com/2023/05/09/anthropic-thinks-constitutional-ai-is-the-best-way-to-train-models/> accessed 14 August 2023.

[136] Wolfgang Schulz and Christian Ollig, ‘We Don’t Need No Education?: Teaching Rules to Large Language Models through Hybrid Speech Governance’ (VerfBlog, 9 November 2023) <https://verfassungsblog.de/we-dont-need-no-education/> accessed 11 November 2023.
