Recommenders you can rely on: A legal and empirical perspective on the transparency and control individuals require to trust news personalisation

  1. Max van Drunen
  2. Brahim Zarouali
  3. Natali Helberger

Abstract

This article explores the role law can play to support trust in the context of news personalisation. The need to ensure trust in the face of technological changes in information dissemination is an important aspect of both recent horizontal legislation such as the Digital Services Act and context-specific efforts surrounding, for example, disinformation. In these legal discussions, however, what trust is, why law should promote it, and what concrete measures are suitable to do so often remain ambiguous. This raises the question whether trust is simply a selling point for traditional legal measures, and if not, what concrete role law can and should play to promote it. This article focuses on the role control and transparency measures can play in safeguarding trust in organisations that use news personalisation. It first analyses how trust should be understood in the context of news personalisation, how media regulation has traditionally supported trust, and how it should continue to do so in the context of news personalisation. It then draws on a conceptual framework of transparency measures in the context of news personalisation to survey how important different transparency and control measures are to the individuals who place trust in organisations that use personalisation. Law’s current focus on informing individuals about personalisation and empowering them to stop it does not account for the importance of enabling individuals to control how news is personalised.

1. Introduction

1

Trust is an intuitively appealing concept. It implies individuals can rely on other parties or technologies without having to fully understand or control them. This has always been crucial for organisations that provide news to an audience that does not have the access, expertise, or time to verify this information. [1] It takes on added importance now that information is increasingly distributed with the use of algorithms that are hard even for experts to fully understand. A number of recent policy initiatives, including horizontal regulations such as the Digital Services Act (DSA) and sector-specific policies surrounding disinformation, accordingly, highlight the need to increase trust in the online environment. [2] This article focuses on a specific technology that helps individuals to navigate the online media environment, namely news personalisation. Personalisation is used by online platforms to determine what (if any) news is shown to which individual based on their characteristics, and is also one of the most important applications of automated decision-making in the traditional news media. [3]

2

The relationship between regulation and trust is complicated. Simply focusing on the need to increase trust shifts attention away from the need to ensure companies using personalisation algorithms are actually trustworthy, and puts the emphasis on the need for individuals to accept them. [4] Ensuring trustworthiness, for example by regulating the data used in personalisation or by requiring platforms to limit the risks their recommender systems pose, has accordingly been an important part of the legal debate. [5] Ensuring trustworthiness, however, does not automatically lead to trust—individuals must also be able to determine whether they can trust another party. Transparency and control, especially concerning the need for algorithmic explainability, have played a dominant role in this context. [6] Indeed, the provisions in the DSA dedicated to recommender systems focus exclusively on transparency and control. [7]

3

How the regulatory approach to trust relates to the perspective of the individuals who interact with (personalisation) algorithms remains underexplored. Legal discussions instead highlight why trust in technology is important, how technological transformations generally challenge trust, and what role legal measures should play in safeguarding trust. [8] At the same time, existing empirical literature focused on trust in personalisation remains disconnected from normative discussions over why and how regulation should enable trust. [9] In the face of this disconnect between legal and empirical discussions on trust, regulation has to promote trust without taking into account the perspectives of the individuals who actually place their trust in organisations using personalisation to inform them. This limits our understanding of how law can promote trust in a manner that supports both normative objectives as well as individuals’ needs.

4

This article explores, from the perspective of individuals, how trust in organisations that use personalisation should be safeguarded through transparency and control measures. It combines an analysis of the ways in which legislation can safeguard trust in the context of personalisation with a survey that explores the perceptions of the individuals who place trust. In particular, we explore how important respondents report different control and transparency measures to be to their trust in organisations that use personalisation to inform them. By focusing on news personalisation the article aims to account for the context-specific challenges which arise when decision-making is automated in a specific field such as the media. The underlying assumption is that trust in technology, and the reasons why regulation should promote it, are shaped by the specific task which technology is relied on to perform.

5

Section B defines trust in the context of news personalisation and analyses the reasons why and the ways in which media regulation has been used to promote trust. Sections C and D connect this analysis to the way in which individuals form trust in technology. The sections draw on a conceptual framework of algorithmic transparency in the context of news personalisation to develop and report the results of a survey that gauges what transparency, control, and (self-)regulation individuals find important when they determine whether to trust organisations which use personalisation to inform them. [10] The article concludes by outlining how regulation can enable individuals to trust organisations that use personalisation to inform them.

2. The relationship between law, trust, and news personalisation

2.1. Trust and its role in law

6

This paper defines trust as the willingness to be vulnerable to the actions of another based on positive expectations about their actions. [11] Although it has been notoriously difficult to reach a consensus about the exact meaning of trust, this definition contains three commonly used elements which are important to understand this article’s approach to trust and its relation to law and the media. First, trust is relational: it involves one party (the trustor) placing trust in another (the trustee). The exact nature of this ‘another’ is quite flexible. Literature on trust in the media has traditionally focused on trust in the media as an institution, specific types of media (such as print or broadcasting), or a specific organisation, journalist, or message. [12] Research into the impact of the use of technology, including personalisation, on trust in media is generally incorporated into these existing approaches. Studies have for example explored to what extent individuals are willing to trust specific types of media that heavily rely on personalisation (such as social media), or how the use of personalisation impacts individuals’ trust in the organisation that uses personalisation. [13] This article similarly approaches personalisation as another factor that can influence individuals’ trust in the organisation that informs them, rather than treating personalisation algorithms themselves as a new object of trust.

7

Second, trust involves vulnerability. Trust only comes into play when something is at stake, and the possibility exists that the trustor’s vulnerability will be exploited. [14] Vulnerability also tailors trust definitions to specific contexts. Trust in the media typically centres on its editorial function, that is, whether it can be expected to provide relevant and reliable information. [15] Operationalisations of trust in media capture different aspects of this editorial function, such as accuracy, comprehensiveness, and fairness. [16] Personalisation changes the way in which (some of) these editorial functions are fulfilled. Instead of an editor deciding what information the audience should see, each individual is given their own selection of articles by a personalisation algorithm controlled by editors, engineers, and/or business departments. [17] This change in the way organisations inform their audiences may particularly affect aspects of trust that concern the way the media selects what events to cover, such as trust in the comprehensiveness or diversity of the reporting. [18] Conversely, aspects of trust that are closely related to the way news is produced (such as trust in the accuracy of the reporting) may be unaffected by personalisation, at least when an organisation uses personalisation to recommend articles produced through its traditional editorial processes (as is often the case in the legacy news media). [19] It should also be noted that personalisation’s impact on trust is not necessarily negative. For example, individuals may trust algorithmically delivered news more when they perceive algorithms to be more neutral than human editors. [20] As we argue below, the goal of law in this context should not be to promote trust, but to ensure individuals’ trust is based on correct assumptions.

8

Vulnerability is also the element that can make trust such a hollow concept for legal literature. The need to prevent vulnerabilities from being exploited is nothing new in law, which already contains a wide range of values and mechanisms to do exactly that. These include specific values such as the right to receive information and privacy, as well as more overarching concepts such as autonomy. [21] Trust does not have any added analytical value in legal discussions if it is simply used to refer to the need to protect these values. The danger of trust being used in this way is exacerbated by the lack of a consensus on its precise definition. This ambiguity makes it possible to use trust as a rhetorical tool to refer to the need for technology, individuals, or institutions to act in line with an undetermined set of values every reader can fill in for themselves.

9

Trust is not only about one party being vulnerable to another, however. The third element of the definition above captures that trust is about an individual’s willingness to be vulnerable based on a positive expectation about the trustee’s actions. Trust thereby allows individuals to deal with the uncertainty over whether their vulnerability will be exploited. It does not require that every vulnerability is removed from an interaction, or that individuals engage in a fully rational cost-benefit analysis. [22] Instead, trust functions as a heuristic that allows individuals to avoid such a complex analysis. Affective approaches to trust emphasise the role of emotion in this process, such as a feeling of security, while cognitive approaches highlight that individuals can also more consciously draw on information in their trust judgments, such as a website’s presentation. It is important to note that these two approaches are not mutually exclusive; like many other decisions, trust is likely influenced by both affective and cognitive factors. [23]

10

Herein also lies trust’s added value for law. Trust captures an essential manner in which individuals determine whether they will interact with those around them—in this case organisations that use personalisation to inform them. Trust facilitates these interactions by giving individuals a fast way to assess whether their vulnerability will be exploited if they rely on another party. Simply reducing the level of vulnerability, for example through rules which require organisations to address risks posed by their personalisation algorithms or limit how organisations can use the data they collect to personalise the news, is not necessarily enough to enable individuals to trust. [24] Individuals must also be able to assess an organisation’s trustworthiness or be able to limit their vulnerability if they are not able to trust another party completely. Legal debates that ignore the function trust plays in daily life create the risk that individuals are not able to trust other individuals, organisations, or technologies, and are less able to interact with them as a result. This creates an issue when law aims to promote public values that enable individuals to interact with others, for example by receiving information from the media or privately informing themselves about controversial issues. From a legal perspective, trust accordingly functions as a bridge between regulatory efforts, which aim to secure public values (such as privacy or freedom of expression), and the actions which these regulatory efforts intend to enable individuals to take (such as receiving information which shapes their opinions or interacting with others without chilling effects).

2.2. Why media regulation is used to promote trust

11

At the most basic level, trust is relevant to legal discussions because of its ability to facilitate interactions. Societies are built on cooperative relationships, and individuals interact more easily when they are able to trust each other. [25] However, law’s interest in facilitating interactions is of course selective. There is no legal value in promoting individuals’ trust in actors who will exploit that trust, nor the kind of trust that leads to interactions that run counter to public values, such as that which is necessary for cartels or criminal organisations to function. [26] In the technological and media context of news personalisation, two goals in particular shape the kind of trust law aims to promote. [27]

 

12

The necessity of trust in media law discussions is primarily driven by arguments that focus on the media’s role in democratic society. The media’s ability to play this role is not only based on its ability to collect and distribute information, but also on the audience’s willingness to absorb and act on this information. In an information environment where individuals are not able to determine which organisations they can trust, the media cannot fulfil its function as a public watchdog or source of information. [28] Similarly, citizens cannot fulfil their role in the democratic process unless they are able to trust media organisations. Citizens rely on the media to provide them with information which they do not have the time, resources, or access to obtain themselves. Conversely, a lack of trust severely limits the information that citizens can use to take part in the political process. In other words, the media’s ability to fulfil its role in society presumes that citizens are able to trust the media. [29]

13

Economic goals feature particularly prominently in the broader legal discussion on the need for trust in Artificial Intelligence. In the words of the Commission, “lack of trust is a main factor holding back a broader uptake of AI.” [30] A lack of trust is thereby framed as an economic inefficiency preventing individuals from using AI that is able to provide valuable services. Trust’s role as a precondition for acceptance has a long history. Some of the earliest research into trust in the media focused on the impact of perceived trustworthiness on the acceptance of a message. [31] Literature on trust in personalisation systems often continues to take a rather short-term approach to promoting trust, sometimes simply operationalising trust as the acceptance of the system or its recommendations. [32] AI policy emphasises the need for a more long-term acceptance of AI, for which the technology needs to earn trust and be consistently trustworthy. [33]

14

Democratic and economic perspectives on trust in the media can complement one another. Both focus on ensuring that a media organisation earns the trust of its audience by doing what it is relied on to do. Economic perspectives focus on the financial value of this interaction. Although this aspect is not the focal point of media law discussions, the need to create a media system in which quality journalism is financially sustainable and disinformation is not, is increasingly emphasised. [34] Trust has a part to play in this context, given the relationship between trust and media use—as well as media scepticism and use of non-mainstream sources. [35] The broader literature on media transparency accordingly highlights the importance of trust for the financial health of the media. [36]

15

Regulation’s ability to secure trust in the context of news personalisation is limited precisely because of the centrality of trust to the ability of media organisations to fulfil their democratic role. Regulations that require media organisations to act in a trustworthy way would allow for political interference in the manner in which the media and citizens interact. Wijermars, for example, has analysed how Russian legislation passed to preserve the “truthfulness and trustworthiness of the information that our citizens receive” enables the state to control the output of news recommenders by limiting the kinds of sources they can recommend. [37] EU media regulation has therefore traditionally established only limited minimum norms regarding editorial responsibility, concerning among others an obligation to protect children from harmful content and a prohibition on subliminal advertising. [38] As the next section explores further, regulation aims to create the conditions under which individuals can form trust in the media instead, for example through transparency norms that allow individuals themselves to evaluate the trustworthiness of media organisations or media content.

2.3. How media regulation promotes trust through transparency and control options

16

Transparency and control can make it easier for individuals to determine whether they will trust another party by allowing them to be less uncertain and vulnerable. At least from a conceptual perspective, this could prevent individuals from placing as much trust in others as they otherwise would. After all, transparency and control reduce the level of uncertainty and vulnerability that make trust possible. A similar argument is sometimes made with regard to the general relationship between law and trust. By requiring individuals and companies to (for example) not violate individuals’ privacy, law arguably takes away their ability to demonstrate their trustworthiness voluntarily. [39]

17

The concern that legal measures displace trust inherently only applies when individuals would have placed trust even without, for example, transparency or control. However, as argued above, regulation is used to enable individuals to trust precisely in situations where they would otherwise feel too uncertain or too vulnerable to do so. That is, media regulation lowers the bar for trust, making it easier for individuals to place trust in a wider variety of actors. Although this may limit the trust individuals would have placed in trustworthy actors even without legal measures being in place, this limitation must be seen in the context of the wider group of actors. Furthermore, the empirical evidence (at least in the context of the media) indicates transparency generally does have a positive (albeit small) impact on trust. [40] There are a wide variety of potential reasons for this, including the possibility that individuals see transparency as a signal that a company is trustworthy or are unaware of the fact that a company is only transparent because it is legally required to do so. [41]

18

The first way in which media regulation promotes trust is by aligning expectations. By forcing parties to make their assumptions explicit and clarify how they fulfil their roles, media regulation can prevent unintended trust violations. [42] In the context of the media, this way of promoting trust is strongly intertwined with the right to receive information, and more specifically its focus on enabling individuals to seek out a wide range of information. Regulation has traditionally facilitated the exercise of this right by ensuring the availability of information about the media organisation itself, thereby allowing individuals to evaluate how a media organisation fits into their media diet. [43] Article 5 of the EU’s Audiovisual Media Services Directive (AVMSD), for example, intends to make it easier for individuals to determine who is responsible for the content of the media service that shapes their opinion. [44] Personalisation can reduce the usefulness of this information, given that a media organisation shows each individual a different collection of news items. At the same time, personalisation creates the opportunity to better suit the expectations of the individual who places trust in the media. Not only is it possible to show each individual which (types of) articles have been shown to them specifically, personalisation also allows individuals to control the news they receive more directly and ensure that personalisation functions in a way that better aligns the goals of the media organisation with their own. Article 27 DSA, which regulates the recommender systems used by online platforms, aims to engage with these factors by better enabling individuals to understand and influence the parameters of the recommender systems that determine how information is prioritised for them. [45]

19

Secondly, transparency can enable and channel scepticism. By providing additional information and contextual cues, media regulation enables news consumers to assess for themselves whether they can trust reporting. [46] Although this can involve explanations of individual editorial decisions, media regulation has generally focused on higher-level explanations. Concretely, it involves information on the organisation providing the information, and whether editorial content is actually an advertisement. [47] In doing so, regulation enables trust judgments regarding specific content or sources. Yet, key from a trust perspective is that individuals are thus not expected to discount or double-check everything they read, but rather that they can make broader trust judgments and rely on reporting until explanations trigger their scepticism. [48]

20

Scepticism is at first glance incompatible with trust. However, media regulation prevents individuals from having to adopt generalised scepticism to the media as a whole by enabling individuals to distinguish between the trustworthiness of different pieces of media content. [49] For example, the distinction between commercial and editorial content allows individuals to accept that while a media organisation may be influenced by external commercial pressures, these pressures are limited to the types of content labelled as advertising. [50] Distinctions in self-regulatory ethics codes, such as the duty to clearly separate news and opinion, fulfil a similar function. Without such distinctions, individuals would be forced to adopt a more generalised scepticism to all reporting by a media organisation. Explanations of the different forces behind different content channel this scepticism, and thereby safeguard trust in the media organisation as a whole. [51]

21

Finally, media regulation can enable trust repair. As citizens increasingly question journalists’ authority, it is not enough to put out responsibly produced content and assume that it will earn the trust of readers. It is also necessary to address questions as to journalistic authority by highlighting the accountability mechanisms with which the media organisation tries to prevent, detect, disclose, and address (perceived) violations of individuals’ trust. [52] At the most basic level, this includes transparency on the norms to which media organisations consider themselves held, and acknowledgments when their reporting fails to live up to such norms. More recent work also emphasises the importance of providing the audience with a way to act on these explanations by providing criticism and feedback. [53] Through these accountability processes, a more responsible media system can be incentivised. [54] Going a step further, individuals could also be given the option to (temporarily) assume more control over the manner in which a media organisation recommends news to them to create a space in which trust can be repaired. In this way, the media can limit the negative impact of (perceived) trust violations by giving the audience the opportunity to voice their scepticism and showing how these concerns are taken into account. [55]

22

News personalisation challenges the way in which existing legal transparency and control measures can enable trust by changing the way news is delivered. As argued above, the increasing use and importance of news personalisation impact trust by changing the way in which organisations select what news their audience should be informed about. However, transparency or empowerment measures tailored to the traditional media system do not necessarily enable individuals to assess the trustworthiness of the algorithmic tools that increasingly determine how they are informed. In this context, factors such as the way a personalisation algorithm impacts an individual’s news diet, the (editorial) values it is designed to promote, or the type of content it can recommend are relevant as well. These factors generally fall outside the scope of traditional transparency and empowerment measures, however, as those measures focus on the content that is published (for example by requiring that any commercial content is clearly identified, and that a wide variety of content is available) or on publishers themselves (for example by requiring the disclosure of the identity of the media organisation or commercial party influencing content, and ensuring the media system contains a variety of sources of content with which individuals can engage). [56] As personalisation algorithms increasingly mediate how individuals are exposed to content or sources, it becomes more important to adapt and expand on traditional transparency and empowerment measures in media regulation to allow individuals to assess the trustworthiness of the way information is algorithmically selected for them.

2.4. Surveying individuals’ perspective on trust and law

23

Policy efforts such as the DSA and the various EU disinformation codes are increasingly beginning to reinvent the role that law can play to safeguard trust in the light of the technological changes in the online media environment. What remains unclear, however, is to what extent regulatory initiatives aiming to promote trust in the media in the context of technological change are in line with the way in which individuals form trust. This aspect is crucial because it is ultimately the individuals themselves who determine whether they do or do not trust. If regulation is expected to actually promote the trust necessary for individuals and the media to fulfil their role in democratic society, it needs to take into account the perspective of the individuals who place this trust in the media.

24

To that end, Sections C and D report on the methodology and results of the survey exploring the transparency and control items that individuals find significant when it comes to their trust in organisations using personalisation to inform them. The items (see Table 2) were developed from a conceptual framework of algorithmic transparency obligations in the context of news personalisation. The framework combines algorithmic transparency and media transparency literature to distinguish between disclosures concerning the organisation that operates the personalisation algorithm, the sources shown, the data used, the algorithm itself, and the output. [57] For the purposes of this survey, the framework was expanded with a number of control options serving as counterparts to the transparency items, [58] as well as recent regulatory measures put forward in the context of trust in platform and disinformation discussions. [59] The first set of research questions explores how important these transparency and control measures are to individuals when it comes to their trust in organisations which use personalisation to inform them.

 

RQ1a: how important are legal transparency measures to individuals’ trust in organisations that use news personalisation to inform them?

RQ1b: how important are legal control measures to individuals’ trust in organisations that use news personalisation to inform them?

RQ2: is there a difference between the importance of transparency and control measures to individuals’ trust?

25

News personalisation has the potential to impact individuals’ trust in the organisations that use it because it changes the way in which the audience is informed. [60] If that impact is negative, the use of personalisation would further limit the media’s ability to fulfil its role in society by reducing the number of individuals with high trust in the media. It is therefore important to know how news personalisation can be explained to or made controllable for individuals who already trust the media. At the same time, considerable policy and research attention is devoted to the need to prevent a decrease in trust. Research into analogue media indicates transparency is unlikely to restore the trust of individuals who have already lost trust in the media, given that the transparency is provided by a party they consider untrustworthy. [61] Conversely, control options may not face the same challenge because they allow an individual to limit the media’s influence over their news diet. [62] In order to explore to what extent the tested transparency and control measures are suitable to enable individuals with high and low trust in the media respectively to trust organisations that personalise their news, the research asks the following questions:

 

RQ3a: is the extent to which individuals find transparency measures important related to their existing trust in the media?

RQ3b: is the extent to which individuals find control measures important related to their existing trust in the media?

26

Similarly, the importance attached to transparency of and control over personalisation algorithms may depend on an individual’s existing level of algorithmic literacy. Individuals first have to know what personalisation is and how it might affect them to gain an interest in better understanding or controlling a personalisation algorithm. [63] Knowing what information and control measures (if any) are important to individuals with high algorithmic literacy may indicate which types of measures will become more important as public awareness of the personalisation algorithms used by platforms grows. [64] This article thus aims to explore the following questions:

 

RQ4a: is the extent to which individuals find transparency measures important related to their algorithmic literacy?

RQ4b: is the extent to which individuals find control measures important related to their algorithmic literacy?

27

Finally, law’s ability to safeguard trust entails more than empowering individuals to protect themselves through transparency and control measures. An important way in which law protects trust is by prohibiting certain forms of behaviour, effectively reducing individuals’ level of vulnerability. The AI Act, which prohibits certain AI practices deemed to pose an unacceptable risk, is an important recent example of this approach. Along similar lines, (self-)regulation of the media can limit unacceptable practices and provide individuals with further protection and certainty. [65] In other words, there can also be a role for further-reaching measures, either in the form of legal obligations or self-regulation, to protect the legitimate interests and rights of users and society.

 

RQ5a: how important are measures in (self-)regulation to individuals’ trust in organisations that use news personalisation to inform them?

RQ5b: is there a relationship between the importance of self-regulation and the importance of transparency to individuals’ trust in organisations that use news personalisation to inform them?

RQ5c: is there a relationship between the importance of self-regulation and the importance of control to individuals’ trust in organisations that use news personalisation to inform them?

3. Methodology

28

The survey (Annex A) was distributed among a representative sample of the Dutch population. The total sample size was N = 1009. Representativeness was achieved based on age, gender, education, and region. The data collection was carried out by the research company IPSOS between 15 and 20 April 2021 (5 days); the overall response rate was 27 per cent. The mean age of the sample was 48.17 years (SD = 16.68), ranging from 18 to 89. Half of the sample consisted of women (50 per cent). All respondents who successfully completed the survey received an incentive from the research company. A demographic overview of the sample is presented in Table 1.

 

 

Age categories (M = 48.17, SD = 16.68)
  • 18-34 years: 26.76% (N = 270)
  • 35-54 years: 32.80% (N = 331)
  • 55+ years: 40.44% (N = 408)

Gender
  • Women: 50.45% (N = 509)
  • Men: 49.55% (N = 500)

Education
  • Low: 16.65% (N = 168)
  • Moderate: 39.94% (N = 403)
  • High: 43.41% (N = 438)

Region
  • North: 8.52% (N = 86)
  • East: 22.60% (N = 228)
  • South: 25.77% (N = 260)
  • West: 29.83% (N = 301)
  • Three large cities (Amsterdam, Rotterdam & The Hague): 13.28% (N = 134)

Table 1. Demographic characteristics of the sample. Each entry shows the percentage of the sample, with the frequency (N) in parentheses.
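Because the percentages and frequencies in Table 1 are mutually redundant (each percentage equals the corresponding frequency divided by the sample size), every cell can be cross-checked against the rest; the frequency for the three large cities, for instance, follows from the sample total and the other regional counts. A minimal sketch of such a check in Python:

```python
# Cross-check of the regional rows of Table 1: the frequencies must sum to the
# full sample (N = 1009), and each percentage must equal frequency / N.
regions = {
    "North": 86,
    "East": 228,
    "South": 260,
    "West": 301,
    "Three large cities": 134,  # 1009 - (86 + 228 + 260 + 301)
}
n = sum(regions.values())
assert n == 1009

for name, freq in regions.items():
    print(f"{name}: {100 * freq / n:.2f}%")
# North: 8.52%, East: 22.60%, South: 25.77%, West: 29.83%, Three large cities: 13.28%
```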

4. Results

29

To answer RQ1a and RQ1b, we asked respondents to indicate how important a number of concrete transparency and control measures were to their trust in media organisations that use news personalisation to inform them. Answer options ranged from 1 (not important at all) to 7 (very important). The measures and associated mean values and standard deviations can be found in Table 2. We also provide the Cronbach’s alphas as estimates of internal consistency (both of which are very high). It can be concluded that all transparency and control measures are perceived to be important by the respondents. The mean scores are relatively high (all between 5 and 6, with 7 being the maximum score). This highlights that people find all the transparency and control measures concerning the media organisation, the sources, the data, the algorithm, and the output relatively important.
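As an illustration of how such item statistics are computed, the sketch below derives per-item means, standard deviations, and Cronbach’s alpha from a response matrix. The data frame and column names are synthetic stand-ins, not the authors’ analysis code; note that random stand-in data yields an alpha near zero, whereas the real scales reportedly reach .96 and .93 (see Table 2).

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Standard formula: alpha = k/(k-1) * (1 - sum of item variances / variance of the item sum)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Synthetic stand-in: 1009 respondents rating the 16 transparency items on a 1-7 scale.
rng = np.random.default_rng(seed=42)
transparency_items = pd.DataFrame(
    rng.integers(1, 8, size=(1009, 16)),
    columns=[f"transparency_{i + 1}" for i in range(16)],
)

print(transparency_items.mean().round(2))            # per-item means (cf. Table 2)
print(transparency_items.std(ddof=1).round(2))       # per-item standard deviations
print(round(cronbach_alpha(transparency_items), 2))  # internal consistency of the scale
```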

30

To provide an answer to RQ2, we calculated the average score of all the transparency and control items from Table 2. The average mean score of all the transparency items together is M = 5.47 (SD = 1.09); the average mean score for the control items is M = 5.54 (SD = 1.07). A paired t-test shows that there is a significant difference between these two values, meaning that people find control slightly more important than transparency: t(1008) = -4.46, p < .001. In addition, a Pearson correlation test shows that transparency and control are strongly correlated with each other (r = .90, p < .001). This means that the importance of transparency goes hand in hand with the importance of control measures.
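The reported degrees of freedom (1008 = N − 1) indicate a dependent-samples comparison of each respondent’s two composite scores. A minimal sketch of how that comparison and the accompanying correlation could be run, again on synthetic stand-in data rather than the authors’ actual analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)
n = 1009

# Synthetic per-respondent composites on the 1-7 importance scale: the mean of a
# respondent's transparency items and the mean of their control items.
transparency = rng.uniform(1, 7, n)
control = np.clip(transparency + rng.normal(0.07, 0.5, n), 1, 7)  # correlated, slightly higher

t, p = stats.ttest_rel(transparency, control)   # paired t-test, df = n - 1 = 1008
r, p_r = stats.pearsonr(transparency, control)  # association between the two composites
print(f"t({n - 1}) = {t:.2f}, p = {p:.3g}; r = {r:.2f}")
```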

31

To answer RQ3a and RQ3b, we conducted correlation analyses between individuals’ existing media trust and the perceived importance of transparency and control measures. Results indicate a weak positive correlation between media trust and control (r = .10, p < .01). The exact same pattern holds for transparency: a weak positive relationship with media trust (r = .12, p < .001). These findings mean that people who have higher media trust also find control and transparency slightly more important.

32

To answer RQ4a and RQ4b, we ran correlation tests between people’s algorithmic literacy and the perceived importance of transparency and control in news personalisation. Algorithmic literacy was measured with items derived from a study by Zarouali, Boerman, and de Vreese. [66] The correlation between algorithmic literacy and transparency was r = .39 (p < .001); between algorithmic literacy and control, r = .35 (p < .001). These correlation coefficients indicate a moderate positive relationship. This means that people with higher algorithmic literacy tend to perceive transparency and control measures as more important as well.
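Assuming the four awareness items from Annex A are averaged into one algorithmic-literacy score per respondent (a common scoring approach; the exact procedure used by Zarouali, Boerman, and de Vreese may differ), the reported correlations could be computed as follows. Column names and data are hypothetical:

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(seed=3)
n = 1009

# Four 1-7 awareness items (see Annex A); hypothetical column names.
awareness = pd.DataFrame(
    rng.integers(1, 8, size=(n, 4)),
    columns=["recommends", "shows_different", "customizes", "prioritizes"],
)
algorithmic_literacy = awareness.mean(axis=1)  # composite score per respondent

importance_transparency = rng.uniform(1, 7, n)  # stand-in composite importance score
r, p = stats.pearsonr(algorithmic_literacy, importance_transparency)
print(f"r = {r:.2f}, p = {p:.3g}")  # reported: r = .39 (transparency), r = .35 (control)
```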

33

To answer RQ5a, we asked respondents to indicate the importance of (self-)regulation at each of the five stages of the model. The average mean score of the importance of (self-)regulation to individuals’ trust is M = 5.33. Finally, in answering RQ5b and RQ5c, we again ran correlation tests. We found a strong positive relationship between the importance of (self-)regulation and the importance of control measures (r = .78, p < .001); the exact same strong positive correlation was also found between (self-)regulation and transparency (r = .78, p < .001). This indicates that the perceived importance of (self-)regulation is strongly associated with people’s perceived importance of transparency and control measures in news personalisation.

 

 

Items are grouped by the five stages of the framework; for each item, the mean (M) and standard deviation (SD) on the 1-7 importance scale are given in parentheses.

The media organisation

Transparency:
  • It is clear to what extent journalists and editors determine the way news is personalised. (M = 5.36, SD = 1.41)
  • It is clear whether commercial parties such as advertisers influence the way news is personalised. (M = 5.48, SD = 1.50)
  • It is clear to what extent the media organisation uses algorithms from other companies to personalise the news. (M = 5.38, SD = 1.42)

Control:
  • The ability to choose between the personalisation algorithms of different companies on a website. (M = 5.25, SD = 1.46)

The source of the articles

Transparency:
  • It is clear what the identity of the source of a recommended article is. (M = 5.62, SD = 1.37)
  • It is clear whether the source of a recommended article adheres to journalistic norms established by traditional media companies. (M = 5.53, SD = 1.35)
  • It is clear whether a recommended article comes from a government institution. (M = 5.60, SD = 1.38)
  • It is clear whether a recommended article is produced automatically or written by a human. (M = 5.53, SD = 1.39)

Control:
  • The ability to choose from which sources you will receive news. (M = 5.64, SD = 1.33)
  • The ability to choose to only receive news from sources that adhere to journalistic norms established by traditional media companies. (M = 5.56, SD = 1.37)

The data

Transparency:
  • It is clear what data is collected about you to personalise news. (M = 5.75, SD = 1.37)
  • It is clear for which other goals the collected data is used. (M = 5.77, SD = 1.36)
  • It is clear whether the collected data is shared with other parties. (M = 5.80, SD = 1.37)

Control:
  • The ability to choose what data about you is used to personalise the news. (M = 5.85, SD = 1.34)
  • The ability to delete the data used to personalise news for you. (M = 5.87, SD = 1.36)

The algorithm

Transparency:
  • It is clear why a specific article is recommended. (M = 5.31, SD = 1.40)
  • It is clear which factors have the most impact on the way news is personalised. (M = 5.27, SD = 1.38)
  • It is clear what goal the media organisation tries to achieve by personalising the news. (M = 5.35, SD = 1.37)

Control:
  • The ability to turn news personalisation off. (M = 5.94, SD = 1.32)
  • The ability to indicate that a specific type of news article should be recommended more or less. (M = 5.39, SD = 1.41)
  • The ability to choose which factors have the most influence on the way news is personalised. (M = 5.36, SD = 1.38)
  • The ability to choose which goals the personalisation algorithm aims to achieve. (M = 5.39, SD = 1.36)

The output

Transparency:
  • It is clear which parts of the site are personalised. (M = 5.42, SD = 1.39)
  • It is clear what type of news (for example, entertainment, politics, sport) has been recommended to you more often. (M = 5.23, SD = 1.41)
  • It is clear which important articles have not been recommended to you. (M = 5.24, SD = 1.43)

Control:
  • The ability to choose to always see important articles. (M = 5.68, SD = 1.40)
  • The ability to see which sources or articles have not been recommended. (M = 5.37, SD = 1.41)
  • The ability to give feedback on the way news personalisation works. (M = 5.22, SD = 1.52)

Overall Cronbach’s alpha: transparency = .96; control = .93.

Table 2. Overview of all transparency and control items with their respective mean values and standard deviations.

5. Discussion

34

Trust is a psychological process that law aims to enable for normative purposes. This article has argued that doing so successfully in the context of news personalisation first requires us to determine what kind of trust law should promote in this context. Section B has therefore argued that aligning expectations, facilitating scepticism, and enabling trust repair promote the kind of trust that is necessary for individuals and the organisations informing them to fulfil their role in democratic society. However, knowing why and how media regulation should promote trust is not sufficient. To actually promote trust, media regulation must also account for the perspective of the individuals who decide whether an organisation that uses personalisation is trustworthy. Sections C and D therefore report the results of a survey, developed from a conceptual framework of algorithmic transparency obligations in the context of news personalisation; it explores the different transparency and control items that individuals find significant when it comes to their trust in organisations using personalisation to inform them.

35

This research reveals that, when individuals decide whether to trust an organisation using news personalisation to inform them, they find important precisely those transparency and control items that are suited to the kind of trust media regulation aims to promote. Moreover, there is only a weak relationship between respondents’ existing trust in the media and the importance of transparency and control measures to their trust. Though transparency about and control over personalisation are slightly more important to individuals who already trust the media, individuals with lower trust in the media also find these measures important to be able to trust organisations using personalisation to inform them. [67] Enabling individuals to trust organisations that use personalisation to inform them thus matters both to individuals who already trust the media and to those with low trust in the media.

36

The differences between the value individuals attach to the various transparency and control items that we tested are relatively small. EU legislation that aims to improve the transparency of personalisation (and automated decision-making more generally) has traditionally focused on explaining the algorithms themselves. In particular, the General Data Protection Regulation (GDPR) requires that the logic and envisaged consequences of automated decision-making are communicated to individuals, while Article 27 DSA requires online platforms to inform users about the main parameters of their recommender systems. Our results indicate that information about the functioning of personalisation algorithms is only a small (and slightly less relevant) portion of the information that is important to individuals’ trust. Information beyond the functioning of personalisation algorithms, including transparency about the data or the source of the content used in personalisation, is also important to individuals’ trust. The former, information about data processing, is regulated extensively in data protection law. The latter, information about the source of the content individuals see, has traditionally been an important aspect of media regulation. [68] Measures adapting such information obligations to the online media environment are beginning to emerge in a fragmented fashion in self-regulation as well as EU and national law. Among others, the proposed AI Act and the German Medienstaatsvertrag require that automatically generated content is labelled as such. [69] In addition, self-regulation and soft law increasingly include transparency measures intended to inform individuals that content was produced by a government institution or by a media organisation that adheres to journalistic norms. The results indicate such transparency about the recommended content is also important in enabling individuals to judge the trustworthiness of the organisation that uses personalisation algorithms to recommend content to them, such as online platforms. This finding is also relevant to legacy media organisations, which often only recommend articles produced through their own editorial processes and exercise more traditional editorial control over the design of personalisation algorithms. [70]

37

However, individuals are not merely interested in knowing more; this research demonstrates that the ability to exercise control over the way in which news is personalised is strongly correlated with transparency and even slightly more important to individuals’ trust. In particular, half of all respondents indicate that the ability to stop personalisation is very important to their trust in organisations using the technology to inform them.

38

On an abstract level, individuals’ demand for control is in line with the goals that law aims to achieve by establishing algorithmic transparency obligations. The goal is not simply to provide more information to individuals, but also to enable individuals to choose what news to read, to hold organisations accountable, or to trust the use of news personalisation. [71] Control options let individuals act on the information with which they are provided more directly. At the same time, research shows that individuals gain a better understanding of the manner in which a system functions by seeing how their control results in different outcomes. [72] Our research similarly indicated a strong relationship between the importance individuals attach to transparency and control. In short, control and transparency are intertwined.

39

In practice, legislation focuses on transparency and offers individuals few options to act on the information made available to them. On the positive side, the control option most important to individuals’ trust, the ability to stop personalisation, is also the central focus of EU regulation that addresses individuals’ control over personalisation algorithms. Article 38 DSA now requires very large online platforms to give users at least one option for their recommender system that is not based on profiling. Article 27 DSA moreover requires that users can choose between different options for recommender systems in the section of the platform’s interface where recommendations are provided. [73] Article 38 DSA is complemented by Article 22 GDPR, which regulates automated decision-making and profiling in general and similarly focuses on enabling individuals to reject personalisation by creating a right not to be subject to decisions solely based on automated processing. [74] However, individuals’ ability to use this right to stop personalisation is subject to multiple exemptions relating to, for example, whether news personalisation is based on consent or a contract, or involves decisions with legal or similarly significant effect. [75] Moreover, the GDPR does not regulate how the option to stop news personalisation should be offered to users, only requiring organisations to facilitate the exercise of the right provided under Article 22. [76] Conversely, for very large online platforms, the DSA makes it easier to exercise the control the respondents in our sample found most important for trusting organisations that use news personalisation, namely the ability to stop it.

40

The results also indicate that it is important to look beyond simply stopping personalisation. Though the ability to stop personalisation was the control option our respondents indicated was most important to their trust, it was by no means the only control option they valued. However, the DSA does not require platforms to offer users any option other than stopping personalised recommendations—it only requires that any options platforms offer voluntarily are easily accessible. [77] The GDPR offers individuals few other options to exercise control over personalisation, most of which are focused on removing the data used for personalisation. [78] As a result of regulation’s narrow focus on the ability to stop personalisation, users are faced with a take-it-or-leave-it choice: either they trust personalisation with the parameters and goals platforms choose, or they reject personalisation altogether in favour of a non-personalised offer. The latter option is made all the less attractive by the fact that the DSA does not impose any requirements on the non-personalised option it requires very large online platforms to offer.

41

According to our results, EU law’s current focus on enabling individuals to stop personalisation misses the importance individuals attach to control options that allow them to influence how, rather than only whether, their news is personalised. Moreover, it disregards the central role personalisation algorithms fulfil in the online media system. [79] By prioritising information for individuals based on their characteristics, personalisation algorithms make the overwhelming amount of content that is available online accessible. They can do so not only by providing individuals with the news items they are most likely to engage with, but also by providing news that allows individuals to inform themselves more deeply about specific topics they are interested in, or by offering diverse perspectives they do not normally encounter. [80] The importance of personalisation algorithms for navigating the online media environment, as well as the different ways in which they can do so in support of users’ needs and public values, is neglected by EU law’s narrow focus on stopping personalisation. Instead, regulation that empowers users could enable them to ensure personalisation algorithms do what they trust them to do by giving them more control over how their news is personalised. The results surfaced a number of control options that individuals perceive to be important to their trust, such as the option to always see important articles, determine the sources from which news is received, choose what data is used to personalise their news, or choose which goals the personalisation algorithm aims to achieve. [81]

42

Neither control nor transparency is sufficient on its own. The existence of (self-)regulatory norms regarding the way in which personalisation functions is also critical for trust. The need for such regulation is an essential part of the criticism against individual-oriented transparency and control measures. A focus on empowering individuals can shift policy attention away from the responsibilities that organisations using personalisation bear themselves. [82] This creates the risk that individuals’ involvement replaces rather than complements platforms’ and the media’s responsibility for the use of news personalisation. The results above indicate that empowering individuals is not enough to create the conditions that can lead to trust. Instead, there was a strong relationship between the demand for more transparency and control and the demand for (self-)regulation to support trust. Determining whether technology is trustworthy is therefore not only an individual concern, or individuals’ responsibility. Indeed, policymakers need both to enable individuals to ensure that organisations using news personalisation do what they trust them to do and to adapt the regulatory mechanisms with which regulation has traditionally safeguarded trust. [83] In that process, attention should be paid to the factors that individuals have indicated to be relevant to their trust, including information about the influence of advertisers and other commercial interests on the way in which personalisation operates, or the ability of editors to exercise control over personalisation. [84]

 

43

Similarly, information about the data collected to make personalisation possible, and the other purposes for which it is used or actors with whom it is shared, is relatively important to individuals’ trust in whether organisations will inform them appropriately. The latter two factors are not directly related to the way in which media organisations inform individuals. As a result, when individuals determine whether they can trust an organisation using algorithms to inform them, they apparently also consider whether that organisation protects them from other risks that feature prominently in the public debate on technology. [85] Ensuring that the norms in data protection law, which already entitle individuals to this information, are effectively applied is consequently also an important aspect of ensuring trust when the media uses technology to inform its audience. [86] This especially holds true for public service media, which have a special obligation to act as a trusted source of information. [87]

44

Looking forward, it is particularly important to explore the role that general safeguards such as data protection play in supporting trust in different contexts. This allows us to determine what role, if any, there is for overarching safeguards for trust in horizontal legal frameworks such as the GDPR or the AI Act. At the same time, it enables an analysis of the extent to which regulatory safeguards for trust need to take account of the specific context in which technology is employed. This is not only important to address the contextual nature of trust; it is also necessary to explore to what extent trust-supporting measures such as individuals’ control can be integrated in a way that respects values such as media freedom and editorial independence. Further exploring the differences and similarities in the relationship between trust and regulation in different contexts is therefore key to creating a comprehensive and consistent regulatory approach to trust in organisations using technology.

Annex A - Survey

The questionnaire below was translated from the Dutch version originally shown to participants.

 

What is news personalisation?

 

News personalisation is a technology that is used to automatically show a different selection of news articles to each reader. You can see a good example in the image below. Here, NU.nl uses news personalisation to show readers "recommended articles" on part of its site.

Two things are essential to make news personalisation possible: data and algorithms.

1) First, data has to be collected from the readers, such as their reading behaviour (preferences and interests) or location.

2) Based on that data, algorithms are then used to recommend articles to readers.

 

 

Questionnaire

 

The media organisation

We now want to learn more about your trust in news personalisation. The following questions are about the different parties that can influence the way news is personalised.

 

How important are the following conditions for you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

 

Transparency:

  • It is clear to what extent journalists and editors determine the way news is personalised.

  • It is clear whether commercial parties such as advertisers influence the way news is personalised.

  • It is clear to what extent the media organisation uses algorithms from other companies to personalise the news.

 

Control:

  • The ability to choose between the personalisation algorithms of different companies on a website.

 

The source of the articles

An algorithm can recommend news from different sources. NU.nl, for example, only recommends its own articles. Conversely, Google News recommends articles from multiple media outlets, and Facebook recommends the articles its users upload.

 

How important are the following conditions for you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

 

Transparency:

  • It is clear what the identity of the source of a recommended article is.

  • It is clear whether the source of a recommended article adheres to journalistic norms established by traditional media companies.

  • It is clear whether a recommended article comes from a government institution.

  • It is clear whether a recommended article is produced automatically or written by a human.

 

Control:

  • The ability to choose from which sources you will receive news.

  • The ability to choose to only receive news from sources that adhere to journalistic norms established by traditional media companies.

 

Data

To personalise news, data about you must be collected. This data is used to determine which news articles you are shown.

 

How important are the following conditions for you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

 

Transparency:

  • It is clear what data is collected about you to personalise news.

  • It is clear for which other purposes the collected data is used.

  • It is clear whether the collected data is shared with other parties.

 

Control:

  • The ability to choose what data about you is used to personalise the news.

  • The ability to delete the data used to personalise news for you.

 

Algorithm

In addition to your data, other information is also used to recommend articles: for example, how recent an article is, or what its subject is.

 

How important are the following conditions for you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

 

Transparency:

  • It is clear why a specific article is recommended.

  • It is clear which factors have the most impact on the way news is personalised.

  • It is clear what goal the media organisation tries to achieve by personalising the news.

 

Control:

  • The ability to turn news personalisation off.

  • The ability to indicate that a specific type of news article should be recommended more or less.

  • The ability to choose which factors have the most influence on the way news is personalised.

  • The ability to choose which goals the personalisation algorithm aims to achieve.

 

The news offer

Because of news personalisation, you will see some articles more often and other articles less often.

 

How important are the following conditions for you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

 

Transparency:

  • It is clear which parts of the site are personalised.

  • It is clear what type of news (for example, entertainment, politics, sport) has been recommended to you more often.

  • It is clear which important articles have not been recommended to you.

 

Control:

  • The ability to choose to always see important articles.

  • The ability to see which sources or articles have not been recommended.

  • The ability to give feedback on the way news personalisation works.

 

Closing questions

Knowledge about and trust in news personalisation

The following questions are about your awareness of the use of algorithms in the media. There are no right or wrong answers; this is not a test. We are interested in your own opinion. Please indicate to what extent you are aware of the following statements:

  1. Algorithms are used to recommend posts to me on Facebook.

  2. Algorithms show other people different posts than the ones I see.

  3. Algorithms are used to customise certain posts on Facebook.

  4. Algorithms are used to prioritize certain posts over other posts on Facebook.

Likert scale from 1 (completely unaware) to 7 (fully aware).

How much do you trust the media? (1: not at all – 7: very much)

 

Use of information and control

  • How likely is it that you will pay attention to information about news personalisation, provided it is easy to see and understand? (1: not likely at all – 7: very likely)

  • How likely is it that you will exercise control over how news is personalised, provided this control is easy to exercise? (1: not likely at all – 7: very likely)

 

Regulation of news personalisation

  • Whose job is it to make sure you can control news personalisation?

    • The government.

    • The media.

    • The platforms such as Facebook and Google.

    • The organisation that personalises news.

    • Nobody.

  • Whose job is it to make sure you can get information about news personalisation?

    • The government.

    • The media.

    • The platforms such as Facebook and Google.

    • The organisation that personalises news.

    • Nobody.

  • How important are the following conditions to enable you to trust an organisation that uses news personalisation to inform you? (1: not important at all – 7: very important)

    • The existence of (self-)regulation about the parties that influence the way in which news is personalised.

    • The existence of (self-)regulation about the sources of news articles that are recommended.

    • The existence of (self-)regulation about the way in which the collected data is used.

    • The existence of (self-)regulation about the functioning of the algorithm that personalises news.

    • The existence of (self-)regulation about the type of news that is recommended.

* Max van Drunen and Natali Helberger are at the Institute for Information Law, University of Amsterdam; Brahim Zarouali is at the Institute for Media Studies, KU Leuven.



[1] Matthias Kohring and Jörg Matthes, ‘Trust in News Media: Development and Validation of a Multidimensional Scale’ (2007) 34 Communication Research 231, 238 <http://crx.sagepub.com/content/34/2/231.abstract>; Yariv Tsfati and Joseph N Cappella, ‘Do People Watch What They Do Not Trust?: Exploring the Association between News Media Skepticism and Exposure’ (2003) 30 Communication Research 504, 506 <https://doi.org/10.1177/0093650203253371>; Nayla Fawzi and others, ‘Concepts, Causes and Consequences of Trust in News Media – a Literature Review and Framework’ (2021) 45 Annals of the International Communication Association 154 <https://doi.org/10.1080/23808985.2021.1960181>.

[2] High Level Group on Fake News and Online Disinformation, ‘A Multi-Dimensional Approach to Disinformation: Report of the Independent High Level Group on Fake News and Online Disinformation’ (European Commission 2018) 11; European Commission, ‘White Paper On Artificial Intelligence - A European Approach to Excellence and Trust’ (European Commission 2020) COM(2020) 65 final 11 <https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf>; European Parliament legislative resolution of 5 July 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC 2020 [P9_TA(2022)0269]; Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence (Artificial Intelligence Act) and amending certain union legislative acts 2021 [COM/2021/206 final]. The analysis in this article is based on the version of the DSA passed by the European Parliament on 7 September 2022.

[3] Charlie Beckett, ‘New Powers, New Responsibilities. A Global Survey of Journalism and Artificial Intelligence’ (LSE 2019) <https://blogs.lse.ac.uk/polis/2019/11/18/new-powers-new-responsibilities/>.

[4] Onora O’Neill, A Question of Trust (Cambridge University Press 2002); Damian Tambini, ‘Media Freedom, Regulation, and Trust: A Systemic Approach to Information Disorder’ (Council of Europe 2020) 18.

[5] European Commission, ‘White Paper On Artificial Intelligence - A European Approach to Excellence and Trust’ (n 2) 2; Neil M Richards and Woodrow Hartzog, ‘Taking Trust Seriously in Privacy Law’ (2016) 19 Stanford Technology Law Review 431 <https://www-cdn.law.stanford.edu/wp-content/uploads/2017/11/Taking-Trust-Seriously-in-Privacy-Law.pdf>; Balázs Bodó, ‘Mediated Trust: A Theoretical Framework to Address the Trustworthiness of Technological Trust Mediators’ [2020] New Media & Society 1 <https://doi.org/10.1177/1461444820939922> accessed 6 July 2020; Michael Veale and Frederik Zuiderveen Borgesius, ‘Demystifying the Draft EU Artificial Intelligence Act’ 98 <https://osf.io/preprints/socarxiv/38p5f/> accessed 26 July 2021. DSA article 34, AI Act article 5.

[6] Maartje ter Hoeve and others, ‘Do News Consumers Want Explanations for Personalized News Rankings?’ <http://scholarworks.boisestate.edu/fatrec/2017/1/8> accessed 4 November 2020; Alejandro Barredo Arrieta and others, ‘Explainable Artificial Intelligence (XAI): Concepts, Taxonomies, Opportunities and Challenges toward Responsible AI’ (2020) 58 Information Fusion 82 <http://www.sciencedirect.com/science/article/pii/S1566253519308103> accessed 27 October 2020; Gianclaudio Malgieri, ‘Automated Decision-Making in the EU Member States: The Right to Explanation and Other “Suitable Safeguards” in the National Legislations’ (2019) 35 Computer Law & Security Review 1 <http://www.sciencedirect.com/science/article/pii/S0267364918303753> accessed 27 October 2020.

[7] DSA articles 27, 38. DSA article 3(s) defines recommender systems as (partially) automated systems used by platforms to prioritise information. Recommender systems can be (but are not necessarily) personalised.

[8] Sonia Livingstone, ‘Tackling the Information Crisis: A Policy Framework for Media System Resilience’ (LSE Truth, Trust & Technology Commission 2018) <http://www.lse.ac.uk/media-and-communications/assets/documents/research/T3-Report-Tackling-the-Information-Crisis-v6.pdf> accessed 15 June 2020; Brian O’Neill, ‘Trust in the Information Society’ (2012) 28 Computer Law & Security Review 551 <http://www.sciencedirect.com/science/article/pii/S0267364912001409> accessed 19 June 2020; Helen Nissenbaum, ‘Securing Trust Online: Wisdom or Oxymoron?’ (2001) 81 Boston University International Law Review 31.

[9] This is at least the case within the specific context of the impact of technology on trust in news, which is the focus of this article. See Donghee Shin, ‘Why Does Explainability Matter in News Analytic Systems? Proposing Explainable Analytic Journalism’ (2021) 22 Journalism Studies 1047 <https://doi.org/10.1080/1461670X.2021.1916984> accessed 8 June 2021; Barredo Arrieta and others (n 6).

[10] MZ van Drunen, N Helberger and M Bastian, ‘Know Your Algorithm: What Media Organizations Need to Explain to Their Users about News Personalization’ (2019) 9 International Data Privacy Law 220 <https://academic.oup.com/idpl/advance-article/doi/10.1093/idpl/ipz011/5544759> accessed 8 August 2019.

[11] Caroline Pauwels and Ike Picone, ‘The Tussle with Trust: Trust in the News Media Ecology’ (2012) 28 Computer Law & Security Review 542, 543 <http://www.sciencedirect.com/science/article/pii/S0267364912001380> accessed 31 July 2020; Jesper Strömbäck and others, ‘News Media Trust and Its Impact on Media Use: Toward a Framework for Future Research’ (2020) 44 Annals of the International Communication Association 139, 148 <https://www.tandfonline.com/doi/full/10.1080/23808985.2020.1755338> accessed 15 May 2020; JD Lee and KA See, ‘Trust in Automation: Designing for Appropriate Reliance’ (2004) 46 Human Factors: The Journal of the Human Factors and Ergonomics Society 50 <http://hfs.sagepub.com/cgi/doi/10.1518/hfes.46.1.50_30392>; Lisa M PytlikZillig and Christopher D Kimbrough, ‘Consensus on Conceptualizations and Definitions of Trust: Are We There Yet?’, Interdisciplinary Perspectives on Trust (Springer International Publishing 2016).

[12] Strömbäck and others (n 11).

[13] Cristina Monzer and others, ‘User Perspectives on the News Personalisation Process: Agency, Trust and Utility as Building Blocks’ (2020) 8 Digital Journalism 1142 <https://www.tandfonline.com/doi/full/10.1080/21670811.2020.1773291> accessed 18 June 2020; Nic Newman and others, ‘Reuters Institute Digital News Report 2016’ (Reuters Institute for the Study of Journalism 2017) <http://reutersinstitute.politics.ox.ac.uk/sites/default/files/research/files/Digital%2520News%2520Report%25202016.pdf>; Robin Steedman, Helen Kennedy and Rhianne Jones, ‘Complex Ecologies of Trust in Data Practices and Data-Driven Systems’ (2020) 23 Information, Communication & Society 817 <https://doi.org/10.1080/1369118X.2020.1748090> accessed 9 April 2020; Jannick Kirk Sørensen, Hilde Van den Bulck and Sokol Kosta, ‘Stop Spreading The Data: PSM, Trust, and Third-Party Services’ (2020) 10 Journal of Information Policy 474 <https://www.jstor.org/stable/10.5325/jinfopoli.10.2020.0474> accessed 15 January 2021; Andreas Graefe and Nina Bohlken, ‘Automated Journalism: A Meta-Analysis of Readers’ Perceptions of Human-Written in Comparison to Automated News’ (2020) 8 Media and Communication 50 <https://www.cogitatiopress.com/mediaandcommunication/article/view/3019> accessed 28 October 2020.

[14] Annette Baier, ‘Trust and Antitrust’ (1986) 96 Ethics 231 <https://www.jstor.org/stable/2381376> accessed 29 October 2020.

[15] Strömbäck and others (n 11) 148; Thomas Hanitzsch, Arjen Van Dalen and Nina Steindl, ‘Caught in the Nexus: A Comparative and Longitudinal Analysis of Public Trust in the Press’ (2018) 23 The International Journal of Press/Politics 3 <http://journals.sagepub.com/doi/10.1177/1940161217740695>; Bernd Blöbaum, Trust and Communication in a Digitized World (Bernd Blöbaum ed, Springer 2016) <http://www.springer.com/it/book/9783319280578>.

[16] Kohring and Matthes (n 1); Katherine M Grosser, ‘Trust in Online Journalism’ (2016) 4 Digital Journalism 1036 <http://dx.doi.org/10.1080/21670811.2015.1127174>; Strömbäck and others (n 11) 142.

[17] Balázs Bodó, ‘Selling News to Audiences – A Qualitative Inquiry into the Emerging Logics of Algorithmic News Personalization in European Quality News Media’ (2019) 7 Digital Journalism 1054 <https://doi.org/10.1080/21670811.2019.1624185> accessed 14 January 2020; Neil Thurman and others, ‘My Friends, Editors, Algorithms, and I’ (2019) 7 Digital Journalism 447, 459 <https://doi.org/10.1080/21670811.2018.1493936> accessed 8 June 2021; Efrat Nechushtai and Seth C Lewis, ‘What Kind of News Gatekeepers Do We Want Machines to Be? Filter Bubbles, Fragmentation, and the Normative Dimensions of Algorithmic Recommendations’ (2019) 90 Computers in Human Behavior 298.

[18] Kohring and Matthes (n 1); Thurman and others (n 17) 459; Monzer and others (n 13).

[19] Jessica Kunert and Neil Thurman, ‘The Form of Content Personalisation at Mainstream, Transatlantic News Outlets: 2010–2016’ (2019) 13 Journalism Practice 759 <https://www.tandfonline.com/doi/full/10.1080/17512786.2019.1567271> accessed 23 November 2020; Bodó (n 17).

[20] Thurman and others (n 17); Monzer and others (n 13); Newman and others (n 13) 111.

[21] Sarah Eskens, Natali Helberger and Judith Möller, ‘Challenged by News Personalisation: Five Perspectives on the Right to Receive Information’ (2017) 9 Journal of Media Law 259.

[22] Guido Möllering, Trust: Reason, Routine, Reflexivity (Elsevier 2006).

[23] Möllering (n 22).

[24] DSA articles 34-35; AI Act article 5; GDPR Chapter IV.

[25] Robert D Putnam, ‘Bowling Alone: America’s Declining Social Capital’ in Lane Crothers and Charles Lockhart (eds), Culture and Politics: A Reader (Palgrave Macmillan US 2000) <https://doi.org/10.1007/978-1-349-62965-7_12> accessed 29 October 2020.

[26] Maria Bigoni and others, ‘Trust, Leniency, and Deterrence’ (2015) 31 The Journal of Law, Economics, and Organization 663 <https://academic.oup.com/jleo/article/31/4/663/2492478> accessed 29 October 2020.

[27] Mark E Warren, ‘What Kinds of Trust Does a Democracy Need? Trust from the Perspective of Democratic Theory’ in Sonja Zmerli and Tom WG van der Meer (eds), Handbook on Political Trust (Elgar 2017) <https://www.elgaronline.com/view/edcoll/9781782545101/9781782545101.00013.xml>; O’Neill, A Question of Trust (n 4).

[28] Tambini (n 4); Thomas Gibbons, ‘Building Trust in Press Regulation: Obstacles and Opportunities’ (2013) 5 Journal of Media Law 202, 210 <https://www.tandfonline.com/doi/full/10.5235/17577632.5.2.202> accessed 30 May 2020; Benjamin Toff and others, ‘What We Think We Know and What We Want to Know: Perspectives on Trust in News in a Changing World’ (Reuters Institute for the Study of Journalism 2020) 5.

[29] CoE, ‘Recommendation of the Committee of Ministers to Member States on a New Notion of Media’ (Council of Europe 2011) CM/Rec(2011)7 para 53 <https://search.coe.int/cm/Pages/result_details.aspx?ObjectID=09000016805cc2c0>.

[30] European Commission, ‘White Paper On Artificial Intelligence - A European Approach to Excellence and Trust’ (n 2) 9.

[31] Carl I Hovland and Walter Weiss, ‘The Influence of Source Credibility on Communication Effectiveness’ (1951) 15 Public Opinion Quarterly 635 <https://doi.org/10.1086/266350> accessed 16 June 2021.

[32] Jonathan L Herlocker, Joseph A Konstan and John Riedl, ‘Explaining Collaborative Filtering Recommendations’, Proceedings of the 2000 ACM conference on Computer supported cooperative work (ACM 2000); Ingrid Nunes and Dietmar Jannach, ‘A Systematic Review and Taxonomy of Explanations in Decision Support and Recommender Systems’ (2017) 27 User Modeling and User-Adapted Interaction 393.

[33] High-Level Expert Group on Artificial Intelligence, ‘Ethics Guidelines for Trustworthy AI’ (European Commission 2019) 4 <https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=60419>; European Commission, ‘White Paper On Artificial Intelligence - A European Approach to Excellence and Trust’ (n 2) 1.

[34] European Commission, ‘Tackling Online Disinformation: A European Approach’ (European Commission 2018) COM(2018) 236 final <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52018DC0236> accessed 20 February 2020; CoE, ‘Declaration by the Committee of Ministers on the Financial Sustainability of Quality Journalism in the Digital Age’ (Council of Europe 2019) Decl(13/02/2019)2 <https://search.coe.int/cm/pages/result_details.aspx?objectid=090000168092dd4d> accessed 9 June 2019.

[35] Strömbäck and others (n 11) 146.

[36] B Vanacker and G Belmas, ‘Trust and the Economics of News’ (2009) 5 Journal of Mass Media Ethics 110.

[37] Mariëlle Wijermars, ‘Russia’s Law “On News Aggregators”: Control the News Feed, Control the News?’ [2021] Journalism 1, 2944 <https://journals.sagepub.com/doi/abs/10.1177/1464884921990917> accessed 15 February 2021.

[38] Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities (AVMSD 2018) 2018 articles 6a, 9(1)(b), 28b. Public Service Media have a special (and for certain public service media organisations such as the BBC, legal) obligation to act as a trusted source of information. Ofcom, ‘Operating Licence for the BBC’s UK Public Services’ (2020) s 1.24.3 <https://www.ofcom.org.uk/__data/assets/pdf_file/0017/107072/bbc-operating-licence.pdf>.

[39] See on these arguments e.g. Nissenbaum (n 8) 121.

[40] Caroline Fisher and others, ‘Improving Trust in News: Audience Solutions’ (2020) 0 Journalism Practice 1, 12, 14 <https://doi.org/10.1080/17512786.2020.1787859> accessed 8 July 2020; Donghee Shin, ‘User Perceptions of Algorithmic Decisions in the Personalized AI System: Perceptual Evaluation of Fairness, Accountability, Transparency, and Explainability’ (2020) 0 Journal of Broadcasting & Electronic Media 1 <https://doi.org/10.1080/08838151.2020.1843357> accessed 7 January 2021; Monzer and others (n 13).

[41] See e.g. Fisher and others (n 40) 7; Toff and others (n 28) 16; Bernadette Uth, Laura Badura and Bernd Blöbaum, ‘Perceptions of Trustworthiness and Risk: How Transparency Can Influence Trust in Journalism’ in Bernd Blöbaum (ed), Trust and Communication: Findings and Implications of Trust Research (Springer International Publishing 2021) <https://doi.org/10.1007/978-3-030-72945-5_3> accessed 9 July 2021.

[42] CoE, ‘Declaration on the Financial Sustainability of Quality Journalism’ (n 34) 5; Daryl Koehn, ‘Should We Trust in Trust?’ (1996) 34 American Business Law Journal 183 <https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1744-1714.1996.tb00695.x> accessed 30 June 2020.

[43] Eskens, Helberger and Möller (n 21); CoE, ‘Recommendation of the Committee of Ministers to Member States on Media Pluralism and Transparency of Media Ownership’ (Council of Europe 2018) CM/Rec(2018)1 <https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=0900001680790e13>.

[44] AVMSD 2018 recital 16, article 5.

[45] European Parliament legislative resolution of 5 July 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (n 2) articles 25, 29, recital 62. Very large online platforms are defined as online platforms with 45 million or more EU users.

[46] O’Neill, A Question of Trust (n 4).

[47] Onora O’Neill, ‘Trust and Accountability in a Digital Age’ (2020) 95 Philosophy 3 <https://www.cambridge.org/core/journals/philosophy/article/trust-and-accountability-in-a-digital-age/ADBDD9EEF4426590D5A60AF87611240D> accessed 31 October 2019.

[48] Fisher and others (n 40) 7.

[49] Lara Fielden, Regulating for Trust in Journalism: Standards Regulation in the Age of Blended Media (University of Oxford, Reuters Institute for the Study of Journalism 2011) 117.

[50] L Hitchens, ‘Commercial Content and Its Relationship to Media Content: Commodification and Trust’ in Monroe E Price, Stefaan G Verhulst and Libby Morgan (eds), Routledge Handbook of Media Law (Routledge 2013) 102 <https://www.routledge.com/Routledge-Handbook-of-Media-Law/Price-Verhulst-Morgan/p/book/9780415683166>.

[51] Warren (n 27).

[52] O’Neill, ‘Trust and Accountability in a Digital Age’ (n 47).

[53] Monzer and others (n 13).

[54] High Level Group on Fake News and Online Disinformation (n 2) 25.

[55] Gibbons (n 28) 212; European Commission, ‘White Paper On Artificial Intelligence - A European Approach to Excellence and Trust’ (n 2) 23.

[56] van Drunen, Helberger and Bastian (n 10); Monzer and others (n 13); CoE, ‘Recommendation on Pluralism’ (n 43) para 2.2, 2.7, 4.5; AVMSD 2018 recitals 15-16, article 5.

[57] van Drunen, Helberger and Bastian (n 10).

[58] This is sometimes referred to as interactive transparency in discussions of media transparency. See Michael Karlsson, ‘Rituals of Transparency’ (2010) 11 Journalism Studies 535 <http://www.tandfonline.com/doi/abs/10.1080/14616701003638400>; David Domingo and Heikki Heikkilä, ‘Media Accountability Practices in Online News Media’, The Handbook of Global Online Journalism (Wiley-Blackwell 2012) <http://doi.wiley.com/10.1002/9781118313978.ch15>.

[59] High-Level Expert Group on Artificial Intelligence (n 33); European Commission, ‘Tackling Online Disinformation: A European Approach’ (n 34); CoE, ‘Declaration on the Financial Sustainability of Quality Journalism’ (n 34).

[60] Monzer and others (n 13).

[61] Michael Karlsson, ‘Dispersing the Opacity of Transparency in Journalism on the Appeal of Different Forms of Transparency to the General Public’ (2020) 21 Journalism Studies 1795 <https://doi.org/10.1080/1461670X.2020.1790028> accessed 26 July 2021.

[62] Monzer and others (n 13) 1153.

[63] Motahhare Eslami and others, ‘I Always Assumed That I Wasn’t Really That Close to [Her]’, Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI ’15 (ACM Press 2015) <http://dl.acm.org/citation.cfm?doid=2702123.2702556>; Emilee Rader, Kelley Cotter and Janghee Cho, ‘Explanations as Mechanisms for Supporting Algorithmic Transparency’, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI ’18 (ACM Press 2018) <http://dl.acm.org/citation.cfm?doid=3173574.3173677> accessed 4 November 2020.

[64] Rader, Cotter and Cho (n 63); Brahim Zarouali, Sophie C Boerman and Claes H de Vreese, ‘Is This Recommended by an Algorithm? The Development and Validation of the Algorithmic Media Content Awareness Scale (AMCA-Scale)’ (2021) 62 Telematics and Informatics <https://www.sciencedirect.com/science/article/pii/S0736585321000460> accessed 7 July 2021.

[65] CoE, ‘New Notion of Media’ (n 29) para. 53; Gibbons (n 28) 216.

[66] Zarouali, Boerman and de Vreese (n 64).

[67] Fisher and others (n 40) 12.

[68] AVMSD 2018 recital 16, article 5; ‘Recommendation of the Committee of Ministers to Member States on Media Pluralism and Transparency of Media Ownership’ (2018) CM/Rec(2018)1 <https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=0900001680790e13> accessed 10 June 2019; O’Neill, ‘Trust and Accountability in a Digital Age’ (n 47) 15.

[69] Medienstaatsvertrag 2020 article 18(3); Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence (Artificial Intelligence Act) and amending certain union legislative acts (n 2) article 52(3); European Commission, ‘EU Code of Practice on Disinformation’ (2018) <https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=54454> article I, II.D.

[70] Bodó (n 17); Mariella Bastian, Natali Helberger and Mykola Makhortykh, ‘Safeguarding the Journalistic DNA: Attitudes towards the Role of Professional Values in Algorithmic News Recommender Designs’ (2021) 9 Digital Journalism 1 <https://doi.org/10.1080/21670811.2021.1912622> accessed 6 August 2021.

[71] van Drunen, Helberger and Bastian (n 10).

[72] S Shyam Sundar, ‘Rise of Machine Agency: A Framework for Studying the Psychology of Human–AI Interaction (HAII)’ (2020) 25 Journal of Computer-Mediated Communication 74, 82 <https://academic.oup.com/jcmc/advance-article/doi/10.1093/jcmc/zmz026/5700811> accessed 22 January 2020.

[73] As it is an option to influence the parameters; see also DSA recital 94.

[74] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data 1995 article 15.

[75] Sarah Eskens, ‘A Right to Reset Your User Profile and More: GDPR-Rights for Personalized News Consumers’ (2019) 9 International Data Privacy Law 153 <https://academic.oup.com/idpl/advance-article/doi/10.1093/idpl/ipz007/5525264> accessed 1 July 2019; Natali Helberger and others, ‘Regulation of News Recommenders in the Digital Services Act: Empowering David against the Very Large Online Goliath’ [2021] Internet Policy Review <https://policyreview.info/articles/news/regulation-news-recommenders-digital-services-act-empowering-david-against-very-large> accessed 21 July 2021.

[76] Article 12(2) GDPR; Mariella Bastian and others, ‘Explanations of News Personalisation across Countries and Media Types’ (2020) 9 Internet Policy Review 1 <https://www.econstor.eu/handle/10419/225645> accessed 7 October 2021; Luciana Monteiro Krebs and others, ‘Tell Me What You Know: GDPR Implications on Designing Transparency and Accountability for News Recommender Systems’, Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (ACM 2019) <http://doi.acm.org/10.1145/3290607.3312808> accessed 21 May 2019.

[77] Article 27 DSA.

[78] For a full overview of the ways in which the GDPR can be used to influence personalisation, see Eskens (n 75).

[79] European Parliament legislative resolution of 5 July 2022 on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (n 2) recital 62; CoE, ‘Declaration on the Financial Sustainability of Quality Journalism’ (n 34) para. 10, 12; ‘Recommendation of the Committee of Ministers to Member States on Media Pluralism and Transparency of Media Ownership’ (n 68) para. 10, 2.3.

[80] Natali Helberger, Kari Karppinen and Lucia D’Acunto, ‘Exposure Diversity as a Design Principle for Recommender Systems’ (2018) 21 Information, Communication & Society 191 <https://www.tandfonline.com/doi/full/10.1080/1369118X.2016.1271900> accessed 15 May 2020; Jaron Harambam and others, ‘Designing for the Better by Taking Users into Account: A Qualitative Evaluation of User Control Mechanisms in (News) Recommender Systems’, Proceedings of the 13th ACM Conference on Recommender Systems - RecSys ’19 (ACM Press 2019) <http://dl.acm.org/citation.cfm?doid=3298689.3347014> accessed 27 September 2019.

[81] Harambam and others (n 80); Ian Brown, ‘Interoperability as a Tool for Competition Regulation’ [2020] OpenForum Academy <https://euagenda.eu/upload/publications/ian-brown-interoperability-for-competition-regulation.pdf>; ‘The Trust Project’ (Santa Clara University’s Markkula Center for Applied Ethics, 2018) <https://thetrustproject.org/>; Reporters Without Borders, ‘Journalism Trust Initiative’ (3 April 2018) <https://www.journalismtrustinitiative.org/>.

[82] M Ananny and K Crawford, ‘Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability’ (2016) 20 New Media & Society 973.

[83] José van Dijck, Thomas Poell and Martijn de Waal, The Platform Society : Public Values in a Connective World (Oxford University Press 2018) 30, 159 <https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=1901418&site=ehost-live&scope=site> accessed 13 February 2019.

[84] Tobias Eberwein, Susanne Fengler and Matthias Karmasin, The European Handbook of Media Accountability (Routledge 2019) <https://www.routledge.com/The-European-Handbook-of-Media-Accountability/Eberwein-Fengler-Karmasin/p/book/9781472457660>.

[85] Steedman, Kennedy and Jones (n 13); Gaurav Bansal and Fatemeh Mariam Zahedi, ‘Trust Violation and Repair: The Information Privacy Perspective’ (2015) 71 Decision Support Systems 62 <https://www.sciencedirect.com/science/article/pii/S0167923615000196>.

[86] Bastian and others (n 76); Paul C Bauer and others, ‘Did the GDPR Increase Trust in Data Collectors? Evidence from Observational and Experimental Data’ (2021) 0 Information, Communication & Society 1 <https://doi.org/10.1080/1369118X.2021.1927138> accessed 24 May 2021.

[87] Sørensen, Van den Bulck and Kosta (n 13).


License

Any party may pass on this Work by electronic means and make it available for download under the terms and conditions of the Digital Peer Publishing License. The text of the license may be accessed and retrieved at http://www.dipp.nrw.de/lizenzen/dppl/dppl/DPPL_v2_en_06-2004.html.
