Accountability and platforms' governance: the case of online prominence of public service media content

Krisztina Rozgonyi, Institute for Comparative Media and Communication Studies (CMC), Austrian Academy of Sciences, Vienna, Austria

PUBLISHED ON: 26 Oct 2023 DOI: 10.14763/2023.4.1723

Abstract

Public discourse has moved online, enabled by platforms, which have become an essential source, access point and key distributor of information and media content. Public Service Media (PSM) — the ‘basic information service provider’ with a special mandate from the state — increasingly rely on platforms under the universality principle to reach and interact with the broadest range of their audiences. However, control over PSM content dissemination and audience engagement is primarily determined by the private-interest-ruled platforms via algorithmic recommendation systems (content curation) and according to their terms and conditions (community standards). This paper addresses the necessity and possibilities of safeguards for PSM content delivery on digital online platforms as an issue of media pluralism. Actual or potential policy interventions for the preferential treatment of public value content, also known as due prominence online, were studied through the analytical lens of accountability in its interaction with platforms and PSM performance. Finally, the analysis of the appropriateness of current accountability regimes for achieving pluralism objectives grounds recommendations for future public-interest-driven platform governance policy.
Citation & publishing information
Published: October 26, 2023
Licence: Creative Commons Attribution 3.0 Germany
Funding: The research leading to this paper has received funding from the Austrian Public Service Broadcasting Organisation (ORF) as part of the Public Value programme.
Competing interests: The author has declared that no competing interests exist that have influenced the text.
Keywords: Platform governance, Media, Accountability, Content filtering, Media pluralism
Citation: Rozgonyi, K. (2023). Accountability and platforms' governance: the case of online prominence of public service media content. Internet Policy Review, 12(4). https://doi.org/10.14763/2023.4.1723

Introduction

Internet intermediary services, particularly big-tech-controlled social media hosting platforms, have become the gatekeepers of information, enabled by their algorithm-driven content curation and recommendation techniques. Platforms’ powers, such as actively guiding and shaping the media’s democratic mission (Napoli, 2011; Helberger, 2018) and steering opinions (Helberger, 2020), raise severe concerns about democratic resilience and create various layers of communicative dependence. The platforms’ private ordering (Kettemann, 2020) bases decisions on controlling content and information on internal corporate rules, considerations and assumptions rather than on democratic or public interest values.

Finding and discovering media content of public value is critical to freedom of expression and diversity online (Mazzoli, 2020, 2021). Platforms’ algorithmic content recommendation systems filter large amounts of information (content-based, collaborative or hybrid filtering), whereby prioritisation is meant to positively discriminate and promote certain content by making it more discoverable or prominent for digital audiences. Discoverability, prioritisation and prominence are all intrinsically linked to platforms’ policies and practices, and also to democratic platform and media governance across all levels and actors: the users, the media and the state (Helberger et al., 2020). Users’ ability to counter disinformation via media content of general (public) interest is intrinsic to the fundamental human right to receive accurate and unbiased information (Schulz et al., 2019). Still, the dissemination of public service media (PSM) content depends on the platformised business environment (Mazzucato et al., 2020). Also, the state has high stakes in reinforcing public interest objectives vis-à-vis platforms (Rozgonyi, 2020). Thus, online platforms and states determine the audience of content online by establishing ‘regimes of prominence’, i.e. frameworks of rules concerning to what extent platforms can or should prioritise certain forms of content over others (CoE, 2021).
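To make these filtering mechanics concrete, the following minimal sketch shows how a hybrid recommender could blend a content-based signal with a collaborative signal, and how a prominence rule could positively discriminate flagged public interest content. It is illustrative only: the weights, the `public_value` flag and all names are hypothetical assumptions, not a reproduction of any platform's actual system.

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    content_score: float        # content-based signal: similarity to the user's interests
    collab_score: float         # collaborative signal: engagement of similar users
    public_value: bool = False  # hypothetical flag marking public interest (e.g. PSM) content

def hybrid_score(item: Item, w_content: float = 0.5, w_collab: float = 0.5) -> float:
    """Hybrid filtering: a weighted blend of content-based and collaborative signals."""
    return w_content * item.content_score + w_collab * item.collab_score

def rank_feed(items: list[Item], prominence_boost: float = 0.0) -> list[Item]:
    """Rank items by hybrid score; a positive boost makes flagged public
    interest content more discoverable ('due prominence')."""
    def score(item: Item) -> float:
        bonus = prominence_boost if item.public_value else 0.0
        return hybrid_score(item) + bonus
    return sorted(items, key=score, reverse=True)

feed = [
    Item("celebrity-gossip", content_score=0.9, collab_score=0.8),
    Item("psm-news-report", content_score=0.6, collab_score=0.5, public_value=True),
]

# Without a prominence rule, engagement-driven scores dominate the ranking;
# with a boost, the public value item is promoted above the gossip item.
print([i.item_id for i in rank_feed(feed)])                        # gossip ranked first
print([i.item_id for i in rank_feed(feed, prominence_boost=0.4)])  # PSM item ranked first
```

The sketch makes visible the policy lever discussed in this paper: prominence is simply a parameter of the ranking function, and who sets that parameter (the platform alone, or a public framework) is the governance question.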

The rules on prominence should first define public interest content as the ‘objective’ of prioritisation, which is probably the hardest task for policy-makers. The normative definition includes both the aim of pursuing wider social objectives and the criteria “to be met to achieve the ideal outcomes” (Mazzoli & Tambini, 2020, p. 12), produced by various media organisations and content providers which “deliver social and public benefits to citizens” (Mazzoli & Tambini, 2020, p. 14). However, there are currently no commonly agreed content standards concerning the criteria that qualify content as being in the public interest. The European legislative acts similarly lack definitional clarity. The recently updated law of the European Union (EU), the Audiovisual Media Services Directive (Directive 2018/1808), refers to “media services of general interest” (Article 7a) while introducing obligations for EU member states to ensure their prominence. However, the AVMSD (Directive 2018/1808) remains vague on the normative criterion of what meets the general interest objective and leaves it to interpretation at the national level.1 Nevertheless, the policy objective of prominence rules was reiterated across Europe and acknowledged as an appropriate measure that could strengthen media pluralism (Cappello, 2022). These definitional difficulties require narrowing the focus to a certain type of content within the broad public interest and value category: media works produced by PSM.

All PSMs in the EU context are ‘basic information service providers’2 with a special mandate from the state to guarantee freedom of expression, media pluralism and diversity according to the European doctrine of the Article 10 framework of the European Convention on Human Rights. The EU’s PSMs are also meant to be in the best position (Lowe & Martin, 2014) to meet public interest standards, provide audiences with diverse media content and facilitate a pluralistic offer within the increasingly digital media environment (Sehl, 2020). The content of PSMs could be a primary candidate for becoming the prominence measure best meeting the criteria of general interest and public value. At the same time, the utmost care must be taken in appointing PSM content as the policy objective and not assigning blank privileges. First, there is the need to reflect on the inherent risks, because if “not deployed with care, … content prioritisation practices could do more harm than good to European democracy, human rights and pluralism” (CoE, 2021, p. 5). Furthermore, policy design will have to face the global and European realities of captured and biased PSMs (Dragomir & Soderstrom, 2022). States often orchestrate such PSMs as channels of government propaganda, which fuel both dis- and misinformation in specific contexts (Urbán et al., 2023) and ignore the democratic essence of their special mandate of serving the public. Similarly, without vigilant policy design, the downward trends of public disengagement with PSM, in general and specifically in undemocratic media settings, could accelerate.

Thus, resilient future policy on prominence must contemplate broader media and platform governance matters. At a minimum, the potential impact of content prioritisation on democracy and human rights (CoE, 2021) must be mitigated by appropriate governance mechanisms, whose design should give accountability a central role. Therefore, this paper investigates three interrelated research questions (RQs). RQ1: Which elements of platforms’ legal and policy accountability regimes correlate to the due prominence of PSM content online? RQ2: How do PSM’s public accountability schemes reflect on value creation via digital, platform-based delivery? RQ3: What role could accountability play in public-value-driven platform governance? This paper first lays out the theoretical groundwork for platforms’ and PSM’s accountability and identifies junctures relevant to media pluralism and the (potential) prioritisation of PSM content. Next, the paper uses a mixed legal and policy analysis method and a case study to examine accountability in the context of prominence online, following Bovens’s (2007) analytical model of accountability. Finally, the paper assesses the chances of and hindrances to online media pluralism and opens the forum for further discussion.

Conceptualising accountability for establishing due prominence

Due prominence is a matter of media governance, which concerns the ‘regulatability’ of platforms and the potential for prioritised dissemination of public value content. Prominence could also be a ‘reward’ measure for PSM in return for serving the public. In both cases, the underlying accountability mechanisms that platforms and PSM must obey are decisive for the realisation of policy objectives. Accountability is “a relationship between an actor and a forum in which the actor is obliged to explain and justify his conduct, the forum can pose questions and pass judgment, and the actor may face the consequences” (Bovens, 2007, p. 452). Accountability is a fundamental constituent of platforms and media governance “within which enforcing standards and fulfilling obligations is a reasonable expectation” (Bovens et al., 2014, p. 5). The role of enforcement (rules, norms and judgements) and the ability of the forum (regulatory authorities and other compliance bodies) to ask the right questions about the conduct of the actor (PSMs and platforms) are, therefore, essential in establishing the links between governance and accountability (Rhodes, 2007).

Several aspects of platforms’ accountability in the context of prominence need to be studied more closely. Content prioritisation is a form of positive discrimination of content via platforms’ curation techniques, to which inherent and evident risks of automated private censorship are attached (Pirkova et al., 2021). If such prioritisation is state-mandated, then government propaganda and the silencing of dissent could be the downsides of policy interventions. The impact on users’ exposure diversity (Helberger et al., 2018) is closely related to due prominence, i.e. the location of PSM content on platforms, but also to discoverability, the likelihood of content discovery and the curation of content (Mazzoli & Tambini, 2020). Platform accountability scholars have already raised some of these questions and triggered debates about appropriate accountability structures and mechanisms (Saurwein, 2019) and the links between algorithms and online harms (Saurwein & Spencer-Smith, 2021), with “a much broader ambit than international law in that it focuse[d] less on norms and more on responsibilities of actors for different aspects of the governance of the internet” (Kettemann, 2020, p. 128). Meanwhile, ‘procedural accountability’ was realised in hard law and legislation, which holds that platforms and regulators are required to divide responsibilities based on broad policy objectives and governance standards (Bunting, 2018a, 2018b; Cappello, 2021). However, there is little evidence that this accountability design was attentive to the call for due prominence.

Similarly, PSM’s ‘accountability’, ‘responsibility’ and ‘responsiveness’, and their influence on audience participation, form two sides of the same coin (Baldi & Hasebrink, 2007). Accountability also resonated with the ethos of public service broadcasting and the institutionalisation of PSM (Jakubowicz, 2003) within the digitally transforming, mediatised communicative context. ‘Public value creation’ is linked to accountability and, since the 2000s, the “instruments of control and accountability have become increasingly organised within a competition framework” (Van den Bulck, 2015, p. 80). This policy targeted PSMs as ‘market disturbants’ and ‘distortion creators’; thus, accountability was re-conceptualised to closely monitor PSMs for (potential) ‘wrong-doings’ using public value ex-ante tests (Donders & Moe, 2011). Even today, PSM accountability models and implementation are based primarily on legally inscribed administrative frameworks, strictly regulated institutional governance and formally mandated contact points and channels for interaction (Cabrera Blázquez et al., 2022). Hence, PSMs have generally struggled with meaningfully accountable performance (Collins, 2011), though “some PSM organisations have developed advanced downward accountability to their publics” (Lowe & Martin, 2014, p. 34). Trappel (2016) argued early on for PSM to have dialogue and conversations with the audience, utilising the affordances of the digital context. However, the PSM’s digital reality was less encouraging, and innovation remained mostly technology-centric, “with public broadcasters focusing primarily on the use of technological innovations to serve their own economic and market purposes” (Direito-Rebollal & Donders, 2022, p. 17). Clearly, there is a lack of detailed understanding of PSM’s public value creation in the platformised communication context or of accountable conduct vis-à-vis PSM users.

The investigation of platforms’ and PSM accountability for the due prominence of public value content online considered these varying, though conceptually interlinked, notions of accountability. The research focused on the normative aspects and used a combination of legal (Milosavljević & Poler, 2019) and policy analyses (Puppis & Van den Bulck, 2019). The study design enabled an interdisciplinary approach to explore and analyse the most directly relevant aspects of accountability. The primary interest resided in the paramount aspects of resilient future policy for due prominence online and the potential interplay between platforms’ and PSM’s accountability mechanisms as mitigation mechanisms ensuring media pluralism. The analysis entailed the perspectives of the audiences (users), PSM providers, the state (especially the national regulators) and the platforms to scrutinise the current accountability arrangements and evaluate their appropriateness. The analytical model followed Bovens’s (2007) scheme of accountability and reflected Lindberg’s (2013) dimensions (source and strength of control, and direction of relationship). Hence, our model included the following six pillars:

  1. Actors: Who is accountable? Who is designated as being responsible for the organisation’s accountability (Bovens, 2007, p. 457)?
  2. Forum: Which forum is to articulate public interest in content governance and represent media policy objectives on pluralism and diversity? On what premises and to whom? What is the political and social legitimisation and the judicial authoritativeness of the forum selected?
  3. Legitimacy: What is the social relation between the actor and the forum? How “to ensure that the legitimacy of governance remains intact or is increased” (Bovens, 2007, p. 464)?
  4. Obligation: What is the substance of accountability? What is the “obligation of the actor to explain and justify his or her conduct” (Bovens, 2007, p. 31), and in which dimensions (formal or informal; vertical, diagonal and horizontal) (Lindberg, 2013)?
  5. Compliance: How should the “coherent complex of arrangements and relationships” (Bovens, 2007, p. 465) be ensured? Monitored and controlled by whom?
  6. Consequences: What are the implications “brought upon the actor by the forum directly or indirectly” (Bovens, 2007, p. 452), are they sanction-based (in contexts of justified distrust) or trust-based (in contexts of justified trust) (Mansbridge, 2014)?

These pillars aligned with the call of European policy actors — specifically the Council of Europe — to set the principles for platforms in designing prominence measures (CoE, 2021). Thus, this study identified the nature and capacity of accountability embedded in laws and policies applicable to platforms and PSM which correlate to due prominence online, within this paper’s limitations: a Europe-based scope and a narrow focus on PSM content.

Case study: the Österreichischer Rundfunk (ORF)—Austria’s PSM

It was methodologically necessary to conduct an additional national case study, since PSM is a matter of national media policy in the European tradition and under EU law (Irion & Valcke, 2015), and the Amsterdam Protocol (2012) left it to the competence of the member states to define, organise and fund PSM. The Austrian case and the Österreichischer Rundfunk (ORF, Austria’s PSM) were suitable for scrutiny of accountability and its links to due prominence online. The ORF is a well-established PSM with a long history in Austria and is trusted by more Austrian citizens than the European average (EBU, 2022). The ORF is also representative in terms of market size and presence (Seethaler & Beaufort, 2022), with a meaningful — though legally constrained — online presence.3 Furthermore, the ORF News brand is Austria’s most trusted news source (Reuters Institute et al., 2022). The digital-online transformation of the ORF is well documented in the ORF Public Value ‘Transform’ Report Series4 and demonstrates the ORF’s efforts towards a ’new partnership’ with the public (Jakubowicz, 2013). Hence, the ORF case represents a PSM which delivers public value content aligned with national and international standards and brings news value to its audience in the digital online context. These factors could entitle the ORF to prominence privileges according to media pluralism policy objectives.

Accountability relevant to due prominence online: The composition of the body of research

Research on due prominence online needs a governance approach (Puppis, 2010), both conceptually and methodologically. The latest and most comprehensive research on media pluralism and diversity online in Europe, specifically on the prominence and discoverability of general interest content, approached the policy, legal and industrial aspects in parallel and reflected on the dynamic relationships between the constituting elements of due prominence (European Commission et al., 2022). Similarly, the accountability schemes relevant to prominence were necessarily embedded in such complex and interconnected mechanisms, “both formal and informal, national and supranational, centralised and dispersed, that aim to organise media systems” (Freedman, 2008, p. 14). Therefore, it was necessary to apply the governance approach to answer RQ1: ‘Which elements of platforms’ legal and policy accountability regimes correlate to the due prominence of PSM content online?’. The following section gives more details about the research sampling.

Statutory laws

Currently, platforms are not subject to directly applicable legal or regulatory measures concerning prominence (European Commission et al., 2022, p. 153). Thus, the sampling involved statutory legal acts with the most similar regulatory objectives and closest to the legal traditions of the country case study, Austria. The Communication Platforms Act [Kommunikationsplattformen-Gesetz] (KoPl-G, 2020) introduced administrative and public accountability via transparency reports and compliance vis-à-vis the forum KommAustria, the communication regulator of Austria. The only law in Europe that introduced statutory norms towards social media platforms, search engines and news aggregators on accountability for content prioritisation is the German Interstate Media Treaty (MStV, 2020). The Digital Services Act (Regulation 2022/2065; DSA, 2020) introduced accountability duties on content recommender systems (Art. 27), requiring online platforms to clearly explain in their ‘terms and conditions’ the main parameters used to recommend information to users and their options to modify or influence those parameters. The rules of the KoPl-G, MStV and DSA (Broughton Micova, 2021) are also relevant to exposure diversity and prominence (see the analysis in Table 1).

European legal and policy standards

Article 10 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR) guarantees the right to freedom of expression and its corollaries, media freedom and pluralism, and thereby provides, within the European legal context, the guiding principles and accompanying standards on prioritisation and prominence. The Council of Europe, which administers the ECHR, set standards concerning prioritisation and prominence in 2021 and 2022, respectively. The ‘Guidance Note on the Prioritisation of Public Interest Content Online’ (CoE, 2021) recognises the implications of ‘regimes of prominence’ for democracy and human rights, puts forward instructions on how to make public interest content more prominent, and recommends the introduction of new responsibilities for platforms and intermediaries. Meanwhile, the ‘Recommendation on principles for media and communication governance’ (CoE, 2022) provides more details on the guiding principles of due prominence. Both these CoE documents were added to the sample (see Table 1).

Cross-platform standards

There are currently no directly applicable industry standards or practices on prominence in Europe (European Commission et al., 2022, p. 50), but there are closely related accountability norms on countering disinformation attached to the ‘Code of Practice on Disinformation’ (European Commission, 2022). The Code introduces a novel prominence notion whereby platforms are required to prioritise authoritative information and guarantee the transparency of the relevant recommender systems. Also, the Ranking Digital Rights (RDR) initiative, which monitors the accountability of algorithmic content curation and recommendation, offers relevant accountability indicators (RDR, 2022) on ‘Algorithmic content curation, recommendation, and/or ranking systems’ (RDR, n.d., F12) (see Table 1 for assessments of both documents).

Private ordering/solo-regulation by platforms

Online platforms are not required to comply directly with legal or regulatory norms on the due prominence of PSM content. Therefore, their private orderings (Kettemann, 2020) on content curation and recommendation had to be studied individually, and the rules and policies analysed according to the different accountability schemes. These solo-regulatory pledges are inserted in their ‘terms of service’ and other content policies (Milosavljević & Micova, 2016). Three platform providers — Meta, TikTok and Twitter5 — operating four distinct social media platforms — Facebook, Instagram, Twitter and TikTok — were sampled for this paper. The selection criteria were: falling under the jurisdiction of the country case study, i.e. Austria, and more specifically of the KoPl-G6; compliance with the Austrian regulations (‘order verifications’)7; and effective use by the ORF for content dissemination.8

For Meta, the Corporate Human Rights Standards (Meta, 2021a), the Facebook (Meta, 2022d) and Instagram (Meta, 2022f) guidelines, the Content Ranking Policy (Meta, 2022b) and the Content Report (Meta, 2022c) were analysed in the context of accountability features on due prominence. TikTok’s approach to prioritisation and the underlying accountability framework was identified in the Community Guidelines (TikTok, 2022a), the For You feed policy on content recommendations (TikTok, 2019) and the latest Community Guidelines Enforcement Report (TikTok, 2022b). Twitter’s policies and self-determined rankings on (de)prioritisation were incorporated in the Twitter Rules (Twitter, 2022d), the Twitter Lists (Twitter, 2022a) and the public interest exception guide (Twitter, 2022c), while accountability appeared in the Twitter Transparency Center (Twitter, 2022e) and the Enforcement and Appeals (Twitter, 2022b) documents. Table 1 summarises the identified building blocks of solo-regulated accountability by platforms, indicating each case’s exact sources of information (with legal or similar references).

Table 1: Accountability schemes applicable to online platforms in Europe
 

|  | Actors | Forum | Legitimacy | Obligation | Compliance | Consequences |
| --- | --- | --- | --- | --- | --- | --- |
| **Statutory law** |  |  |  |  |  |  |
| KoPl-G | Communication platform providers (KoPl-G §1(2)) | NRA – KommAustria and RTR-GmbH Media Division (KoPl-G §8) | Protection of users on communication platforms against new forms of violence and hate on the internet (‘platforms’ responsibility’) | Organisational measures for the effective and transparent handling of certain illegal content (KoPl-G §3); transparency on due diligence obligations (prevention of illegal content, resources) (KoPl-G §4(2.2)); general reporting obligations (KoPl-G §4(1)) | Public reports, half-yearly or annual (KoPl-G §4); affected users directly informed (KoPl-G §3(1)–(3)); redress provided (KoPl-G §3(4)); responsible representative (KoPl-G §5) | Improvement order (KoPl-G §9); administrative fines (KoPl-G §10) |
| MStV | Media intermediaries (MStV §91) | Media authorities – Commission on Licensing and Supervision (MStV §105(10)) | Prevention of discriminatory blocking/de-ranking of specific journalistic-editorial offers | Transparency on the criteria of (non-)prioritisation of journalistic-editorial offers (MStV §93(1, 3)) | Public reporting and case-by-case assessments based on/against the reported criteria (MStV §93); designated/authorised representative (MStV §92) | Administrative offence/fines (MStV §115(46)) |
| DSA | Online platforms, with very large online platforms (VLOPs) distinguished (DSA §2(h), §25) | National Digital Services Coordinator (DSA §67), European Board for Digital Services (DSA §47) and the European Commission (DSA §51) | (Re)balanced responsibilities of users, platforms and public authorities according to European values, placing citizens at the centre | Transparency on systemic risk assessment, including media pluralism (DSA §26–27); transparency of recommender systems (DSA §29); transparency in online advertising (DSA §24) | Public transparency reports every six months (VLOPs) (DSA §13, 33(1)); data sharing with vetted researchers (VLOPs) (DSA §31(2)); audits (VLOPs) (DSA §28(3), 33(2)); compliance officer (DSA §32); codes of conduct (Union level) (DSA §35) | Penalties set by EU member states (DSA §42(1)); penalties by the EC (DSA Preamble 100, §59) |
| **European legal and policy standards** |  |  |  |  |  |  |
| CoE 2021 | Platforms and intermediaries (CoE 2021, 25) | Independent national media regulatory authorities; courts (human rights and rule-of-law compliance; appeal by platforms) (CoE 2021, 25 v) | Mitigate the process of digital fragmentation; restore trust in public information; promote media diversity and pluralism; advance ethics of truth-seeking and open deliberation (CoE 2021, 4–5) | Avoiding exploitation for censorship or propaganda (CoE 2021, 7–8); avoiding commercialisation and the perpetuation of inequalities (CoE 2021, 9–10); safeguarding human rights (freedom of expression, right to private life and data protection) (CoE 2021, 25 iv–v); openness and inclusiveness; transparency about opt-out options (CoE 2021, 25 i–vi) | Reporting on opt-out rates; voluntary audit of prioritisation (CoE 2021, 25 v, vii) | No reference to consequences |
| CoE 2022 | The media and platforms (CoE 2022, EM 13) | States; independent media and/or platform regulatory authorities; independent third parties and other external experts (CoE 2022, EM 13) | Guaranteeing the discoverability, prioritisation and prominence of quality journalism (CoE 2022, EM 13) | Transparency, explainability and accountability of algorithmic systems for content dissemination (data processing, criteria of selection); transparency about the equal treatment of content (non-discrimination) and about opt-out options from personalisation (CoE 2022, EM 13) | Monthly and annual reporting (views of public interest content compared with other content; comparison of the viewing of prioritised and non-prioritised content); reporting duties on algorithmic content curation and prioritisation (CoE 2022, EM 13) | No reference to consequences |
| **Cross-platform standards** |  |  |  |  |  |  |
| EU Code of Practice on Disinformation 2022 | Platforms (signatories) | Permanent Task Force (signatories, EDMO, ERGA, EC and EEAS) (EC Code on Disinformation, 2022, 37) | Countering disinformation | Transparency on the prominence of authoritative information; transparency of recommender systems (prioritisation); establishing and maintaining the common Transparency Centre website (EC Code of Practice on Disinformation, 2022, 18.1, 19, 34) | Monitoring (Task Force, ERGA, EDMO, EC); reporting on Service Level Indicators (SLIs) and Qualitative Reporting Elements (QREs); audits (by VLOPs) (EC Code on Disinformation, 2022, 40, 44) | Indirect (revelation of non-compliance) |
| RDR 2022 | Tech companies | — | Safeguarding fundamental rights | Disclosure of the use of algorithmic systems to curate, recommend and/or rank content; disclosure of the variables used by algorithms; users’ control over variables; opt-in/opt-out by users (RDR F12) | — | Reputational (rankings and scorecards) |
| **Solo-regulation** |  |  |  |  |  |  |
| Meta | Meta | No direct forum; indirect: internal content moderators, Meta’s Human Rights Director, the Oversight Board; indirect: Irish NRA (BAI) (AVMSD), not applicable yet | International human rights standards (United Nations Guiding Principles on Business and Human Rights) (Meta, 2021a) | Avoiding low-quality, objectionable, particularly sensitive or (for younger viewers) inappropriate content (Meta, 2022d; Meta, 2022e); avoiding the risk of harm (physical, emotional and financial harm, or a direct threat to public safety) (Meta, 2022a); prioritisation of “newsworthy” content, with criteria of “special value” (e.g. an imminent threat to public health or safety), country-specific circumstances, politically relevant speech and the country context (free press) (Meta, 2022a); personalisation (machine learning (ML) ranking): content integrity, scoring/ranking, contextualisation (Lada et al., 2021); ranking by inventory, signals, predictions and relevance (Meta, 2022b) | Transparency reports (Meta, 2022c); indirect: complaint to BAI (AVMSD), not applicable yet | (Eventual) policy/process adjustment (solo-regulation); indirect: AVMSD-based sanctions, not applicable yet |
| TikTok | TikTok | EMEA Trust & Safety Hub; Transparency and Accountability Centers (access only for policy, content safety or security experts); indirect: UK NRA (Ofcom) (AVMSD) | No reference to legitimacy | Prioritisation of ’safety, diversity, inclusion, and authenticity’ (TikTok, 2022a); personalisation; interrupting repetitive patterns; diversity (TikTok, 2019) | Community Guidelines Enforcement Report (TikTok, 2022b); Transparency Center (selected/approved experts examine and verify TikTok’s practices); complaint to Ofcom (AVMSD) | (Eventual) policy/process adjustment (solo-regulation) |
| Twitter | Twitter | No direct forum; indirect: Trust & Safety team (Twitter, 2022e); indirect: Irish NRA (BAI) (AVMSD), not applicable yet | An open internet (free and secure) (Twitter, 2022d) | Public interest exception for content that would otherwise violate the Twitter Rules (tweets from elected and government officials): labelling and users’ choice (placing a tweet behind a notice) (Twitter, 2022c); limiting engagement (no likes, retweets, sharing or algorithmic recommendation) (Twitter, 2022c); enabling users’ choice (Twitter, 2022a) | Twitter Transparency Center (Twitter, 2022e); tweet-, DM- and account-level enforcement (Twitter, 2022b); indirect: complaint to BAI (AVMSD), not applicable yet | (Eventual) policy/process adjustment (solo-regulation); indirect: AVMSD-based sanctions, not applicable yet |

Key: AVMSD – Audiovisual Media Services Directive; CoE – Council of Europe; DSA – Digital Services Act; EC – European Commission; EDMO – European Digital Media Observatory; EEAS – European External Action Service; ERGA – European Regulators Group for Audiovisual Media Services; EU MSs – European Union Member States; KoPl-G – Communication Platforms Act (Kommunikationsplattformen-Gesetz); MStV – Medienstaatsvertrag (Interstate Media Treaty); NRA – National Regulatory Authority; RDR – Ranking Digital Rights; VLOPs – Very Large Online Platforms.

Which elements of platforms’ legal and policy accountability regimes correlate to the due prominence of PSM content online (RQ1)?

The sample of platforms’ legal and policy accountability requirements was analysed according to the pillars of the analytical model, namely on Actors, Forum, Legitimacy, Obligation, Compliance and Consequences, through the lens of due prominence online. The inquiry focused on how platforms’ accountability was rendered towards prioritising content with public value, specifically PSM content, and how and to what extent accountability schemes aligned with the complex issues of media pluralism online.

Actors

The statutory legislative acts — the KoPl-G, the MStV and the DSA — failed to address any criteria for prioritising PSM content. The laws assessed online platforms solely on their market-power capacities — size, user base (consumers) and sales revenues — without reflecting on their ‘opinion-power’ (Helberger, 2020). This is a significant flaw in the statutory design and hence a missed opportunity to mitigate the structural imbalances between PSM and dominant platforms. Other than the MStV, none of the legal acts encompassed the ecosystem of online digital media, and even the MStV fell short of reflecting on interdependencies or interplay among the addressees of the norms. Statutory silos characterised the law-making approach instead of governance-based, holistic interventions.

Similarly, the cross-platform standards exhibited a restrictive interpretation of accountability. The commitments on the responsibilities of platforms in countering disinformation were formulated in isolation and without meaningful reflection on PSM content as the possible antidote. The Council of Europe’s standards (CoE, 2021; CoE, 2022) showcased the most advanced and nuanced understanding of digital media and of the role of platforms’ accountability within it, addressing the “deep structural imbalances between content providers and dominant platforms” (CoE, 2021, p. 4) and raising the necessity of prioritisation.

On the solo-regulatory level, all four sampled policies revealed some level of platforms’ assurances about content ranking, prioritisation and curation according to “what might be most valuable to users” (Meta, 2022b), but without any meaningful explanation of how such ‘value’ was approached or which criteria of value creation were applied. None of the individual platforms’ policies acknowledged media pluralism objectives.

In sum, the conceptualisation of Actors within platforms’ actual and proposed accountability was mainly ignorant of public policy on media pluralism. The laws and policies singled out platforms as accountability actors in isolation and did not address their role in delivering content with public value. This approach not only failed to counter platforms’ opinion-power but also foreclosed any chance of regulating them as potential contributors to pluralism online.

Forum

The statutory laws selected platforms’ accountability fora with a liability-based approach and an administrative focus. In most cases, the legal acts entitled the national regulatory authorities (NRAs) to require platforms to be accountable; however, these fora lacked a mandate to enforce public policy on due prominence. The standards set by the Council of Europe (CoE, 2021; CoE, 2022) offered more advanced forum constructs and emphasised the role of ‘empowered users’, the online audiences.

Cross-platform policies — especially the EU Code of Practice on Disinformation — established accountability fora involving multiple stakeholders, including media organisations such as PSMs. Meanwhile, none of the platforms designated any forum for public accountability.

Arguably, the fora of platforms’ accountability lacked the mandate for policy implementation on media pluralism. The design of the accountability procedures was not appropriate for scrutinising public value creation in general, or the prominence of PSM content in particular. Again, this was a major flaw in potentially countering platforms’ powers.

Legitimacy

The primary objective of the statutory norms on regulating platforms was to counter online harms. Although the DSA listed media pluralism as a risk category that VLOPs must assess and mitigate, only the MStV set further policy objectives on safeguarding the democratic public sphere. Generally, the legitimisation of regulating platforms remains isolated and mostly connected to the harms of hate speech, disinformation and privacy breaches.

Meanwhile, the Council of Europe’s standards (CoE, 2021; CoE, 2022) are more ambitious in their objectives and offer sufficiently inclusive policy entitlements for states’ interventions with platforms on due prominence. However, the cross-platform standards defined the legitimacy of regulation narrowly, and the Code of Practice on Disinformation — even after the 2022 update — only focuses on the need for countering online disinformation and does not enhance the policy potential for prioritising public value information, such as PSM content (Commitment 19).

Platforms’ policies did not address the legitimacy of their private orderings and only vaguely cited their commitment to international standards on safeguarding human rights (Meta, 2021a).

The legitimacy of platforms’ accountability regarding due prominence online remained mostly unaddressed. The statutory laws briefly mentioned pluralism objectives but without establishing corresponding accountability mechanisms. Other than the Council of Europe, no institution made legitimising arguments for rendering platforms accountable for their capacity to steer public discourse online or for their capability to enforce public policy on pluralism and diversity.

Obligation

The statutory acts imposed formal and vertical obligations of legal accountability and overall strict norms on transparency, generally in the form of procedural transparency duties and the publication of various reports. These reports enable the forum to monitor and evaluate the quality of accountability and compliance. Also, platforms must be transparent about due diligence, such as systemic risk assessment (DSA), including media pluralism. The MStV pioneered a novel transparency duty and applied the criteria to the (non-)prioritisation of journalistic-editorial offers. This rule provided content providers — including PSM — “at least with an estimate of how certain changes in their algorithms or homepages may have an impact on the visibility of certain content” (European Commission et al., 2022, p. 15). Importantly, the transparency rules of the DSA on platforms’ recommender systems are expected to impact platforms’ accountability concerning prioritisation. According to the DSA, VLOPs will have to indicate in their publicly accessible terms and conditions the “main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters”. Moreover, users will have to be informed of (and offered) non-personalisation, i.e. recommendations not based on profiling.
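To illustrate what such a disclosure duty could translate into, the sketch below models a hypothetical, simplified machine-readable statement of a recommender system's 'main parameters' together with a non-profiling option. The system name, field names, weights and the `feed_options` helper are illustrative assumptions, not an actual platform disclosure or a format prescribed by the DSA.

```python
import json

# Hypothetical disclosure of a recommender system's "main parameters",
# in the spirit of the DSA's terms-and-conditions transparency duty.
recommender_disclosure = {
    "system": "example-feed-ranker",  # illustrative name, not a real product
    "main_parameters": [
        {"name": "predicted_engagement", "weight": 0.6,
         "description": "Likelihood the user interacts with the item"},
        {"name": "recency", "weight": 0.3,
         "description": "How recently the item was published"},
        {"name": "source_diversity", "weight": 0.1,
         "description": "Variety of sources shown in one session"},
    ],
    "user_options": {
        # Users must be able to modify or influence the main parameters ...
        "adjustable_weights": ["recency", "source_diversity"],
        # ... and be informed of (and offered) a non-profiling alternative.
        "non_profiling_feed": "chronological",
    },
}

def feed_options(disclosure: dict) -> list[str]:
    """List the feed choices a user could select, including the
    non-personalised (non-profiling) alternative."""
    return ["personalised", disclosure["user_options"]["non_profiling_feed"]]

print(json.dumps(recommender_disclosure, indent=2))
print(feed_options(recommender_disclosure))
```

The point of the sketch is that a disclosure in this structured form would let a forum, a researcher or a PSM verify whether and how public value content is weighted, rather than relying on narrative transparency reports alone.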

On the European policy level, the Council of Europe standards on platforms’ governance (CoE, 2021; CoE, 2022) are meticulously detailed on how platforms are to meet accountability requirements. Transparency, explainability and accountability of algorithmic systems for content dissemination are crucial, while rules on transparency about users’ opt-out options from personalised content curation are similarly important.

Cross-platform policies mostly prefer transparency reporting. The Code of Practice on Disinformation requires platforms to report on how they have ensured the prominence of authoritative information and in which way they have made their recommender systems and prioritisation practices transparent. The RDR initiative expects platforms to disclose the use of algorithmic systems to curate, recommend and rank content, including the variables used by algorithms and users’ control options over those variables.

Platforms have made several solo pledges towards their users that are directly relevant to content prioritisation. Meta vows to avoid low-quality, objectionable, particularly sensitive or inappropriate content (Meta, 2022d) and promises to prioritise ‘newsworthiness’ according to a set of criteria, including the ‘special value’ of the content, the country-specifics of politically relevant speech and the country’s press freedom context (Meta, 2022a). Also, Meta puts forward ranking content based on inventory, signals, predictions and relevance (Meta, 2022b). Similarly, TikTok prioritises content based on safety, diversity, inclusion and authenticity (TikTok, 2022a). Twitter promises to enable users’ choice in prioritisation (Twitter, 2022a) and reserves the right to make public interest exceptions for content that would otherwise violate the Twitter Rules (Twitter, 2022c). However, none of the platforms entered (voluntarily) into any regulatory scheme, i.e. industry-wide self-regulation, which could control the realisation of their individual, solo pledges. Thus, platforms’ current accountability for content prioritisation — performed in the form of regulation by the platforms (Gillespie, 2018) — is, at best, a collection of promises without verification.

Obligations on accountability are thus the clearest point of association with (potential) due prominence online. Platforms’ legally mandated commitments could offer entry points for future policies on the positive discrimination of public value and PSM content. Risk management on media pluralism under the implementation of the DSA could make a breakthrough at the EU level.

Compliance

Platforms’ compliance usually entails administrative reporting duties. Platforms report to the authorities and regulators through transparency reports. The MStV is highly relevant to due prominence because it compels platforms to report case-by-case as to how they have applied the “criteria that serve[d] as the basis for the decision as to whether the content is accessible to a media intermediary and whether it remain[ed] that way” (MStV § 93 (1)). The DSA’s compliance scheme on data sharing with vetted researchers for the “purpose of conducting research that contributes to the identification and understanding of systemic risks” (DSA §31 (2)) could contribute to a systemic assessment of media pluralism risks.

The European policy standards similarly prioritised reporting tasks. Importantly, the Council of Europe recommends that platforms should inform the public about users’ opt-out rates, audited prioritisation (CoE, 2021), the views of public interest content compared with other content, the viewing of prioritised content and non-prioritised content and algorithmic content curation and prioritisation (CoE, 2022, Explanatory Memorandum 13). If this information is available to PSM, they could interrogate platforms about their conduct meaningfully. In parallel, the Council of Europe advises PSM to actively seek stakeholders' views and opinions and select the appropriate accountability channels.

The cross-platform policies address compliance without any links to content prioritisation. Platforms usually comply with accountability by setting up online ‘transparency centres’ and dedicated websites for aggregated reporting about their conduct, but without commitments to disclosing how public value content is curated.

Legal and policy measures on platforms’ compliance mostly disregard their role in steering public communication and the impact of their ‘opinion power’ (Helberger, 2020) on media pluralism. Only the Council of Europe’s recommendations suggest that compliance could offer openings towards due prominence for PSM content.

Consequences

The statutory rules on platforms are sanction-based and formalised, setting out fines and similar penalties as the first and foremost consequences of non-compliance with the law. No possibilities for dialogical law enforcement have been enshrined. The fora of accountability — the NRAs and the European Commission — were not mandated to use sanctions in the form of positive interventions, such as content curation duties on platforms.

The European standards (CoE, 2021, 2022) are silent about consequences in cases of platforms’ non-compliance, and industry policies similarly leave this question open.

Platforms’ private ordering does not indicate any internally or externally validated processes for facing consequences of non-compliance or how they would eventually adjust to sanctions or recover from failures.

To sum up, the accountability of platforms in facing the consequences of misconduct does not include positive (content) obligations, which is a crucial policy-design failure and a missed opportunity for media pluralism online.

PSM and accountability

A similar governance approach was necessary to investigate PSM accountability while exploring the potential interactions and possible interplay with platforms’ governance and the opportunities for due prominence online. Within the context of the country case study on Austria, the (1) statutory law(s) on PSM; the (2) corresponding European legal and policy standards; the (3) industry policies and practices (both PSM and other quality media); and (4) the PSM practice (the ORF case) were analysed according to the main accountability analytical pillars. The following summary introduces the individual elements of the body of research.

Statutory law

The ORF’s governing legal act, which dates to 1984 (ORF-G), stipulates the organisation, the public mandate (remit) and the functioning of the ORF. The law also reflects the evolution of the ORF from PSB to PSM, including the transforming notion of accountability. The ORF-G exemplifies administratively designed, top-down accountability alongside a parallel attempt at incorporating some elements of public responsiveness (see Table 2).

European legal and policy standards

Under the framework of Article 10 of the ECHR, PSM is a matter of media pluralism, entailing a positive obligation of the state to put in place appropriate legislative and administrative guarantees (Berka & Tretter, 2013).9 On this basis, the Council of Europe set the primary standards for ensuring media pluralism, thus safeguarding PSM in Europe via successive recommendations (European Audiovisual Observatory, 2022). The latest and most directly relevant recommendation on governance explicitly set norms on the qualities of PSM (Recommendation CM/Rec(2012)1) (see Table 2).

Industry standards

There are at least two closely related industry initiatives on PSM accountability that resonate with due prominence online. The first is by the European Broadcasting Union (EBU), the most significant alliance of PSM, with 112 member organisations in 56 countries, which regularly identifies major issues that impact PSM.10 In 2015, the EBU’s report on ’Assessing Transparency’ developed a methodology for assessing PSM transparency (EBU, 2015). One of the central features of the report is the ‘Transparency Index’, a tool for the self-evaluation of PSM on both content and format (Table 2).

The other notable industry practice is the Journalism Trust Initiative (JTI, 2019). The JTI started as a collaborative standard-setting process according to the European Committee for Standardization guidelines, aimed at developing and implementing indicators for the trustworthiness of journalism by “translating existing professional norms into machine-readable code” (JTI, n.d.). The indicators were developed into standards in 2019; these constitute, to date, the most comprehensive and widely recognised industry agreement facilitating online prominence and laying down norms for platforms to prioritise public value content (Table 2).
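A minimal sketch of what 'machine-readable' professional norms could look like in practice is given below. The field names, the eligibility rule and the outlet domain are hypothetical illustrations, not the actual JTI schema, which defines its own indicators and certification process.

```python
# Hypothetical machine-readable trust signals for a news outlet, loosely
# inspired by the JTI idea of encoding professional norms; the fields below
# are illustrative only and do not reproduce the actual JTI standard.
outlet_signals = {
    "outlet": "example-psm.example",
    "identity_transparency": True,     # ownership and funding disclosed
    "editorial_accountability": True,  # corrections policy, complaints channel
    "independent_certification": True, # audited by an external certifier
}

def eligible_for_prominence(signals: dict) -> bool:
    """A platform-side rule of thumb: prioritise only outlets whose
    machine-readable signals show certified, accountable journalism."""
    required = ("identity_transparency",
                "editorial_accountability",
                "independent_certification")
    return all(signals.get(key, False) for key in required)

if eligible_for_prominence(outlet_signals):
    print("Outlet qualifies for prioritisation rewards (due prominence).")
```

The design point mirrors the JTI's logic in Table 2: encoding the norms makes compliance checkable by a recommender system, so prominence can be granted as a verifiable 'prioritisation reward' rather than a blank privilege.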

PSM practice: the ORF case

The ORF’s public value concept captures the momentum of the PSB-to-PSM transformation. Since 2012, the public value reports (PVRs) have informed the public about the ‘quality dimensions’ (Individual, Social, Nation, International and Corporate value) of serving the public.11 Beyond reporting, the ORF also established fora for public accountability, such as the DialogForum12 and the #Next Generation series13, under the umbrella of the ‘TransFORM’ process (ORF, 2022) (see Table 2).

Table 2 overviews all the sampled accountability schemes applicable to PSM and the ORF.

Table 2: Accountability schemes applicable to PSM and the ORF
 

|  | Actor | Forum | Legitimacy | Obligation | Compliance | Consequences |
| --- | --- | --- | --- | --- | --- | --- |
| **Statutory law** |  |  |  |  |  |  |
| ORF-G | PSB/ORF | Federal Chancellor (and the National Council and the Federal Council) (ORF-G §7); national regulatory authority (KommAustria) (ORF-G §5, 7, 14, 36); Foundation Council (ORF-G §4a(2), 21); Audience Council (Quality Committee) (ORF-G §28); the public (ORF-G §7 and other reporting); Court of Audit (ORF-G §31a); Auditing Commission (ORF-G §40) | Core public mandate and further special mandates (ORF-G §4, 5) | Transparency in the constant evaluation of quality criteria (ORF-G §4(3)); audience satisfaction assessment (quality assurance system) (ORF-G §4a(5)); control (special mandate) of online services (ORF-G §4e); service concept evaluation/approval (ORF-G §5a); control over advertising agreements (ORF-G §14, 5b); audit compliance (ORF-G §40) | Annual reporting to all fora; detailed reporting on the fulfilment of the mandate(s) (ORF-G §7, 11, 12); legal supervision by KommAustria (ORF-G §36); service concept submission to KommAustria and publication online (ORF-G §5a(2)); audit report (ORF-G §39); financial reporting (§277, 280 of the Business Code) | Annulment of decision(s) and public announcement (ORF-G §37(2, 4)); administrative penalties (ORF-G §38); recovery procedure (ORF-G §38a); civil law liability of the Director General (ORF-G §22(4)); dismissal of the Director General (ORF-G §22(5)) |
| **European legal and policy standards** |  |  |  |  |  |  |
| CoE 2012 | PSM | Normative categories of ‘public’ representation: the state (government, parliament, independent regulatory and supervisory bodies); the public (audience, citizens, participants); civil society groups (CoE 2012 A. II. 28) | Public mandate/remit | Setting up a comprehensive accountability governance scheme (CoE 2012 A. II. 30); structured relationships with the ‘public’ (CoE 2012 A. II. 39); responsiveness (active and mandatory engagement) (CoE 2012 A. II. 45) | Actively seeking the views and opinions of stakeholders (CoE 2012 A. II. 45); accountability channels and fora (CoE 2012 A. II. 46) | Good governance |
| **Industry standards and practices** |  |  |  |  |  |  |
| EBU | PSM | Citizens; stakeholders | PSM to reinforce their legitimacy | Transparency Index (corporate, financial, remit and social transparency) (EBU, 2015, pp. 11–13) | Self-assessment tool provided by the EBU | Trust-building |
| JTI | All media outlets producing and disseminating journalistic content | Citizens, advertisers and regulators | Support for the universal, individual freedom of opinion through access to information and independent, pluralistic media (JTI I.) | Declaration on Ethical and Professional Journalism (JTI II.); JTI Standards on ‘Identity and Transparency’ and ’Professionalism and Accountability’ (JTI Sections A–B); use of machine-readable language and format (JTI IV.) | Self-assessment (JTI app); disclosure; independent certification | Prioritisation rewards (due prominence online, benefits, subsidies) |
| **PSM practice** |  |  |  |  |  |  |
| ORF | ORF/PSB/PSM | Political forum (Chancellor); administrative forum (KommAustria); external stakeholders; internal stakeholders | Value creation (five dimensions: Individual, Social, Nation, International and Corporate value) | According to the Public Value Quality Dimensions (ORF-G, ORF regulations, ORF guidelines and ‘societal demands’) | Annual Reports (special sections about online PSM delivery); PVRs; DialogForum; internal fora (#Next Generation) | Trustworthiness; relevance to citizens |


Key: CoE – Council of Europe; EBU – European Broadcasting Union; JTI – Journalism Trust Initiative; PSB – Public Service Broadcaster; PSM – Public Service Media; PVR – public value report; ORF – Österreichischer Rundfunk; ORF-G – Bundesgesetz über den Österreichischen Rundfunk.

How do PSM’s public accountability schemes reflect on value creation via digital, platform-based delivery (RQ2)?

Similar to the previous inquiry, PSM accountability was studied according to the pillars of the analytical model, namely on Actor(s), Forum, Legitimacy, Obligation, Compliance and Consequences. The focus was on the concepts, policies and procedures of rendering PSM accountable to the public: How could accountability models be linked to the evaluation of the online performance of the PSM by the users and the public?

Actor(s)

The laws on PSM and the ORF regulated accountability according to liability categories in the ORF’s conduct. The analysis could not identify norms committing the ORF to its online audiences or regulating interaction with them. However, the ORF is legally bound to conduct annual audience satisfaction assessments, which could build a bridge to accountable conduct online. Since the design is up to the ORF, the online engagement of its audience, feedback and interaction could be essential indicators of the ORF’s performance and serve as normative ‘quality checks’.14

Among industry standards, the JTI was most reflective of the double-sided nature of public value content provision and recommended systemic collaboration among all actors involved in content prioritisation. The ORF, in practice, approached accountability mostly with a top-down administrative perspective but without systemic feedback relevant to the online communicative environment.

Forum

The ORF law designated multiple fora to hold the ORF accountable, including political (the National Council) and regulatory (the NRA and the Auditing Commission) fora, and also established institutionalised public representation (the Audience Council), but no direct encounters with the public. Similarly, the PSM-relevant standards only suggested the designation of a highly formalised forum for supervision and formal accountability, without the actual involvement of the audience. The ORF’s practice did not showcase systemic exchanges with the public either.

None of the fora for PSM accountability were suited to a responsive or otherwise dynamic accountability dialogue with the public. Neither the conceptualisation of the forum nor the design of the accountability procedures allowed for implementation in the online, hyper-active and reflective communication context. However, PSM online audiences could legitimately call for the opportunity to engage with digital PSM content, enjoying the affordances of social media: to debate, support, complain and criticise.

Legitimacy

The ORF-G defined the ‘core public mandate’ in an extensive, inclusive and broad manner but remained silent — even in the case of the special mandates15 — about the distinguished role and responsibility of the ORF in the digitally transforming communicative context. Meanwhile, the ORF’s self-account interprets the ‘Core Public Mandate’ according to the distinct public value creation dimensions (Individual, Social, Nation, International and Corporate value) but fails to translate them into specific actions. Thus, none of the legal acts contained norms on the preferential treatment of PSM content online.

However, the Council of Europe policy guidance on PSM governance directly linked the criteria of ‘transparency and openness’ (Recommendation CM/Rec(2012)1) with ‘openness and inclusiveness’ (CoE, 2022, iv) and recommended standards to assess public interest content. These standards could create a base for legitimising public interventions in platform-based communication for the due prominence of PSM content. While the EBU missed the opportunity to connect PSM transparency with seeking broader legitimacy for public service content online (EBU, 2015), the JTI (2019) successfully put forward arguments for regulation in favour of public interest content online to preserve the ‘health’ of digital societies.

Obligation

The legal character of PSM accountability is procedural; it is neither interactive nor dynamic. The ORF reports on ‘quality assurance’ and audience satisfaction in a highly formalised and strictly regulated manner but without direct public engagement. Accordingly, the ORF kept “records of ORF’s media performance, value and benefits for the audience” across “five quality dimensions and eighteen performance categories” (ORF, n.d.h), but without dynamic communication with its digital audience.

The PSM-directed Council of Europe standards (Council of Europe, 2012) recommended comprehensive accountability structures and ‘responsive' (active and mandatory) relationships with the public. Complementarily, the European Broadcasting Union's ‘Transparency Index' (EBU, 2015) enables PSM to self-assess the appropriateness of their accountability arrangements. Hence, the standards created eligible frameworks through which PSM could interact with online audiences and receive responses on their digital performance, which could, in turn, legitimise due prominence privileges.

Compliance

A PSM’s compliance with accountability is mostly administrative and only takes the form of reporting. The ORF’s Annual Report is the main vessel of accountability, informing the state representatives and the public about the ORF’s conduct. In the case of online services, the ORF seeks the approval of KommAustria in the form of a ‘service concept’ but without the obligation for direct audience feedback. The special section of ORF’s Annual Report about ‘ORF.at’ provides both quantitative data on the use of the ORF.at network (user base, visits, page impressions, live-streaming and video-on-demand) and qualitative information on topical areas, stories and special features on news.ORF.at (ORF, 2022, p. 146), but lacks any references to the engagement of the ORF online community.

Similarly, the European policy standards set forth mainly reporting tasks. Importantly, the Council of Europe recommends that platforms inform the public about users' opt-out rates and audited prioritisation (CoE, 2021), as well as about the viewing of public interest content compared with other content, the comparative viewing of prioritised and non-prioritised content, and algorithmic content curation and prioritisation (CoE, 2022). If this information were available to PSM, they could argue more effectively for due prominence.

Consequences

The consequences of ORF’s non-compliance with the public mandate include mostly sanctioning, such as the annulment of ORF’s decision(s), administrative penalties against the ORF and the reimbursement of (potential) unlawful enrichment, but no systemic mitigation possibilities. The ORF’s practice did not record any procedures for recovery or learning from their failures. The Council of Europe and the EBU standards did not address the consequences. Thus, none of the applied or envisioned consequences utilised the potential of digitally enabled dialogue and interactive mitigation with the public.

Public value delivery and accountability: chances of and hindrances to media pluralism online

Content of public interest, and specifically PSM-produced content, must find its ‘way' to audiences and provide citizens with trusted information within platformised, privately ordered, under-regulated contexts that lack policy governance. These circumstances compel PSM to demonstrate public value creation online, curated and eventually recommended by platforms and supervised by the legally mandated accountability fora of PSM and platforms. Within this triangle, all PSM are formally and informally required to be accountable on several fronts and in multiple relations in order to remain relevant and trustworthy. Therefore, this paper has investigated the role accountability could play in public-value-driven platform governance and focused on the intersections of platforms' accountability regimes and PSM public accountability (RQ3).

Platforms' legal and policy norms usually omit the digital communicative ecosystem as a (potential) space of democratic discourse and counteraction. The laws neither reflect the platforms' opinion power (Helberger, 2020) nor connect policy objectives on countering ‘online harms' to potential positive obligations to prioritise public value content, such as PSM content, and prefer sanction-based accountability. Consequently, they mitigate online harms at best but fail to create systemic and robust countervailing powers across the varied dimensions of platforms' governance. Current and recent legislation has failed to secure the prioritised dissemination of public value content via accountable, digitally enabled dialogue and interactive mitigation with the public.

Similarly, the PSM-relevant accountability concepts fall short of (re-)considering PSM within open, accessible, inclusive and interactive governance that enables and encourages democratic participation and active control by PSM users. The top-down administrative accountability of current practice needs to become more responsive to the possibilities of participatory value creation in the digital environment. Due prominence should be organised according to the capacities and capabilities of PSM, along with novel models of accountability and the uptake of corresponding instruments. These “need to be created bottom-up in a participatory manner, in close liaison with broad stakeholder groups” and within an “organisational culture and practices to nurture accountability” (Sorsa, 2019, p. 147).

Forum selection for accountability needs to be more extensive and open to contestation. The forum, usually the national regulator, was appointed without regard to its preparedness or ability to ‘ask the right questions' or to assess the public value of content delivered on platforms or of content curation. Transparency reporting remains the ultimate accountability arrangement, but without recourse to positive obligations on media pluralism. No accountability procedures for platforms or PSM incorporate the systemic re-channelling of users' and audiences' feedback or guarantees of reflection on users' exposure to PSM content or its public value. Similarly, sanctions are (almost) the only consequences of breaches of accountability, without systemic intrusion into platforms' corporate ‘black boxes' of algorithmic content curation or into PSM's non-provision of public value. In sum, these lost opportunities have long-lasting and detrimental effects on the future of media pluralism online.

The findings of this paper should inform current and upcoming policy debates on the due prominence online of content in the public interest, including PSM content, and offer a range of legal and regulatory avenues in which accountability needs redress. Platforms must be required to give a meaningful, enforceable and systemic account of their content prioritisation and curation policies and practices, with a focus on public interest content. Furthermore, PSM public value creation and its legitimisation procedures demand differently and discursively designed accountability frameworks. The European standards and the national laws and regulations should lessen the administrative accountability obligations on PSM and instead enable and require systemic, dynamic and dialogic accountability. These are minimal but inevitable first steps towards the realisation of media pluralism and diversity online.

The upcoming European and national legislative and regulatory processes could and should address the current legal, policy and practical incongruities. The transposition and implementation of the due prominence rules for audiovisual media service providers foreseen by the revised AVMSD are ongoing, and the first results are promising regarding the realisation of policy objectives (Cappello, 2022). The recently proposed European Media Freedom Act (EMFA, 2022) was expected to put forward norms on due prominence online because the “political will is building, the evidence is growing, and the EU has the opportunity to offer such a framework for a new social contract through its interlinked proposals – namely, the DSA, DMA and EMFA” (European Commission et al., 2022, p. 190). However, the first draft of the EMFA missed this opportunity and remained silent about possible future policies, and none of the compromise proposals in the European Parliament touched upon the prioritisation of public value content (European Parliament, 2023). Perhaps the DSA implementation and the PSM regulatory reforms across Europe could re-open the necessary policy debates and put due prominence back on the policy agenda. The accountability deficiencies unearthed in this paper and the recommended actions could serve as the basis for a coordinated approach to media policy interventions at both the European and national levels.

References

Amsterdam Protocol. (2012). Protocol (No 29) on the system of public broadcasting in the member states (pp. 312–312). European Union. https://eur-lex.europa.eu/eli/treaty/tfeu_2012/pro_29/oj

Baldi, P., & Hasebrink, U. (Eds.). (2007). Broadcasters and citizens in Europe: Trends in media accountability and viewer participation. Intellect Books.

Berka, W., & Tretter, H. (2013). Public service media under article 10 of the European Convention on Human Rights [Study]. European Broadcasting Union. https://www.ebu.ch/files/live/sites/ebu/files/Publications/Art%2010%20Study_final.pdf

Bovens, M. (2007). Analysing and assessing accountability: A conceptual framework. European Law Journal, 13(4), 447–468. https://doi.org/10.1111/j.1468-0386.2007.00378.x

Bovens, M., Goodin, R. E., & Schillemans, T. (Eds.). (2014). The Oxford handbook of public accountability. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199641253.001.0001

Broughton Micova, S. (2021). What is the harm in size? Very large online platforms in the Digital Services Act [Issue paper]. Centre on Regulation in Europe. https://cerre.eu/publications/what-is-the-harm-in-size/

Bunting, M. (2018a). From editorial obligation to procedural accountability: Policy approaches to online content in the era of information intermediaries. Journal of Cyber Policy, 3(2), 165–186. https://doi.org/10.1080/23738871.2018.1519030

Bunting, M. (2018b). Keeping consumers safe online: Legislating for platform accountability for online content [Independent report]. Communications Chambers. http://static1.1.sqspcdn.com/static/f/1321365/27941308/1530714958163/Sky+Platform+Accountability+FINAL+020718+2200.pdf?token=M0yRHA6bwaAGicAaV9th9n8vAqc%3D

Cabrera Blázquez, F. J., Cappello, M., Talavera Milla, J., & Valais, S. (2022). Governance and independence of public service media (IRIS Plus, p. 130). European Audiovisual Observatory. https://rm.coe.int/iris-plus-2022en1-governance-and-independence-of-public-service-media/1680a59a76

Cappello, M. (Ed.). (2021). Unravelling the Digital Services Act package (IRIS Special) [Report]. European Audiovisual Observatory. https://rm.coe.int/iris-special-2021-01en-dsa-package/1680a43e45

Cappello, M. (Ed.). (2022). Prominence of European works and of services of general interest (IRIS Special) [Report]. European Audiovisual Observatory. https://rm.coe.int/iris-special-2022-2en-prominence-of-european-works/1680aa81dc

Collins, R. (2011). Public value, the BBC and Humpty Dumpty Words – does public value management mean what it says? In K. Donders & H. Moe (Eds.), Exporting the public value test. The regulation of public broadcasters’ new media services across Europe (pp. 49–58). Nordicom, University of Gothenburg. https://www.nordicom.gu.se/en/publications/exporting-public-value-test

Council of Europe. (2012). Recommendation CM/Rec(2012)1 of the Committee of Ministers to member States on public service media governance (Adopted by the Committee of Ministers on 15 February 2012 at the 1134th meeting of the Ministers’ Deputies). Council of Europe. https://www.refworld.org/docid/506981e72.html

Council of Europe. (2021). Guidance note on the prioritisation of public interest content online adopted by the Steering Committee for Media and Information Society (CDMSI) at its 20th plenary meeting, 1-3 December 2021. Council of Europe. https://rm.coe.int/cdmsi-2021-009-guidance-note-on-the-prioritisation-of-pi-content-e-ado/1680a524c4

Council of Europe. (2022). Principles for media and communication governance—Recommendation CM/Rec(2022)11 and explanatory report. Council of Europe. https://edoc.coe.int/en/media/11117-principles-for-media-and-communication-governance-recommendation-cmrec202211-and-explanatory-report.html#

Digital Services Act. (2020). Proposal for a regulation of the European Parliament and of the Council on a single market for digital services (Digital Services Act) and amending Directive 2000/31/EC. European Parliament and Council. https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM%3A2020%3A825%3AFIN

Directive 2018/1808. (2018). Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in member states concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities. European Parliament and Council. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32018L1808

Direito-Rebollal, S., & Donders, K. (2023). Public service media as drivers of innovation: A case study analysis of policies and strategies in Spain, Ireland, and Belgium. Communications, 48(1), 43–67. https://doi.org/10.1515/commun-2021-0003

Donders, K., & Moe, H. (2011). Exporting the public value test: The regulation of public broadcasters’ new media services across Europe. Nordicom, University of Gothenburg. http://urn.kb.se/resolve?urn=urn:nbn:se:norden:org:diva-10044

Dragomir, M., & Soderstrom, A. (2022). The state of state media: A global analysis of the editorial independence of state media and an introduction of a new state media typology [Study]. Center for Media, Data and Society. https://cmds.ceu.edu/sites/cmcs.ceu.hu/files/attachment/article/2091/thestateofstatemedia.pdf

European Broadcasting Union. (n.d.). Media intelligence service (MIS). https://www.ebu.ch/media-intelligence

European Broadcasting Union. (2015). Assessing transparency: A guide to disclosing information online [Report]. European Broadcasting Union. https://www.ebu.ch/publications/research/login_only/report/assessing-transparency-a-guide-t

European Broadcasting Union. (2022). Trust in media [Report]. European Broadcasting Union. https://www.ebu.ch/publications/research/login_only/report/trust-in-media

European Commission. (2022). The strengthened code of practice on disinformation 2022 [Code]. https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation

European Commission, Directorate-General for Communications Networks, Content and Technology, Parcu, P., Brogi, E., Verza, S., Centre on Media Pluralism and Media Freedom (CMPF), CiTiP (Centre for Information Technology and Intellectual Property) of KU Leuven, Institute for Information Law of the University of Amsterdam (IViR/UvA), Vrije Universiteit Brussels (Studies in Media, Innovation and Technology, VUB SMIT), Da Costa Leite Borges, D., Carlini, R., Trevisan, M., Tambini, D., Mazzoli, E. M., Klimkiewicz, B., Broughton Micova, S., Petković, B., Rossi, M. A., Stasi, M. L., Valcke, P., ... Domazetovik, N. (2022). Study on media plurality and diversity online: Final report. Publications Office of the European Union. https://data.europa.eu/doi/10.2759/529019

European Media Freedom Act. (2022). Proposal for a regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU. European Parliament and Council. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52022PC0457

European Parliament. (2023). Proposal for a regulation establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU. https://www.europarl.europa.eu/meetdocs/2014_2019/plmrep/COMMITTEES/CULT/DV/2023/09-07/CA-EMFA_Articles_EN.pdf

Freedman, D. (2008). The politics of media policy. Polity.

Gillespie, T. (2018). Regulation of and by platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The Sage handbook of social media (pp. 254–278). SAGE Publications Ltd. https://doi.org/10.4135/9781473984066

Helberger, N. (2018). Challenging diversity—Social media platforms and a new conception of media diversity. In M. Moore & D. Tambini (Eds.), Digital dominance -The power of Google, Amazon, Facebook, and Apple (pp. 153–175). Oxford University Press.

Helberger, N. (2020). The political power of platforms: How current attempts to regulate misinformation amplify opinion power. Digital Journalism, 8(6), 842–854. https://doi.org/10.1080/21670811.2020.1773888

Helberger, N., Karppinen, K., & D’Acunto, L. (2018). Exposure diversity as a design principle for recommender systems. Information, Communication & Society, 21(2), 191–207. https://doi.org/10.1080/1369118X.2016.1271900

Helberger, N., Möller, J., & Vrijenhoek, S. (2020). Diversity by design—Diversity of content in the digital age [Discussion paper]. Department of Canadian Heritage and the Canadian Commission for UNESCO. https://www.canada.ca/en/canadian-heritage/services/diversity-content-digital-age/diversity-design.html

Irion, K., & Valcke, P. (2015). Cultural diversity in the digital age: EU competences, policies and regulations for diverse audio-visual and online content. In E. Psychogiopoulou (Ed.), Cultural governance and the European Union (pp. 75–90). Palgrave Macmillan. https://doi.org/10.1057/9781137453754_7

Jakubowicz, K. (2010). From PSB to PSM: A new promise for public service provision in the information society. In B. Klimkiewicz (Ed.), Media Freedom and pluralism: Media policy challenges in the enlarged Europe (pp. 193–227). Central European University Press. https://doi.org/10.1515/9786155211850-013

Jakubowicz, K., Ferrell Lowe, G., & Hujanen, T. (2003). Bringing public service broadcasting to account. In Broadcasting & convergence: New articulations of the public service remit (pp. 147–165). Nordicom, University of Gothenburg. http://urn.kb.se/resolve?urn=urn:nbn:se:norden:org:diva-11791

Journalism Trust Initiative. (n.d.). The journalism trust initiative. https://www.journalismtrustinitiative.org/

Journalism Trust Initiative. (2019). CEN workshop agreement (Agreement CWA 17493:2019 E). European Committee for Standardization. https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa17493.pdf

Kettemann, M. C. (2020). The normative order of the internet: A theory of rule and regulation online. Oxford University Press. https://leibniz-hbi.de/uploads/media/default/cms/media/ijp5yvb_Kettemann_The-Normative-Order-of-the-Internet.pdf

KommAustria. (n.d.). Kommunikationsplattformen [Communication platforms]. https://www.rtr.at/medien/service/verzeichnisse/plattformen/Verzeichnis_Kommunikationsplattform.de.html

Kommunikationsplattformen-Gesetz. (2020). Federal Act on measures to protect users on communication platforms (Communication Platforms Act) (StF: BGBl. I Nr. 151/2020 (NR: GP XXVII RV 463 AB 509 S. 69. BR: 10457 AB 10486 S. 917.)). Federal Chancellery of Austria. https://www.ris.bka.gv.at/Dokumente/Erv/ERV_2020_1_151/ERV_2020_1_151.html

Lada, A., Wang, M., & Yan, T. (2021, January 26). How does news feed predict what you want to see? Meta Newsroom. https://about.fb.com/news/2021/01/how-does-news-feed-predict-what-you-want-to-see/

Lindberg, S. I. (2013). Mapping accountability: Core concept and subtypes. International Review of Administrative Sciences, 79(2), 202–226. https://doi.org/10.1177/0020852313477761

Lowe, G. F., & Martin, F. (Eds.). (2014). The value of public service media. Nordicom, University of Gothenburg. https://norden.diva-portal.org/smash/record.jsf?pid=diva2%3A1534707&dswid=424

Mansbridge, J. (2014). A contingency theory of accountability. In M. Bovens, R. E. Goodin, & T. Schillemans (Eds.), The Oxford handbook of public accountability (pp. 55–68). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199641253.013.0019

Mazzoli, E. M. (2020). Online content governance: Towards a framework for analysis for prominence and discoverability. Journal of Digital Media & Policy, 11(3), 301–319. https://doi.org/10.1386/jdmp_00027_1

Mazzoli, E. M. (2021). A comparative lens on prominence regulation and its implications for media pluralism. TPRC49: The 49th Research Conference on Communication, Information and Internet Policy. https://doi.org/10.2139/ssrn.3898474

Mazzoli, E. M., & Tambini, D. (2020). Prioritisation uncovered: Discoverability of public interest content online (Study DGI(2020)19). Council of Europe. https://rm.coe.int/publication-content-prioritisation-report/1680a07a57

Mazzucato, M., Conway, R., Mazzoli, E., Knoll, E., & Albala, S. (2020). Creating and measuring dynamic public value at the BBC (Policy Report IIPP WP 2020-19). UCL Institute for Innovation and Public Purpose. https://www.ucl.ac.uk/bartlett/public-purpose/publications/2020/dec/creating-and-measuring-dynamic-public-value-bbc

Medienstaatsvertrag. (2020). Medienstaatsvertrag (MStV) in der Fassung des Dritten Staatsvertrags zur Änderung medienrechtlicher Staatsverträge (Dritter Medienänderungsstaatsvertrag) in Kraft seit 01. Juli 2023 [Media state treaty (MStV) in the version of the third state treaty amending state treaties under media law (Third Media Amendment Treaty) in force since July 1, 2023] (GVBl. S. 450, 451; p. 451). die medienanstalten. https://www.die-medienanstalten.de/fileadmin/user_upload/Rechtsgrundlagen/Gesetze_Staatsvertraege/Medienstaatsvertrag_MStV.pdf

Meta. (Retrieved 2021a). Corporate human rights policy [Policy statement]. Meta. https://about.fb.com/wp-content/uploads/2021/04/Facebooks-Corporate-Human-Rights-Policy.pdf

Meta. (Retrieved 2022a). Our approach to newsworthy content. Transparency Center. https://transparency.fb.com/en-gb/features/approach-to-newsworthy-content/

Meta. (Retrieved 2022b). Our approach to ranking. Transparency Center. https://transparency.fb.com/en-gb/features/ranking-and-content/

Meta. (Retrieved 2022c). Widely viewed content report: What people see on Facebook. Transparency Center. https://transparency.fb.com/data/widely-viewed-content-report/

Meta. (Retrieved 2022d). What are recommendations on Facebook? Facebook Help Center. https://www.facebook.com/help/1257205004624246/

Meta. (Retrieved 2022e). What are recommendations on Instagram? Instagram Help Center. https://help.instagram.com/313829416281232/

Milosavljević, M., & Broughton Micova, S. (2016). Banning, blocking and boosting: Twitter’s solo-regulation of expression. Medijske Studije, 7(13), 43–58. https://doi.org/10.20901/ms.7.13.3

Milosavljević, M., & Poler, M. (2019). Legal analysis in media policy research. In H. Van den Bulck, M. Puppis, K. Donders, & L. Van Audenhove (Eds.), The Palgrave handbook of methods for media policy research (pp. 519–539). Springer International Publishing. https://doi.org/10.1007/978-3-030-16065-4_30

Napoli, P. M. (2011). Exposure diversity reconsidered. Journal of Information Policy, 1, 246–259. https://doi.org/10.5325/jinfopoli.1.2011.0246

ORF-G. (1984). Federal Act on the Austrian Broadcasting Corporation (ORF Act) (StF: BGBl. Nr. 379/1984 (WV) idF BGBl. Nr. 612/1986 (DFB) und BGBl. I Nr. 194/1999 (DFB)). National Council of Austria. https://www.ris.bka.gv.at/Dokumente/Erv/ERV_1984_379/ERV_1984_379.pdf

Österreichischer Rundfunk. (n.d.a). General information. https://der.orf.at/unternehmen/austrian-broadcasting-corporation/index.html

Österreichischer Rundfunk. (n.d.b). Digital platform. https://zukunft.orf.at/show_content.php?sid=182

Österreichischer Rundfunk. (n.d.c). Order verification. https://zukunft.orf.at/show_content2.php?s2id=183

Österreichischer Rundfunk. (n.d.d). Public value report. https://zukunft.orf.at/show_content2.php?s2id=576

Österreichischer Rundfunk. (n.d.e). Public value archive. https://zukunft.orf.at/show_content.php?sid=150&language=en

Österreichischer Rundfunk. (n.d.f). DialogForum. https://zukunft.orf.at/show_content.php?sid=145&language=en

Österreichischer Rundfunk. (n.d.g). Next Generation. https://zukunft.orf.at/show_content.php?sid=155&language=en

Österreichischer Rundfunk. (n.d.h). Public value quality dimensions. https://zukunft.orf.at/show_content.php?language=en&sid=134

Österreichischer Rundfunk. (2022). ORF – Jahresbericht 2021 [ORF Annual Report 2021] [Report]. ORF. https://zukunft.orf.at/rte/upload/2022/veroeffentlichungen/jb_2021_final.pdf

Österreichischer Rundfunk. (2023, August 7). Die Vertrauenskrise als Auftrag [The crisis of confidence as a mission]. https://oe1.orf.at/artikel/704023/Die-Vertrauenskrise-als-Auftrag

Pirkova, E., Kettemann, M. C., Wisniak, M., Scheinin, M., Bevensee, E., Pentney, K., Woods, L., Heitz, L., Kostic, B., Rozgonyi, K., Sargeant, H., Haas, J., & Joler, V. (2021). Spotlight on artificial intelligence and freedom of expression: A policy manual (p. 101) [Policy manual]. OSCE Office of the Representative on Freedom of the Media. https://www.osce.org/files/f/documents/8/f/510332.pdf

Puppis, M. (2010). Media governance: A new concept for the analysis of media policy and regulation. Communication, Culture & Critique, 3(2), 134–149. https://doi.org/10.1111/j.1753-9137.2010.01063.x

Puppis, M., & Van Den Bulck, H. (2019). Doing media policy research. In H. Van Den Bulck, M. Puppis, K. Donders, & L. Van Audenhove (Eds.), The Palgrave handbook of methods for media policy research (pp. 23–49). Springer International Publishing. https://doi.org/10.1007/978-3-030-16065-4_2

Ranking Digital Rights. (n.d.). 2020 indicators. https://rankingdigitalrights.org/2020-indicators/#F12

Ranking Digital Rights. (2022). Methods and standards. https://rankingdigitalrights.org/methods-and-standards/

Regulation 2022/2065. (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act). European Parliament and Council. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32022R2065

Reuters Institute, Newman, N., Fletcher, R., Robertson, C. T., Eddy, K., & Nielsen, R. K. (2022). Reuters Institute: Digital news report 2022 (p. 164) [Report]. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2022-06/Digital_News-Report_2022.pdf

Rhodes, R. A. W. (2007). Understanding governance: Policy networks, governance, reflexivity and accountability. Organization Studies, 28(8), 1–22. https://doi.org/10.1177/0170840607076586

Rozgonyi, K. (2020). Disinformation online: Potential legal and regulatory ramifications to the right to free elections – policy position paper. In F. Loizides, M. Winckler, & U. Chatterjee (Eds.), Human computer interaction and emerging technologies: Adjunct proceedings from the INTERACT 2019 workshops (pp. 57–66). Ubiquity Press. https://doi.org/10.18573/book3.g

Saurwein, F. (2019). Emerging structures of control for algorithms on the Internet: Distributed agency – distributed accountability. In Media accountability in the era of posttruth politics (pp. 196–211). Routledge. https://www.taylorfrancis.com/chapters/edit/10.4324/9781351115780-13/emerging-structures-control-algorithms-internet-florian-saurwein?context=ubx&refId=7d051112-b7d5-4428-872d-339930ad6a1c

Saurwein, F., & Spencer-Smith, C. (2021). Automated trouble: The role of algorithmic selection in harms on social media platforms. Media and Communication, 9(4), 222–233. https://doi.org/10.17645/mac.v9i4.4062

Schulz, A., Levy, D. A., & Nielsen, R. (2019). Old, educated, and politically diverse: The audience of public service news [Report]. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2019-09/The_audience_of_public_service_news_FINAL.pdf

Seethaler, J., & Beaufort, M. (2022). Monitoring media pluralism in the digital era: Application of the Media Pluralism Monitor In the European Union, Albania, Montenegro, the Republic of North Macedonia, Serbia and Turkey in the year 2021. Country report: Austria [Technical report]. European University Institute. https://doi.org/10.2870/156065

Sehl, A. (2020). Public service media in a digital media environment: Performance from an audience perspective. Media and Communication, 8(3), 359–372. https://doi.org/10.17645/mac.v8i3.3141

Sorsa, K. (2019). Public value and shared value through the delivery of accountability. In Media accountability in the era of post-truth politics (pp. 135–149). Routledge. https://www.taylorfrancis.com/chapters/edit/10.4324/9781351115780-9/public-value-shared-value-delivery-accountability-kaisa-sorsa

TikTok. (Retrieved 2020). How TikTok recommends videos #ForYou. TikTok Newsroom. https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you

TikTok. (Retrieved 2022a). TikTok’s Community Guidelines. TikTok. https://www.tiktok.com/community-guidelines?lang=en

TikTok. (2022b). Community guidelines enforcement report [Report]. https://www.tiktok.com/transparency/en-gb/community-guidelines-enforcement-2022-1/

Trappel, J. (2016). Taking the public service remit forward across the digital boundary. International Journal of Digital Television, 7(3), 273–295. https://doi.org/10.1386/jdtv.7.3.273_1

Twitter. (Retrieved 2022a). About Twitter lists. Twitter Help. https://help.twitter.com/en/using-twitter/twitter-lists

Twitter. (Retrieved 2022b). Our range of enforcement options for violations. Twitter Help. https://help.twitter.com/en/rules-and-policies/enforcement-options

Twitter. (Retrieved 2022c). Public-interest exceptions to enforcement of Twitter rules. Twitter Help. https://help.twitter.com/en/rules-and-policies/public-interest

Twitter. (Retrieved 2022d). The Twitter rules: Safety, privacy, authenticity, and more. Twitter Help. https://help.twitter.com/en/rules-and-policies/twitter-rules

Twitter. (Retrieved 2022e). Twitter transparency center. https://transparency.twitter.com/en.html

Urbán, Á., Polyák, G., & Horváth, K. (2023). How public service media disinformation shapes Hungarian public discourse. Media and Communication, 11(4). https://doi.org/10.17645/mac.v11i4.7148

Van den Bulck, H. (2015). Public service media accountability in recent decades: A progressive shift from state to market. In K. Arriaza Ibarra, E. Nowak, & R. Kuhn (Eds.), Public service media in Europe: A comparative approach. Routledge. https://www.taylorfrancis.com/chapters/edit/10.4324/9781315722290-7/public-service-media-accountability-recent-decades-hilde-van-den-bulck?context=ubx&refId=7974b3f8-d24a-478b-b2b2-ac35cf11ca61

Footnotes

1. According to Recital 25 of the AVMSD: “Directive 2010/13/EU is without prejudice to the ability of Member States to impose obligations to ensure the appropriate prominence of content of general interest under defined general interest objectives such as media pluralism, freedom of speech and cultural diversity. Such obligations should only be imposed where they are necessary to meet general interest objectives clearly defined by Member States in accordance with Union law. Where Member States decide to impose rules on appropriate prominence, they should only impose proportionate obligations on undertakings in the interests of legitimate public policy considerations.”

2. See the ‘Grundversorgung', the "basic provision of information and opinion" function that the German Constitutional Court assigned to public broadcasters as a constitutional mandate in the Fourth Broadcasting Decision (Decision of the German Federal Constitutional Court of 3 June 1986; 73 BVerfGE at 157).

3. “In December 2021, 82,8% of the Austrian online population used the ORF.at-network, this is more than 5,6 million users per month. 138,9 million visits per month in 2021 make ORF.at by far the most successful Austrian news website.” (ORF, n.d.a)

4. See the ORF Public Value ‘Transform’ Report Series (ORF, n.d.b) for more information.

5. Twitter, now X, was the brand name in use at the time of the sampling and analyses in 2022.

6. According to Austria's national regulatory authority (NRA), KommAustria, 10 platform providers were operating 11 platforms subject to Austrian jurisdiction (KommAustria, n.d.).

7. See ORF (n.d.c) for a detailed overview of legal compliance by the ORF in distributing public value content online.

8. The ORF disclosed four platforms as its main social media distribution channels (ORF, n.d.d).

9. This interpretation was reinforced by the consistent jurisprudence of the European Court of Human Rights (ECtHR), which holds that the state is the “ultimate guarantor” of pluralism, given the fundamental role of freedom of expression in a democratic society. See Informationsverein Lentia and Others v. Austria, Application nos. 13914/88; 15041/89; 15717/89; 15779/89; 17207/90, 24 November 1993, para. 38.

10. For more information, see the EBU Media Intelligence services (EBU, n.d.).

11. For more information, see the PVR archive (ORF, n.d.e).

12. For more information, see ORF (n.d.f).

13. For more information, see ORF (n.d.g).

14. For more information about strategic “quality checks”, see the latest statement by the ORF Director General (ORF, 2023).

15. If the ORF wishes to launch online services (such as broadcast-content-related daily news overviews and the like), it needs to obtain a ‘special mandate' (ORF-G § 4e). First, the ORF has to provide a service concept (§ 5a), which might be subject to prior approval by the regulator, KommAustria (§ 6 to § 6b).