Framing the role of experts in platform governance: Negotiating the code of practice on disinformation as a case study

Kateryna Chystoforova, European University Institute, Florence, Italy
Urbano Reviglio, European University Institute, Florence, Italy

PUBLISHED ON: 31 Mar 2025 DOI: 10.14763/2025.1.1823

Abstract

Given the growing complexity of platform governance, the role of experts in digital policy-making processes is becoming more critical. However, there remains a gap in the literature regarding the specific role of experts in shaping platform governance processes. To explore this research gap, this study examines the perspectives of experts involved in the Code of Practice on Disinformation, a co-regulatory governance mechanism established by the European Union. Through a total of 41 semi-structured interviews with disinformation and policy experts, we analyse their perceptions of their role in shaping this process. We found ambivalent opinions on the efficacy of the Code and widespread distrust of platforms, particularly regarding content moderation and data sharing practices. Our analysis also reveals key limitations and challenges involved in platform governance, such as systemic resource asymmetries. The article calls for further research to deepen our understanding of the pivotal role experts play in platform governance and content moderation, in order to identify strategies for enhancing their impact on digital policymaking.

Citation & publishing information
Published: March 31, 2025
Licence: Creative Commons Attribution 3.0 Germany
Funding: The authors did not receive any funding for this research.
Competing interests: The authors have declared that no competing interests exist that have influenced the text.
Keywords: Content moderation, Platform governance, Co-regulation, Policymaking, Experts
Citation: Chystoforova, K., & Reviglio, U. (2025). Framing the role of experts in platform governance: Negotiating the code of practice on disinformation as a case study. Internet Policy Review, 14(1). https://doi.org/10.14763/2025.1.1823

This paper is part of Content moderation on digital platforms: beyond states and firms, a special issue of Internet Policy Review guest-edited by Romain Badouard and Anne Bellon.

Introduction

In the dynamic realm of digital governance, the European Union (EU) has positioned itself as a leading force, proactively regulating the digital sphere to tackle societal challenges stemming from the growing dominance of online platforms. Specifically, the issues of content moderation on Very Large Online Platforms (VLOPs) and the fight against disinformation have risen to prominence as primary concerns for European policymakers. Considering the outsize influence of the platforms and the many stakeholders who are directly involved, the governance regime developed along the lines of “cooperative responsibility” (Helberger et al., 2018). However, as internet governance and online content moderation have historically taken a predominantly private, self-regulatory form, instituting such “cooperative responsibility”, or “co-regulation”, has been challenging.

Given the rapid advancement and complexity of the socio-technical systems to be governed, one group of stakeholders on which the European Commission has leaned to move the needle towards co-regulation in the digital sphere is experts. While this term is notoriously broad and elusive, in this context we focus specifically on academic, industry and civil society experts involved in the effort to influence and shape how policymakers view and respond to content moderation policy and enforcement challenges on platforms (we elaborate further on this in Section 1). More generally, the article delves into the crucial role of experts and expert knowledge within this dynamic policy landscape, examining their involvement in the development, implementation, monitoring, and evaluation of digital policies. This is done using the Code of Practice on Disinformation as a case study (hereafter referred to as ‘the Code’). First, we conducted 17 semi-structured interviews with participants of the Expert Group on Structural Indicators for the Code of Practice on Disinformation coordinated by the European Digital Media Observatory (EDMO) (Chystoforova & Reviglio, 2024). This investigation was intended to survey these experts’ perceptions specifically of the co-regulatory development of the “Structural Indicators” of the Code, discussing how to improve them. Following this, we expanded our research questions and conducted 24 additional interviews on the self-perception of experts involved in the Code and other digital policy processes. The objective of this investigation was to gain a deeper understanding of experts’ assessments of the Code and, more broadly, platform governance, while also exploring their self-perceptions and opinions on their role in the decision-making process.

Our investigation attempts to fulfil several objectives vis-à-vis the current state of play in research. While we broadly look at the role of experts in policy-making processes and how expert knowledge is translated into policy, we aim to extend the current debates on the role of experts to the digital policy realm in the EU, and to platform governance and content moderation in particular. This is a dynamic policy area that has also devoted a special place to experts within the dominant co-regulatory approach. Experts are called upon to contribute knowledge, or sometimes to confirm the Commission’s course, but they also emerge as actors in their own right, with specific goals, agendas and expectations from this engagement. To this end, we have attempted to discern the perceived and actual role of experts involved in platform co-regulatory processes at the EU level by looking at how Code-adjacent experts see their role, and how they relate to the other stakeholders in the process – the Commission and the platforms. The analysis shows that even when government institutions try to mandate a more inclusive, multi-stakeholder co-regulatory regime, the “softness” of this approach means that cooperation can become ineffectual and surface-level, necessitating a more hands-on form of co-regulation. Our final goal is to shed light on potential improvements that European regulators can take into consideration in transferring and producing expert knowledge for policy-making.

Section 1: Setting the scene: context, definitions and methods

The following section first contextualises the role of experts in policy-making, highlighting the emergence of evidence-based policy-making and the specific characteristics of platform governance and content moderation. It then outlines the peculiarities of the EU's Code as a case study, stressing the collaborative framework involving experts in co-regulation. Finally, it defines "experts" within this context, describing their roles and the methodology employed to gather and analyse the interviews.

An overview of the role of experts in digital policymaking

The role of experts in policymaking has received a lot of attention from academia in recent years, in fields as wide-ranging as environmental (Valin & Huitema, 2023), immigration (Boswell, 2009), and educational (Volmari, 2022) policy. While not exclusively associated with it, the growing interest in the intersections between expertise and policy can be tied to the push for more evidence-based policymaking (EBP), echoed both by academia and by policymakers themselves, and increasingly institutionalised in recent years (European Commission, 2022). The study of the role of experts is not limited to EBP, though; several streams in the literature engage in “fragmented, parallel monologues” on expertise in policy, including science and technology studies, epistemic communities, and knowledge utilisation/mobilisation (Christensen, 2021).

The role of experts in digital policy processes, however, has received relatively little attention, with Metwally’s work (2022) a notable exception. This may be explained in many ways: the relative novelty of platform governance as a research field, the elusiveness of the terms ‘expert’ and ‘expertise’, and the challenges of obtaining interviews and of participating in closed-door policy events, especially in self-regulation, which does not have the same transparency requirements as traditional, state-led policy. The literature broadly acknowledges that experts, especially technical experts coming from ICT industries, have played a role in establishing digital and internet policy in the past, and continue playing one today (An & Yoo, 2019; Mărcuţ, 2020). Some literature exists on the interactions between different stakeholders – including civil society organisations, tech companies, and policymakers – in self-regulation and standard setting (Harcourt et al., 2020). The need to insert expertise into the policymaking process is often tied to highly technical policy areas or to attempts to resolve particularly complex issues (Spruijt et al., 2014). This makes all the more glaring the current gap in the literature on the role of experts in digital policy, and in platform governance and content moderation in particular. Not only can these areas be highly technical and exhibit the characteristics of “wicked” problems, but they also show unique features.

On the one hand, platform governance has emerged as a peculiar form of governance (Gillespie, 2018; Gorwa, 2019). It requires not only an understanding of technical systems and platforms, but also the ability to legally and technically challenge the powerful companies that control them – fundamentally political actors that engineer the global infrastructure of free expression. Platform governance includes not only users and platforms, but also a vast array of subjects such as political actors (including various branches of governments), other stakeholders and advocacy groups (non-governmental privacy and digital rights groups, academics and researchers, and investigative journalists), as well as other companies that participate in the platform’s ecosystem (e.g., advertisers, developers, and other parties).

On the other hand, content moderation is an integral part of the online experience on the platformised internet. In very simple terms, it is the act of managing content published on a platform according to pre-set rules and criteria – which might stem from the platform’s own terms of service, local regulations, or even international law (Roberts, 2017). Essentially a sub-type of platform governance, content moderation has also become a privatised form of government which is ultimately responsible for balancing human rights and protecting free speech (Douek, 2022). This complex endeavour necessitates a diverse range of expertise, including algorithmic detection analysis, legal knowledge on compliance and human rights, ethical considerations in decision-making, and an understanding of user behaviour. This may also explain why expert engagement, in this context, is increasingly mandated by law (Metwally, 2022).

The Code’s peculiarities

While the policy approach in this area may appear no different from the conventional approach of EU institutions, which tries to incorporate evidence and expertise at different stages of the policy cycle, what makes the case of expert involvement in platform governance particularly interesting is the special position of experts, notably academic and civil society researchers, within the co-regulatory framework. In this scenario, the regulator not only leans on experts’ knowledge for policy development, but also shares with them some of the responsibility for its enforcement through the expansive provisions on access to data for researchers.

The EU’s Code of Practice on Disinformation is a paradigmatic example of this approach. It is one of the longest-running examples of cooperative content moderation governance in the EU, lending it paradigmatic status – it sets the standard for other such instruments and similar approaches going forward and highlights general characteristics of this mode of governance (Flyvbjerg & Sampson, 2001). Developed as a self-regulatory tool in 2018 by a group of Signatories, including Very Large Online Platforms, other tech firms, fact-checkers and civil society organisations, under the guidance of the European Commission, its purpose was to set common rules and standards for the content moderation of disinformation in the EU. In addition to self-reporting from platform companies, however, the Code foresees a more holistic monitoring of the state of disinformation in the EU through the development of Structural Indicators (SIs) (Commitment 41). These SIs are meant to complement the self-reporting by setting a baseline and then continuously assessing aspects of disinformation: for example, its prevalence, the monitoring of problematic sources, audience behaviours, and platforms’ attempts to demonetise those who profit from spreading it (Nenadic et al., 2023). Successful monitoring based on the SIs hinges on another Commitment under the Code – the Signatories providing access to data for researchers, who can then independently assess the relevant aspects of the platforms’ work and continuously evaluate the effectiveness of the Code as a policy instrument. The existence of these multi-stakeholder monitoring mechanisms, as well as the significant level of involvement of the European Commission as the de facto convenor of the Code, thus contributes to the status of the Code as more of a co-regulatory device than a simple self-regulatory instrument.
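To make the logic of a prevalence-style Structural Indicator concrete, the following Python sketch is a minimal, purely hypothetical illustration. It is not the methodology adopted under Commitment 41: the data structure, the fact-checking labels and the impression-weighting are our illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A sampled platform post with an independent fact-checking label."""
    is_disinformation: bool  # label assigned by fact-checkers, not the platform
    impressions: int         # number of times the post was displayed

def prevalence_indicator(sample: list[Post]) -> float:
    """Share of total impressions attributable to disinformation.

    Weighting by impressions rather than counting posts reflects the
    idea that an indicator should capture audience exposure, not just
    the volume of problematic content.
    """
    total = sum(p.impressions for p in sample)
    if total == 0:
        return 0.0
    flagged = sum(p.impressions for p in sample if p.is_disinformation)
    return flagged / total

# Fabricated numbers, for illustration only:
sample = [Post(False, 9_000), Post(True, 500), Post(False, 400), Post(True, 100)]
print(f"Prevalence of disinformation: {prevalence_indicator(sample):.1%}")  # 6.0%
```

Crucially, any such computation presupposes that researchers can draw and label the sample independently of the platforms – which is precisely why the data access Commitment is the linchpin of the monitoring framework.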

The Code retains vestiges of an ex-post approach to content moderation governance through, for example, Commitments related to user appeal mechanisms. More importantly, however, especially in the parts on Structural Indicators and access to data, we can observe one of the first comprehensive attempts to establish an ex-ante “structured and ongoing oversight regime” (Douek, 2022), one which recognises the systemic issues in platform design that lead to the dissemination of disinformation, and the fact that the range of actors involved in content moderation decision-making is far broader than front-line moderators, or even just the platforms themselves, by carving out rules for the engagement of users, fact-checkers, researchers and regulators. The issue is that, while the ex-post approach is narrow and thus, at least in theory, enforceable, the broader ex-ante content governance regime can be so all-encompassing as to become completely ineffectual. However, this argument belies another possible reason behind the platforms’ opposition to full-blown ex-ante content governance: if fully implemented, it could shed light on the parts of their business model they would rather conceal, e.g., its emphasis on virality and polarisation, and lead to the eventual dismantling of the entire “surveillance capitalist” mode of production (Zuboff, 2019). All in all, platforms vehemently oppose the institutionalisation of genuine ex-ante oversight every step of the way – and given that the platforms alone possess exclusive knowledge of how their algorithms currently work, policy-makers would be at a disadvantage when trying to craft any type of ex-ante governance regime. As such, the Code represents an attempt to reach a “middle ground” between platforms’ preference for ex-post self-regulation and the growing demands for more robust and systemic solutions to the problems stemming from platform operations.

The murky middle ground between self- and co-regulation, as mentioned above, is another significant peculiarity of the Code. Of course, even if self-regulation and co-regulation set and enforce different regulatory goals, standards and justifications, they operate on a spectrum (Finck, 2018). Moreover, while platforms are already self-regulating entities, as they determine their own terms and conditions, digital regulation naturally requires some form of enforced self-regulation or co-regulation, which can take the form of voluntary codes of conduct and negotiated self-regulatory agreements. The state, in this case, is largely limited to an informal oversight and steering role. The initial Code was an example of such a negotiated self-regulatory agreement, developed with significant input from the European Commission in terms of kickstarting the process and mandating its establishment. However, the revision of the Code in 2022, which broadened the group of Signatories to the Code and, most importantly, included a clause on the potential evolution into a Code of Conduct under Art. 45 of the Digital Services Act (DSA), has brought this instrument firmly into the realm of co-regulation. By aligning the Code concretely with “hard” regulation like the DSA, this move foreshadowed a broader trend toward more explicit government involvement in regulatory processes concerning online content moderation (Brogi & De Gregorio, 2024). As such, within this paper, we treat the Code as a de facto (soon-to-be de jure) co-regulatory instrument, while recognising its self-regulatory roots.

The conceptual approach

Against this backdrop, it is fundamental to define what we mean by “experts” and situate their role within content moderation and platform governance. An “expert” in the context of policy-making encompasses individuals or entities possessing specialised knowledge relevant to decision-making. This expertise may be derived from scientific research, professional experience, or the representation of specific groups’ experiences (Christensen et al., 2023; Grundmann, 2017; Robert, 2010). Experts are characterised by their ability to apply knowledge to new problems, by their perception as trusted and independent information sources (Grundmann, 2017), and by their relational role in policymaking. They may act as informants, knowledge brokers, or issue advocates, mediating their knowledge to inform or influence policy decisions (Lewis et al., 2023; Pielke, 2007). This broad definition aligns with the European Commission’s criteria, which include individual experts (often academics), stakeholder group representatives, organisations (e.g., companies, NGOs), national authorities, and other public bodies (European Commission, 2016).

Before we delve into the discussion, it is important to bridge the gap between the studies of the role of expertise in policy-making and platform governance, to understand who, in particular, the experts are in the content moderation governance of the EU. Metwally (2022) has put forward four broad yet meaningful categories that situate experts based on their positionality within platform governance and content moderation: (i) outside experts – paid third-party experts often deemed neutral while frequently lacking transparency; (ii) internal experts – the insider engineering and policy teams; (iii) trust & safety experts – professionalised experts from the corresponding emerging industry, often composed of former social media employees; (iv) civil society – experts from non-governmental organisations and advocacy groups, which have emerged as increasingly powerful expert bodies. These categories, of course, are not uniform, yet they highlight different backgrounds, worldviews, and goals. Ultimately, all these experts remain “engaged in an effort to influence, shape, and thereby control, how the decision makers view and respond to content policy and enforcement challenges on the platforms” (Metwally, 2022, p. 5).

For this paper, however, it is important to make further distinctions. While the groups that are identified by Metwally are all present in Europe, not all of them are actively engaging in the EU platform governance process, at least not through the formal policy formulation and implementation channels. This could be explained by the peculiarities of the European Commission’s policy-making process, which traditionally seeks expertise from academia, civil society, consumer groups and industry (European Commission, 2016), as well as the particular frame of reference of Metwally’s study, which explored experts in platforms’ internal and industry self-regulation, rather than broader regulatory frameworks. In our study of expert involvement in the Code of Practice, we have found four main groups that have engaged in this process directly:

(i) Internal experts – official representatives of the platforms, usually policy managers and lobbyists, who do not possess in-depth technical or trust-and-safety knowledge, but act as public-facing conduits and translators of the platforms’ vast knowledge and expertise.

(ii) Academics and researchers – individual experts who possess in-depth subject matter knowledge on issues relevant to the Code (such as disinformation, audience studies, media policy, etc.).

(iii) Industry stakeholders – representatives of industries and sectors affected by the operations of the platforms or who have a stake in platform operations (e.g., media industry), and who possess sector-specific knowledge about the broader effects of platformisation and its implications for the spread of disinformation.

(iv) Civil society – representatives of a broad range of organisations, such as advocacy groups, NGOs, fact-checkers, who have other relevant expertise on disinformation, digital and consumer rights.

In this research, we have focused only on the latter three groups – academics, industry and civil society experts – as we wanted to explore how external and, at least theoretically, independent experts try to affect the platforms’ content moderation by way of EU digital policy. In the discussion section, we further elaborate on the changing face of expertise in platform governance in the EU, given the growing formalisation and importance of this policy sphere.

Methodology and data collection

Interviews are a widely used method for examining the relationships between social actors, their beliefs, and motivations, as they allow for in-depth exploration of individual perspectives and contextual factors (Yin, 2018). Given the study’s focus on understanding the roles and perceptions of experts within the policy process, interviews provided a suitable approach to capture nuanced insights that might not be accessible through other methods.

Given the lack of granular analysis on the subject of this case study, as well as the opaque and black-box nature of the policy process, the choice of a blend of expert/elite interviews follows naturally. While all interviewees possessed specialised expertise relevant to the policy field, our primary focus was not solely on their technical knowledge but on their strategic positions within the Code of Practice on Disinformation, which enabled them to exert influence over the policy process to varying extents (Harvey, 2011). This positioning, and our intention to study the self-perceptions of their own positioning and role in the policy process (Van Audenhove & Donders, 2019), supports their classification as "elite" interviewees. At the same time, the distinction between expert and elite interviews is widely regarded as overlapping (Littig, 2009). In our study, we consider our methodological approach to be a blend of both, as we sought not only to understand the role of experts within this policy process but also to leverage their topical knowledge to gain broader insights into content moderation policy at large.

The interviews were semi-structured, to provide a degree of comparability of findings while also staying true to the exploratory nature of the research. In total, we conducted 41 interviews between 13 June 2023 and 10 May 2024. The interview guide is included in Appendix A.

The selection of interview participants was limited to people who are currently, or were in the past, involved as “experts” in the process of development, negotiation, and/or monitoring of the Code of Practice on Disinformation. Namely, they included members of the High-Level Expert Group (HLEG) on Fake News, members of advisory bodies and other initiatives involved in the Code, and representatives of the Commission. The interviews were held in two stages. This study emerged from the authors’ work on improving the monitoring of the Code of Practice through Structural Indicators (Chystoforova & Reviglio, 2024), for which we conducted 17 interviews with members of the Expert Group on Structural Indicators to collect their feedback on the initial proposal. These interviews, though specific to the monitoring of the Code, lent very interesting insights into the Code as a content moderation governance instrument and the role of experts within it. This was the first, exploratory stage of interviews; based upon them, we formed more particular research questions expanding on the already collected insights. Later, we conducted an additional 24 interviews, this time following the interview guide in Appendix A. The participants included a mix of past interviewees who belonged to the Expert Group on Structural Indicators and other experts who were involved at different stages in the HLEG, which put forward the idea of the Code of Practice in 2018; the so-called “Sounding Board” – a group of independent experts who provided feedback to the platform signatories during the development of the original Code; and representatives of EDMO, who have played an active role in the development and monitoring of the Code since EDMO’s inception in 2020.

Interviews were conducted on the basis of availability and participant willingness, recognising that access to individuals involved in high-level policy negotiations is challenging. Although this sampling is non-random, it provides valuable insights by surveying the perspectives of a unique set of well-informed and influential actors. This approach, while limited in scope, is aligned with the study’s aim of gathering expert insights that are otherwise difficult to obtain. Indeed, semi-structured interviews were chosen to provide a flexible yet focused exploration of the experts’ views. Each interview lasted between 30 and 60 minutes and was conducted via online platforms. The full list of interviews is available in Appendix B.

The first batch of interview transcripts was analysed and coded inductively in order to map common emerging themes to explore, such as perceptions of the Code's effectiveness, co-regulatory challenges, and proposed enhancements to the Structural Indicators. This initial exploratory analysis informed the interview guide for the second round of interviews and partly influenced the coding framework. The second round of interviews went through several rounds of coding: first deductively, based on the first-round coding scheme, and subsequently inductively, to identify new themes.
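As a purely illustrative aid, the sketch below shows how such a hybrid deductive–inductive pass can be supported programmatically: segments are first matched against an existing codebook, and unmatched segments are set aside for inductive review. The codebook entries and keyword matching are hypothetical stand-ins; the actual coding described above was performed manually by the authors.

```python
import re

# Illustrative first-round codebook (deductive pass); the actual themes
# were derived manually from the first batch of transcripts.
CODEBOOK = {
    "code_effectiveness": ["effective", "ineffectual", "impact"],
    "co_regulation": ["co-regulation", "self-regulation", "enforcement"],
    "structural_indicators": ["structural indicator", "monitoring", "baseline"],
}

def deductive_pass(segments: list[str]) -> tuple[dict[str, list[str]], list[str]]:
    """Assign transcript segments to existing codes; collect the rest.

    Segments matching no code are returned separately as candidates
    for a subsequent inductive pass, where new themes are identified.
    """
    coded: dict[str, list[str]] = {code: [] for code in CODEBOOK}
    uncoded: list[str] = []
    for segment in segments:
        matched = False
        for code, keywords in CODEBOOK.items():
            if any(re.search(re.escape(k), segment, re.IGNORECASE) for k in keywords):
                coded[code].append(segment)
                matched = True
        if not matched:
            uncoded.append(segment)
    return coded, uncoded

segments = [
    "The Code is largely ineffectual without enforcement.",
    "Platforms dominated the expert group discussions.",
]
coded, for_inductive_review = deductive_pass(segments)
print(coded["code_effectiveness"])  # first segment (also coded under co_regulation)
print(for_inductive_review)         # second segment awaits inductive coding
```

A segment can carry several codes at once, mirroring standard qualitative coding practice, while the uncoded residue flags exactly where the inductive round needs to generate new themes.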

Section 2: Findings

In this section, we outline the main themes and ideas that emerged from the interviews; in the next section, the discussion, we elaborate further insights based on our findings.

The varied roles of experts

The interviews conducted with experts involved in the Code of Practice on Disinformation reveal a multifaceted understanding of their roles and contributions. Experts described their roles using terms such as “vessels of information,” “lobbyists,” “watchdogs,” “connecting tissues,” and “voices of common sense.”

Many experts see themselves as providers of valuable information and evidence to policymakers, playing a crucial role in informing decision-making processes. Most experts view their involvement as essential in ensuring that policies are grounded in empirical evidence and rigorous analysis. Additionally, several experts identified themselves as “connecting tissues,” facilitating knowledge exchange and collaboration between academia, civil society, and policymakers.

Some interviewees defined themselves as “information brokers” and perceived themselves as lobbyists, advocating for specific positions or interests. Others see themselves as “watchdogs”, monitoring the actions of platforms and the process of negotiations between them and the Commission, and advocating for greater transparency and accountability. The watchdog role can be combined with other self-perceptions, even that of an “information provider”. The watchdogs see their role as less “political” than direct policy advising, with experts stating that their oversight efforts were often conducted implicitly rather than through direct policy recommendations.

The self-perceptions of experts depend on their “type” in many, though not all, cases. For example, the “information broker” role is a perception more characteristic of representatives of stakeholder groups, while that of a “watchdog” is more common among civil society experts. Many experts play into these “predefined” roles, as the rules of the game in the broader EU policy process have already been set for them.

Academics are a particularly curious case. They often tend to see themselves as “vessels” of accurate information, but some do emerge as more conscious and determined “brokers” who see themselves as representing not only objective knowledge but also the academic community as an important stakeholder in the policy process.

Expertise translation and engagement strategies

Expert knowledge translation into policy-making encompasses various strategies and engagements. One key aspect is the collaborative relationship between experts and policymakers, particularly at the national level, where even those experts who act simply as information providers for EU policy-makers can turn into more engaged and empowered actors. Several experts indicated that they engaged in commissioned studies and provided independent views to ministries, which they believed influenced policy decisions directly.

Interviewees also noted their participation in initiatives such as media ethics councils, where they proactively contribute to self-regulatory measures and educate society on pertinent issues. These engagements were described as opportunities for experts to provide insights and contribute to shaping policy outcomes.

Although engagement in national policy-making is beyond this study’s scope, it is worth noting that, while many experts find it significantly easier to participate in decision-making within their own countries (as Interviewee 11 states, “on a national level, it’s really difficult not to influence [policy]”), this involvement has minimal impact on platform governance, particularly in smaller EU member states. There is a perception that these countries lack the capacity and negotiating power to effectively regulate the platforms, so they must rely on EU-level policy.

At the EU level, experts remarked on the value of coalition-building and collaboration in making policy impacts. This came more naturally to civil society experts and “lobbyists”, who engage in policy advocacy more frequently. But even academics, who generally take a more hands-off approach to policy (the “information provider” archetype), underlined the value of platforms and “knowledge brokers” that pooled their expertise and represented the broader academic community more effectively in the policy process.

Challenges and successes of expert participation

Experts also reflect on the challenges and frustrations they encounter in their roles. Some express a sense of powerlessness in the face of entrenched policy agendas or bureaucratic processes, while others lament the lack of feedback and iterative dialogue in policy-making forums. Many interviewees expressed uncertainty about their influence on policy outcomes, with some questioning the extent to which their contributions shape decision-making. While some individuals perceive a degree of efficacy in their efforts, particularly when transitioning from academia to think tanks or participating in civil society coalitions, others reported feeling constrained by the entrenched mechanisms of EU policy-making and a lack of alignment with the Commission’s current priorities.

Interviewees cited the presence of supportive policymakers and influential figures as key to the adoption of certain policies. Conversely, they noted that policy advocacy becomes more difficult in the absence of support from decision-makers and during shifts in political landscapes, necessitating strategic coalition building and engagement with diverse stakeholders. A specific example of such misalignment was provided by an expert involved in the early development of the Code of Practice on Disinformation. They expressed concerns that the Code could infringe on freedom of speech by granting public and private actors significant control over online discourse. However, these concerns conflicted with the prevailing narrative endorsed by the European Commission and the majority of stakeholders and were ultimately disregarded. As a result, they reported feeling sidelined and later withdrew from formal participation, opting instead to monitor the Code’s implementation externally.

Despite these challenges, there are instances where participants believe their contributions have made a tangible difference, such as shaping codes of practice or advocating for greater transparency and privacy. While many interviewees found it difficult to assess their overall influence, they acknowledged the value of collaborative efforts and sustained engagement in the policy-making process.

Experts’ perceptions of other stakeholders

Experts express concerns about power imbalances within policy-making processes, particularly in relation to the influence of platform companies. They argued that digital platforms exert control over research findings, as evidenced by their efforts to restrict access to data and thus influence research outcomes. Participants cite instances where platforms implement measures like screening research publications or rejecting research proposals, which are perceived as obstructive to uncovering and addressing online harms. There is a notable power disparity within expert groups, with platforms wielding significant influence and dominating discussions, often to the detriment of diverse perspectives and critical inquiry.

More broadly, the research findings reveal a pervasive lack of trust among experts towards the platforms involved in policy-making processes, particularly regarding the handling of disinformation and cooperation with regulatory bodies. Interviewees frequently characterised tech platforms as prioritising profit over public interest, with many citing ongoing struggles to encourage platforms to take more decisive action against harmful content. Other experts echo this sentiment, highlighting platforms’ dominance and resistance to aligning with public responsibility measures, particularly in collaboration with the European Union. Moreover, experts widely perceive the platforms’ commitments, including the Code of Practice, as mere public relations gestures lacking substantial action. The assertion that platforms engage in “90% bullshitting, 10% action” (Interview 8) reflects a sentiment shared by many, who view platforms’ efforts as focused more on communication and image management than on addressing underlying issues effectively. This scepticism extends to the verifiability and reliability of platform reports, with concerns raised about the lack of transparency and accountability in their self-reported data.

This scepticism also encompasses direct efforts to influence platform policies and activities related to content moderation. One expert invited to attend a Meta event questioned the purpose behind and the impact of this engagement: “It was more like PR for Meta, than they're very interested in our opinions.” (sic) (Interview 7). Overall, the prevailing sentiment among participants indicated a deep-seated scepticism regarding the platforms’ intentions and their role in the policy-making process. Some interviewees pointed to platforms’ partial compliance with the Code of Practice as evidence justifying this scepticism (Mündges & Park, 2024).

Interviewees expressed relatively fewer criticisms of the European Commission compared to the platforms. Concerns revolve around the Commission’s handling of the process, with some experts citing organisational challenges and perceived political or business capture. Some see the power imbalance not just vis-à-vis the platforms, but also vis-à-vis the Commission, which acts as a gatekeeper who gets to choose what kind of expertise is sought and who gets a seat at the table. This imbalance is reflected in the challenges faced by individuals seeking participation in influential policy-making circles, where entry may be restricted despite their expertise and willingness to contribute. The Commission’s role is viewed differently by various stakeholders, with some perceiving it as overly accommodating to platforms, while others see it as being controlled or influenced by them.

Despite all these challenges, most experts remain committed to their roles and see value in engaging with policy processes to effect change. Over and over, they stress the value of conducting thorough research and utilising empirical data to substantiate advocacy efforts, noting that sustained engagement and credibility are achieved through a commitment to informed, evidence-based discourse.

Overall, the interviews illustrate the complex and evolving role of experts in the Code of Practice on Disinformation. Experts navigate various challenges, including power imbalances and trust issues, yet they remain key contributors to policy processes. Their diverse perspectives and self-perceptions reveal a dynamic interplay between expertise and influence, shaping how they engage with policymakers and stakeholders.

Section 3: Discussion

This research investigated how experts involved in the Code consider their role in the decision-making process, as well as its effectiveness. The analysis of the interviews suggests that the engagement of experts in EU digital policy follows many of the same patterns as expert engagement in other EU policy areas, e.g. emergent themes of political/business capture, instrumentalisation and non-transparent selection processes (Boswell, 2008; Metz, 2015); difficulties in finding a common language with policy-makers, often necessitating “knowledge brokerage” (Ramot & Bialik, 2020); and difficulties in assessing the experts’ impact on final outcomes. Their stated goals also align with Metwally’s framework (2022), as they engage in norm-making and narrative setting, shaping content policies and their enforcement; except that, in our case, experts engage with content policies not by directly advocating or otherwise engaging with the platforms, but through rule-setting in the more formal venue of the Code of Practice on Disinformation. Despite the widespread distrust in platforms and towards the efficacy of the policy process highlighted in the findings, experts are still willing to contribute for various reasons – mainly individual and instrumental ones, such as networking and prestige, but also more idealistic ones, such as sharing their expertise for the common good.

In this regard, our study draws parallels with the existing literature on the role of experts in policy-making, extending it to platform governance. However, compared to other policy areas where experts are involved, the Code negotiation process proved to be somewhat peculiar – which is what we seek to explore in greater detail.

The Code of Practice on Disinformation as a policy process

The Code of Practice on Disinformation addresses the emergent challenge of disinformation, a field still in its infancy and lacking established policy solutions. This novelty, underscored by Interviewee 5's claim that “the world of counter-disinformation didn’t exist five years ago”, coincides with disinformation's ascension on the EU and global policy agendas, notably recognised by the World Economic Forum (2024) as a top global risk.

The initial rush to develop the Code in 2018, driven by the urgency to combat disinformation, resulted in vague commitments and underwhelming outcomes, as assessed by the European Commission (2020), European Regulators Group for Audiovisual Media Services (ERGA, 2020), and Nenadic (2020). Despite a 2022 revision aimed at strengthening the Code, its effectiveness remains constrained by its current non-binding nature and a lack of progress on commitments, with platform reports missing on average 64% of essential quantitative data (Park & Mündges, 2023). A critical limitation of the Code is its focus on symptoms rather than the root causes of disinformation, notably the engagement-driven algorithms of Very Large Online Platforms (VLOPs). This perspective is echoed by experts, with some advocating for a complete overhaul of the business model, while others are more reserved.

The transition to co-regulation under the Code does not fundamentally change its limitations, as commitments are still determined by profit-seeking Signatories. This, coupled with a power imbalance favouring VLOPs, restricts the potential for meaningful examination of algorithmic systems. However, the entire exercise of the Code does provide valuable insight into the changing face of expertise in platform governance, as the previously almost informal and industry-led modus operandi gives way to more formal regulation with greater involvement of the state, experts and other stakeholder groups.

It should also be noted that, while a power imbalance between experts and private stakeholders in the regulatory process is quite common in areas that concern corporate regulation or are associated with strong interest groups, those experts who engage in policy-making beyond platform governance have underlined that this imbalance is felt especially strongly vis-à-vis the platforms, compared, for example, to agricultural policy. This is a relatively new development; digital policy veterans note that prior to the General Data Protection Regulation (GDPR), Big Tech companies did not pay as much attention to European policy processes, but since then their presence and influence in Brussels has grown exponentially. These peculiarities of the Code thus provide an interesting framing through which to explore the role of experts in policy, which has so far been underexplored in the literature. In the next sections, we discuss the key findings of this study, extending the discussion of expert engagement in content moderation to the realm of platform governance.

The changing role of the expert in EU platform governance

Many “legacy” experts expressed frustration with the organisation of expert engagement in the Code process: they cited a lack of iterative feedback mechanisms in the policy-making process, expecting institutions to facilitate meaningful dialogue and collaboration among stakeholders. This is underscored by the overall lack of formal opportunities for stakeholder engagement in the Code process, compared to the normal EU policy cycle, due to its declared status as a self-regulatory mechanism. Though, as we outlined in previous sections, experts had a chance to participate at all stages of the process to varying degrees, the opportunities to contribute to thematically related regulations like the European Media Freedom Act or the DSA were much more formalised and plentiful.

In these circumstances of a seriously limited ability to engage in the policy process, knowledge brokerage emerges as a crucial mechanism for bridging the gap between experts and policymakers. Organisations such as EDMO also serve as conduits for expert knowledge, facilitating dialogue between researchers, policymakers, and civil society. Through these channels, experts provide evidence-based insights and recommendations, contributing to informed decision-making processes at both the national and EU levels. This underscores the importance of such intermediary bodies in translating expert knowledge into actionable policy outcomes.

Which expertise for policy?

The selection of experts and the transparency of this process are critical components in ensuring the effectiveness and legitimacy of policy-making efforts, particularly in areas as complex and rapidly evolving as platform governance.

Though experts, particularly computer scientists and technologists, have always been an integral part of the multi-stakeholder governance of the internet (An & Yoo, 2019; Price & Verhulst, 2005), this newest thread of expert involvement in technology policy is markedly different. Civil society and freedom of speech advocates, and scientists and academics for whom the internet is a field of study (e.g., technology ethicists) or a source of data for their work (e.g., media scholars), are the most obvious additions – they contribute expertise not on how the internet or platforms work technically, but on what implications this technology has for society as a whole. Other groups, from consumer protection organisations to industry associations, at least at the European level, are considered to be providing “expertise” relevant to their particular sector’s interactions with platforms.

This is broadly reflected in the Code’s patterns of expert engagement too, but with some unexpected ramifications. Compared to the early days of internet governance, the situation is flipped. There is a consensus among the interviewees that policymakers often prioritise engagement with individuals from policy or social science backgrounds, potentially overlooking the valuable insights that technologists and data scientists can offer.

This overrepresentation of “policy people” raises concerns about the comprehensiveness and efficacy of policy solutions, particularly in addressing technical challenges like disinformation and algorithmic governance. It is especially glaring in light of the changing nature of the notion of “platform expert”. Big Tech companies possess the best understanding of how their technology actually works; however, the type of “experts” they involve at the EU level are now policy professionals and lobbyists who may not possess this knowledge themselves; rather, they are the “faces” of the company, specifically trained to choose how much, and what kind of, information is fed into the policy process. This underscores the “formalisation” of platform governance, which has led to the “professionalisation” of the platforms’ engagement in EU policy, and the ambiguous definition of expertise by the European Commission, which often straddles the line between knowledge, advocacy and lobbying.

The lobbyists in the room

The presence of lobbyists and industry representatives in the EC’s expert groups is not an issue that concerns only digital policy (Robert, 2010), but it bears repeating. For one, it underlines the fundamental question of who is an expert and whether evidence-based policy should be built upon objective knowledge and facts, or in a more deliberative manner. Some experts involved in the Code, especially those coming from civil society and academia, were frustrated by the presence of industry lobbyists – for example, Interviewee 13 claimed “they are not expert groups, they are stakeholder groups”. Indeed, the presence of industry lobbyists in expert groups raises questions about conflicts of interest and the extent to which policy outcomes may be influenced by corporate agendas. This blurring of the lines between expertise and advocacy underscores the need for greater transparency and scrutiny in the selection of participants. The presence of such “industry experts”, however, is not a problem in itself – their insights can be extremely valuable, if provided in good faith, and give legitimacy to a more deliberative mode of expertise in policy.

The bigger concerns among the interviewees were those of political capture and instrumentalisation in expert deliberations, particularly when diverse interests and agendas were at play. The outsize influence and massive lobbying spending of Big Tech in the EU, which also covers their participation in various relevant expert groups and consultation processes, is well-documented (Kergueno et al., 2021; Bank et al., 2021; Corporate Europe Observatory, 2023a; Massoglia, 2024). This is the case for relevant digital regulations such as the GDPR (Kayali & Manancourt, 2021), the DSA (Goujard, 2022), and even the AI Act (Corporate Europe Observatory, 2023b). It therefore comes as no surprise that themes of lobbying and business capture featured prominently in the interviews, as a significant contributor to the feeling of “powerlessness” in the face of better-resourced and better-organised platforms. The platform lobbyists were often able to dominate discussions against the somewhat uncoordinated, loose grouping of other experts representing diffuse interests. While we cannot confirm whether this was compounded by some experts being, as Interviewee 17 puts it, “in the pocket of Big Tech”, this state of affairs underscores an undeniable and systemic resource asymmetry between the European Commission, experts and the platforms.

Systemic resource asymmetries

A significant and widespread limitation that emerged is the systemic resource asymmetry: the different levels of resources available to the Commission, the experts and the platforms, not only in terms of funding, but also in terms of human resources, accumulated knowledge, and accessible information and data.

The financial resource asymmetry is the most immediately obvious one, reflected, of course, in lobbying budgets. At the same time, the experts’ work is completely voluntary and unpaid, with the rare exception of expert group rapporteurs. This leaves platforms with outsize influence and leverage in the policy process. As a matter of fact, much of the Code process relied on the platforms’ funding. Notably, the pilot test for the development of Structural Indicators was funded by platforms, which also hesitated over and postponed the payment to TrustLab (the third party responsible for this pilot test).

The human resource asymmetry stems naturally from the financial one. On the one hand, platforms can afford to hire the best talent, lawyers and lobbyists to represent them, and to assign entire teams to work on a given policy issue, as happened with the Code of Practice. On the other hand, they already possess unrivalled technical experts specialised in the regulated products. The Commission does not have matching in-house technical expertise and would struggle to attract such talent when competing with Big Tech firms. This is another potential explanation for the under-representation of independent technical experts in the Code process overall, which was noted by several interviewees.

The information and knowledge asymmetry is, in turn, connected to human resources. Platforms and their experts possess in-depth, insider knowledge of all aspects of the algorithmic systems being regulated, including the vast amounts of big data they generate and analyse on a daily basis. Without this information being made available to third parties, the entire exercise of the Code of Practice risks losing its meaning, as the platforms’ reporting cannot be independently verified or assessed.

Data access and the ‘meta-regulatory’ role of researchers

While the information asymmetry puts the entire Code exercise at risk of failure, it is also, in theory, the one area most easily fixable, as it entails little upfront cost for the platforms. However, this has proven extremely challenging since the inception of the Code of Practice, as demonstrated by the ongoing clashes over access to data for researchers. The experts are well aware of the paramount value of data for research purposes, with discussions about the need for European funding and privileged access to study social media. Another perspective suggests that major tech companies are reluctant to provide unrestricted access to data: they are thought to be deliberately limiting researchers’ access, recognising that such access would expose sensitive information, and to be adopting obstructionist tactics while maintaining a facade of cooperation.

The perceived tensions and adversity between experts and platforms regarding access to data are likely to be further exacerbated with the DSA coming into full force and the Code of Practice potentially becoming a Code of Conduct. This transition should essentially enshrine the access-to-data provisions and make them enforceable, which we theorise will strengthen the meta-regulatory role of experts from academic and civil society research organisations. In this role, experts will take an even more active part in platform governance – in the monitoring, evaluation, auditing and enforcement of the Code of Practice and the DSA. As mentioned above, the experts recognise how important this could be for regulating platforms, with Interviewee 2 remarking: “If we, as researchers, had access to the same data that the platforms have access to, then we would not need the Code of Practice.” Therein lies the issue: while access to data has the potential to resolve the problem of information asymmetry, it ultimately hinges upon practical implementation. The truth is that wide access to data is incompatible with platforms’ interests, and we can therefore likely expect increasing distrust and opposition between platforms and experts/researchers, which perhaps only the Commission’s enforcement of the rule of law can settle to allow meaningful cooperation/co-regulation. This brings Article 40 of the DSA, which stipulates access to data for researchers, to the fore as a key provision, the implementation and enforcement of which can “make or break” European content moderation governance.

Conclusion

The engagement of experts in EU platform governance, particularly within the framework of the Code of Practice on Disinformation, highlights evolving dynamics and persistent challenges. The formal structures and roles of expertise in platform governance are undergoing significant shifts. The European Commission, by taking charge of platform governance even in the initial self-regulatory stages, opened the door to a wider range of “experts”, reflecting an evolution of “expertise” – from technologists to policymakers, from civil society organisations focused on digital rights to broader human rights groups, media representatives, and other stakeholders. This diversification underscores the fluid and contested notion of what it means to be a “platform governance expert”. The concept remains ambiguous, encompassing not only those who specialise directly in platform governance but also individuals from diverse sectors who bring valuable insights based on their interactions with platforms. This inclusivity enriches the discourse but also complicates the delineation of expertise in this rapidly changing field.

Ultimately, the core issue is not a lack of expertise or experts but the institutional mechanisms – or lack thereof – for convening these voices in a transparent, structured, and constructive manner. Our study highlights structural problems, such as political capture, non-transparent selection processes, and lobbying influence, which complicate expert engagement and endanger the effectiveness of the policy solutions developed. These structural challenges of co-regulation point to the systemic resource asymmetry between platforms and EU institutions. Platforms, armed with substantial resources and exclusive knowledge of their own systems, possess the ability to coordinate effectively among themselves, anticipate regulatory moves, and maintain a level of control over the narrative, even by instrumentally exploiting experts to legitimise their policies and actions. This power dynamic tilts the scale in favour of platforms, potentially compromising the efficacy and impartiality of regulatory efforts. The resource imbalance extends beyond the Commission’s relationship with platforms to encompass civil society organisations and independent experts. Limited funding, time constraints, and capacity issues hinder meaningful participation in policy-making processes, exacerbating imbalances in expertise and representation. Without adequate resources and support, independent experts from non-governmental and academic sectors struggle to contribute effectively, undermining the legitimacy and effectiveness of platform governance initiatives.

In conclusion, governing the content moderation of online platforms requires navigating a complex landscape fraught with challenges. While the EU's efforts towards co-regulation represent a step in the right direction, critical issues threaten the viability of such collaborative governance models. Addressing these challenges requires enhanced transparency, accountability, and inclusivity in digital governance processes to maximise the impact of expert knowledge and to ensure effective regulation of online platforms.

References

Altay, S., Berriche, M., Heuer, H., Farkas, J., & Rathje, S. (2023). A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-119

An, J., & Yoo, I. T. (2019). Internet governance regimes by epistemic community: Formation and diffusion in Asia. Global Governance, 25(1), 123–148. https://doi.org/10.1163/19426720-02501008

Van Audenhove, L., & Donders, K. (2019). Talking to people III: Expert interviews and elite interviews. In H. Van den Bulck, M. Puppis, K. Donders, & L. Van Audenhove (Eds.), The Palgrave handbook of methods for media policy research. Palgrave Macmillan. https://doi.org/10.1007/978-3-030-16065-4_10

Bank, M., Duffy, F., Leyerdecker, V., & Silva, M. (2021). The lobby network – big tech’s web of influence in the EU. https://corporateeurope.org/sites/default/files/2021-08/The%20lobby%20network%20-%20Big%20Tech%27s%20web%20of%20influence%20in%20the%20EU.pdf

Boswell, C. (2008). The political functions of expert knowledge: Knowledge and legitimation in European Union immigration policy. Journal of European Public Policy, 15(4), 471–488. https://doi.org/10.1080/13501760801996634

Boswell, C. (2009). Knowledge and policy. In The political uses of expert knowledge: Immigration policy and social research (1st ed., pp. 233–251). Cambridge University Press. https://doi.org/10.1017/CBO9780511581120

Brogi, E., & De Gregorio, G. (2024). From the code of practice to the code of conduct? Navigating the future challenges of disinformation regulation. Journal of Media Law, 16(1), 38–46. https://doi.org/10.1080/17577632.2024.2362480

Christensen, J. (2021). Expert knowledge and policymaking: A multi-disciplinary research agenda. Policy & Politics, 49(3), 455–471. https://doi.org/10.1332/030557320X15898190680037

Chystoforova, K., & Reviglio, U. (2024). EDMO experts’ feedback on structural indicators for the EU code of practice on disinformation. European University Institute. https://op.europa.eu/en/publication-detail/-/publication/3ebebaa2-4eec-11ef-acbc-01aa75ed71a1/language-en

Corporate Europe Observatory. (2023a). Lobbying power of Amazon, Google and co. https://corporateeurope.org/en/2023/09/lobbying-power-amazon-google-and-co-continues-grow

Corporate Europe Observatory. (2023b). Byte by byte. How big tech undermined the AI Act. https://corporateeurope.org/en/2023/11/byte-byte

Douek, E. (2022). Content moderation as systems thinking. Harvard Law Review, 136(2), 526–607.

European Commission. (2016). Commission decision establishing horizontal rules on the creation and operation of Commission expert groups (Decision No. C(2016)3301). https://ec.europa.eu/transparency/documents-register/detail?ref=C(2016)3301&lang=en

European Commission. (2018). Code of Practice on disinformation. https://digital-strategy.ec.europa.eu/en/library/2018-code-practice-disinformation

European Commission. (2020). Assessment of the Code of Practice on disinformation – achievements and areas for further improvement [Working document]. https://digital-strategy.ec.europa.eu/en/library/assessment-code-practice-disinformation-achievements-and-areas-further-improvement

European Commission. (2022). Strengthened Code of Practice on disinformation. https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation

European Commission. Directorate General for Communications Networks, Content and Technology. (2018). A multi-dimensional approach to disinformation: Report of the independent High level Group on fake news and online disinformation. Publications Office. https://data.europa.eu/doi/10.2759/739290

European Commission. Directorate-General for Joint Research Centre. (2022). Supporting and connecting policymaking in the Member States with scientific research [Working document]. Publications Office. https://joint-research-centre.ec.europa.eu/jrc-news-and-updates/evidence-informed-policymaking-new-document-foster-discussion-better-use-scientific-knowledge-policy-2022-10-26_en

European Digital Media Observatory (EDMO). (n.d.). Implementation of the Code of Practice on Disinformation: Lessons from the assessments and proposals for the future [Workshop report]. https://cadmus.eui.eu/bitstream/handle/1814/70916/Workshop_2020_EDMO.pdf?sequence=1&isAllowed=y

European Regulators Group for Audiovisual Media Services (ERGA). (2020). ERGA report on disinformation: Assessment of the implementation of the Code of Practice. https://erga-online.eu/wp-content/uploads/2020/05/ERGA-2019-report-published-2020-LQ.pdf

Finck, M. (2017). Digital regulation: Designing a supranational legal framework for the platform economy. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2990043

Flyvbjerg, B. (2001). Making social science matter: Why social inquiry fails and how it can succeed again (S. Sampson, Trans.; 1st ed.). Cambridge University Press. https://doi.org/10.1017/CBO9780511810503

Gillespie, T. (2019). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. https://doi.org/10.12987/9780300235029

Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407

Goujard, C. (2022, October 14). Big tech accused of shady lobbying in EU Parliament. POLITICO. https://www.politico.eu/article/big-tech-companies-face-potential-eu-lobbying-ban/

Grundmann, R. (2017). The problem of expertise in knowledge societies. Minerva, 55(1), 25–48. https://doi.org/10.1007/s11024-016-9308-7

Gundersen, T. (2018). Scientists as experts: A distinct role? Studies in History and Philosophy of Science Part A, 69, 52–59. https://doi.org/10.1016/j.shpsa.2018.02.006

Harcourt, A., Christou, G., & Simpson, S. (2020). Global standard setting in internet governance (1st ed.). Oxford University Press. https://doi.org/10.1093/oso/9780198841524.001.0001

Harvey, W. S. (2011). Strategies for conducting elite interviews. Qualitative Research, 11(4), 431–441. https://doi.org/10.1177/1468794111404329

Helberger, N., Pierson, J., & Poell, T. (2018). Governing online platforms: From contested to cooperative responsibility. The Information Society, 34(1), 1–14. https://doi.org/10.1080/01972243.2017.1391913

Holst, C., Molander, A., & Christensen, J. (2022). Expertise, policy-making and democracy (1st ed.). Routledge. https://doi.org/10.4324/9781003106555

Kayali, L., & Manancourt, V. (2021, February 10). How Europe’s new privacy rules survived years of negotiations, lobbying and drama. POLITICO. https://www.politico.eu/article/europe-privacy-rules-survived-years-of-negotiations-lobbying/

Kergueno, R., Aiossa, N., Pearson, L., Corser, N. S., Teixeira, V., & Hulten, M. (2021). Deep pockets, open doors: Big tech lobbying in Brussels. https://transparency.eu/wp-content/uploads/2024/10/Deep_pockets_open_doors_report.pdf

Krick, E., & Gornitzka, Å. (2024). Tracing scientisation in the EU Commission’s expert group system. Innovation: The European Journal of Social Science Research, 37(2), 319–339. https://doi.org/10.1080/13511610.2020.1811649

Lewis, D., Rahman, M. F., Twinomuhangi, R., Haque, S., Huq, N., Huq, S., Ribbe, L., & Ishtiaque, A. (2023). University-based researchers as knowledge brokers for climate policies and action. The European Journal of Development Research, 35(3), 656–683. https://doi.org/10.1057/s41287-022-00526-0

Littig, B. (2009). Interviewing the elite – interviewing experts: Is there a difference? In A. Bogner, B. Littig, & W. Menz (Eds.), Interviewing experts (pp. 98–113). Palgrave Macmillan UK. https://doi.org/10.1057/9780230244276_5

Mărcuț, M. (2017). Crystalizing the EU digital policy. In M. Mărcuț, Crystalizing the EU digital policy (pp. 109–176). Springer International Publishing. https://doi.org/10.1007/978-3-319-69227-2_4

Massoglia, A. (2024, January 26). State and federal lobbying spending tops $46 billion after federal lobbying spending broke records in 2023. OpenSecrets News. https://www.opensecrets.org/news/2024/01/state-and-federal-lobbying-spending-tops-46-billion-after-federal-lobbying-spending-broke-records-in-2023/

Metwally, A. (2022). The governors’ advisors: Experts and expertise as platform governance. Yale Journal of Law & Technology, 24, 510–540.

Metz, J. (2015). The European Commission, expert groups and the policy process: Demystifying technocratic governance. Palgrave Macmillan. https://doi.org/10.1057/9781137437235

Mündges, S., & Park, K. (2024). But did they really? Platforms’ compliance with the code of practice on disinformation in review. Internet Policy Review, 13(3), 1–21. https://doi.org/10.14763/2024.3.1786

Nenadic, I. (2020). Implementation of the Code of Practice on Disinformation: Lessons from the assessments and proposals for the future. https://hdl.handle.net/1814/70916

Nenadic, I., Brogi, E., & Bleyer-Simon, K. (2023). Structural indicators to assess effectiveness of the EU’s Code of Practice on Disinformation [Working paper]. European University Institute. https://cadmus.eui.eu/handle/1814/75558

Park, K., & Mündges, S. (2023). CoP monitor. Baseline reports. https://fujomedia.eu/site/assets/files/1990/cop-monitor-report.pdf

Pielke, R. A., Jr. (2007). The honest broker: Making sense of science in policy and politics (1st ed.). Cambridge University Press. https://doi.org/10.1017/CBO9780511818110

Price, M. E., & Verhulst, S. G. (2005). Self-regulation and the internet. Kluwer Law International B.V. https://law-store.wolterskluwer.com/s/product/selfregulation-and-the-internet/01t0f00000J3b17AAB

Ramot, R., & Bialik, G. (2020). Researchers as knowledge brokers: A step toward research-informed policy? Lessons from the Israeli case. Education Policy Analysis Archives, 28, 1–25. https://doi.org/10.14507/epaa.28.5115

Robert, C. (2010). Who are the European experts? Profiles, trajectories and expert ‘careers’ of the European Commission. French Politics, 8(3), 248–274. https://doi.org/10.1057/fp.2010.13

Roberts, S. T. (2017). Content moderation. In L. A. Schintler & C. L. McNeely (Eds.), Encyclopedia of big data (pp. 1–4). Springer International Publishing. https://doi.org/10.1007/978-3-319-32001-4_44-1

Schmulow, A., Hauser, J., & Alemanno, A. (2022). Constructing an EU ethics oversight authority. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4298158

Spruijt, P., Knol, A. B., Vasileiadou, E., Devilee, J., Lebret, E., & Petersen, A. C. (2014). Roles of scientists as policy advisers on complex issues: A literature review. Environmental Science & Policy, 40, 16–25. https://doi.org/10.1016/j.envsci.2014.03.002

Valin, N., & Huitema, D. (2023). Experts as policy entrepreneurs: How knowledge can lead to radical environmental change. Environmental Science & Policy, 142, 21–28. https://doi.org/10.1016/j.envsci.2023.01.013

Volmari, S., Kauko, J., Anturaniemi, J., & Santos, Í. (2022). Evidence and expert power in Finnish education policy making: The national core curriculum reform. In B. Karseth, K. Sivesind, & G. Steiner-Khamsi (Eds.), Evidence and expertise in Nordic education policy (pp. 115–148). Springer International Publishing. https://doi.org/10.1007/978-3-030-91959-7_5

World Economic Forum. (2024). The global risks report 2024 (Insight Report No. 19). https://www3.weforum.org/docs/WEF_The_Global_Risks_Report_2024.pdf

Yin, R. K. (2018). Case study research and applications: Design and methods. SAGE Publications. https://uk.sagepub.com/en-gb/eur/case-study-research-and-applications/book250150

Appendix A. Interview guide

  1. As you know, the DSA is coming into force in February, with the potential for the Code of Practice on Disinfo to become a Code of Conduct. In your role as an expert and “observer” on the Code, how would you evaluate the Code’s progress up to this juncture? What do you see in the near future of the Code - what will become better/what will be the main challenges?
    1. (Only for HLEG Members) Given your role in the HLEG, which gave the original recommendation to develop the Code: how does it compare to what was being discussed at the time? Did it match your personal expectations?
    2. Which element of the Code would you consider most successful so far? Basically, what was going right? And vice versa, what were some of the main failings?
    3. What would you change, knowing what you do now, in how the Code was designed and implemented?
    4. Who or what would you consider to be the main drivers of both the successes and failures of the Code?
  2. The Code, much like the DSA, devotes a big role to experts/researchers in monitoring and implementation of the Code. How effective in your perception has this policy tool been in meaningfully engaging this community, yourself included? Has this process lived up to your expectations?
    1. Do you think expert knowledge/inputs have been successfully integrated into the Code, its monitoring and implementation? Why or why not?
  3. Beyond participating in this expert group, how do you engage in EU digital policy/platform governance/content moderation policy in your capacity as an “expert” or “researcher”?
    1. (Only for researchers) Do you see informing policy as one of the goals of your research? Why or why not?
    2. Do you think expert knowledge and research in the field of disinformation is effectively translated into policy (whether it be yours, or knowledge and research more broadly)?
  4. In your opinion, how can the engagement of experts/researchers in EU digital policy/platform governance/content moderation policy be improved moving forward?

Appendix B. List of interviewees

No. Name – Role
1 Angeliki Monnier – Academic, SI group
2 Kalina Bontcheva – Academic, SI group
3 Minna Horowitz – Academic, SI group
4 Malin Carlberg – Consultant, SI group
5 Carl Miller – Think tanker, researcher, SI group
6 Wout van Wijk – Media lobbyist, HLEG
7 Mato Brautović – Academic, SI group
8 Peter Kreko – Academic, think tanker, SI group
9 Francesca Arcostanzo – Civil society researcher, ISD, SI group
10 Mauritius Dorn – Civil society public policy officer, ISD
11 Anda Rozukalne – Academic, HLEG
12 Nicoleta Corbu – Academic, SI group
13 Monique Goyens – Civil society, consumer protection, HLEG
14 Karina Stasiuk-Krajewska – Academic, SI group
15 Anonymous – Academic, HLEG
16 Ziga Turk – Academic, HLEG
17 Olaf Steenfadt – Civil society/media, SI group, HLEG
18 Raegan MacDonald – Former Mozilla, now civil society, HLEG
19 Tommaso Valletti – Academic, former Chief Economist at DG Competition
20 Gianni Riotta – Journalist/academic, previous HLEG on disinformation
21 Wouter Gekiere – Head of Brussels Office, European Broadcasting Union; member of the Code Sounding Board
22 Iva Nenadic – Academic, EDMO
23 Paolo Cesarini – Policymaker, EDMO
24 Konrad Bleyer-Simon – Academic, EDMO
List of original interviews with the Group of Experts on Structural Indicators
0.1 Mato Brautović – Academic
0.2 Francesca Arcostanzo – Civil society researcher
0.3 Carl Miller – Think tanker, researcher
0.4 Minna Horowitz – Academic
0.5 Raluca Buturoiu – Academic
0.6 Nicoleta Corbu – Academic
0.7 Karina Stasiuk-Krajewska – Academic
0.8 Angeliki Monnier – Academic
0.9 Peter Kreko – Academic, think tanker
0.10 Juliane von Reppert-Bismarck – Civil society
0.11 Trisha Meyer – Academic
0.12 Eileen Culloty – Academic
0.13 Pietro Tesfamariam – Media analyst
0.14 Olaf Steenfadt – Civil society/media
0.15 Stephan Mündges – Academic
0.16 Malin Carlberg – Consultant
0.17 Kalina Bontcheva – Academic