Facebook, the EU and Russia’s war: Challenges of moderating authoritarian news
Abstract
Social networking sites are becoming an increasingly important channel for the transnational transmission of information. However, their role as intermediaries of autocratically controlled information environments reaching audiences outside the autocratic country's borders has not been extensively studied. This study investigates the consequences of Russia’s full-scale invasion of Ukraine for Russia’s Russian-language news ecosystem on Facebook, which, as of June 2025, mostly comprises state propaganda targeting predominantly an audience outside of Russia. Analysing data from Meta's CrowdTangle API, encompassing 646,230 unique posts from 2,791 public Facebook spaces, it focuses on the distribution of Russia’s top 72 Russian-language news domains (critical/uncritical towards the ongoing war) six months before and after the invasion (from 24 August 2021 to 24 August 2022). The results indicate a significant decline in posts containing uncritical news domains post-invasion. Nevertheless, uncritical news domains continue to be disseminated more widely than critical ones, indicating a persistence of Russian state propaganda visibility on Facebook. The study highlights the complexities of platform governance, specifically with regard to the implementation of EU policies on non-EU social networking sites and Facebook's limited efforts to demote Russian state-aligned news, and calls for greater attention to regional language contexts in content moderation.
Introduction
As of June 2025, Russia’s full-scale war against Ukraine is still ongoing and being waged through the strategic use of military and media power (Rodgers & Lanoszka, 2023). Before the full-scale war, extant research indicated that audiences residing outside of Russia consumed news content not only from the country's foreign news media but also from Russian-language domestic ones, particularly in the former Soviet countries (see, e.g., Rotaru, 2018; Vihalemm, Juzefovičs, & Leppik, 2019), but also in “Western” countries, such as Germany (e.g., Golova, 2020). Specifically, Russian-state-aligned Russian-language domestic news outlets have utilised Facebook to disseminate information uncritical of Russia’s political leadership primarily to foreign audiences (Kling, 2022).
Russia’s full-scale invasion led to several measures potentially affecting the dissemination of the country’s news content on Facebook. First, the Council of the European Union (EU) (2024) banned the spread of content tied to "Kremlin-backed disinformation outlets" (n.p.) in the EU starting in March 2022. Russia’s main foreign broadcasters, RT and Sputnik, were restricted immediately after the start of the invasion (early March 2022), followed by the domestic news outlets vesti.ru and tvc.ru in June 2022.1 Second, Facebook closed down advertisement and monetisation options for Russian state-controlled media shortly after the start of the invasion. This limited their ability to promote content, attract new audiences, and generate revenue on the platform, likely reducing these outlets' growth and influence on Facebook. Facebook also labelled some Russian news outlets as “state-controlled”, a label that was found to significantly reduce the reach of affected posts (Aguerri et al., 2022). After the EU sanctions’ implementation, Facebook, moreover, gradually restricted accounts affiliated with the EU-sanctioned outlets for users in the EU (Meta, 2022), which likely further reduced the dissemination of content from the affected news outlets. Third, the Russian Ministry of Justice classified Meta as an “extremist organization”, resulting in its access being blocked by the country's Internet Service Providers (ISPs). It is estimated that, as a consequence, Facebook lost revenue in Russia accounting for "1.5% of Meta's advertising dollars", about USD 4.7 million per day (Brown, 2022), as of March 2022. Nevertheless, personal usage of Facebook in Russia remains permissible and the platform can still be accessed through various means, such as Virtual Private Networks (VPNs).
Also, domestic news outlets, including state-controlled ones, are theoretically permitted to use the network to disseminate their content with no legal restrictions, provided they forgo Facebook’s commercial tools.
While Facebook has previously implemented bans on state-controlled accounts in response to military violence, such as its prohibition of accounts associated with the Myanmar military following its seizure of state control in February 2021 (Associated Press, 2022), there is no historical precedent for measures of this magnitude aimed at governing content from an authoritarian state specifically on Facebook within the EU context. Nevertheless, to the best of our knowledge, no academic study has yet focused on the role of Facebook as a stakeholder in Russia’s informational influence through Russian-language news content since the start of Russia’s full-scale invasion.
To address this gap, this study focuses on the change in the distribution of Russia’s Russian-language news content since the start of Russia’s full-scale invasion by studying the distribution of Uniform Resource Locators (URLs) including 72 domains of the top Russian-language news outlets in Russia with varying levels of criticism towards Russia's war (critical [N=12]/uncritical [N=60]) on Facebook's public spaces (pages, groups, and public profiles) six months before and after the commencement of the invasion. The study uses data from Meta's Crowdtangle Application Programming Interface (API), which includes 646,230 distinct posts from 2,791 public Facebook spaces.
While the results reveal a significant decrease in the number of posts from Facebook's public spaces containing Russia's uncritical news domains after the start of the invasion, they also show that Facebook did not fully censor pro-Kremlin content in the Russian language. Specifically, we find that Facebook labelled only outlets controlled by the Russian state through direct ownership, which allowed unlabelled outlets to continue spreading content that justifies the war. Also, based on the number of posts containing Russia’s news domains, uncritical news domains were still more visibly disseminated (157,641 posts) than critical news domains (45,090 posts). Thus, our study underscores the complexity inherent in the governance of platforms2, with a particular focus on non-EU-based social networking sites (SNSs)3, a prime example of which is Facebook. Our findings demonstrate that such complexity enables autocracies, such as Russia, to persist in manipulating information environments beyond their borders in their domestic language, despite the implementation of numerous measures specifically designed to counter such behaviour.
In the context of the ongoing war, content regulation policies by the EU, Facebook and Russia remain relatively unchanged. The EU has imposed sanctions on additional Russian news outlets, while Facebook has initiated a process to relax its content moderation policies, starting in the US in January 2025. As Meta's restrictions linked to Russia's war against Ukraine have remained unaltered since 2022, subsequent investigations are anticipated to yield analogous outcomes concerning the governance of news environments from authoritarian non-Anglophone language contexts.
Our paper is structured in the following way. After reviewing the literature on SNSs as information intermediaries of authoritarian countries’ news, particularly concerning Facebook, we present extant findings on the spread of Russia’s propaganda on Facebook since the full-scale invasion. Next, we describe our methodological approach, followed by our findings. Finally, we discuss these findings against the research literature on autocratic information dissemination on SNSs and Russia’s news ecosystem on the platform.
Facebook as a mediator of authoritarian news environments
The dissemination of dis- and misinformation, especially originating from authoritarian states, hate speech and incivility on SNSs has prompted democratic countries to discuss how to govern platforms (Gorwa, 2019), including their “platform power” (Naughton, 2018) over national affairs. As Gorwa (2019) suggests, the governance of platforms is negotiated between different parties, including “platform companies, users, advertisers, governments, and other political actors” (Gorwa, 2019, p. 854). SNSs rely primarily on their international community guidelines and terms of service standards as the basis for content moderation (see Gillespie, 2018) and depend on human and computational actors to govern information (Ahn et al., 2023). This practice has been the subject of considerable criticism for adhering to economic guidelines and popularity, rather than to journalistic standards (see DeVito, 2017; Nechushtai & Lewis, 2019). Also, among other channels of disinformation, such as fake websites, the practice has been criticised for altering the flow of political communication, thereby challenging democratic processes and citizens' trust in democratic institutions (Bennett & Livingston, 2018).
Consequently, governments have expressed a desire for platforms to implement content moderation regulations that are more effectively aligned with national policies (e.g., the Digital Services Act [DSA] in the EU context) and, thus, comply better with national political speech regulations (Ahn et al., 2023). Nevertheless, as argued by Ahn et al. (2023), the governance of harmful information distribution is still a matter of ongoing negotiation. For example, they found that Facebook formally adheres to the national policies of the United States, Germany, and South Korea, but constantly “steer[s] users away from the local systems and towards its centralized operations” (Ahn et al., 2023, p. 1).
Russia’s use of SNSs to reach foreign, primarily anglophone, audiences with state-aligned propaganda had already emerged as an important area of academic interest before Russia’s full-scale invasion of Ukraine. For example, extant research has extensively studied the activities of bots and trolls, such as the Internet Research Agency (IRA), on Twitter, Facebook and Instagram (see, e.g., Bastos & Farkas, 2019; DiResta et al., 2022; Doroshenko & Lukito, 2021; Golovchenko et al., 2020) and the dissemination of content from Russia’s foreign news outlets, RT and Sputnik, on Facebook (see Kuznetsova & Makhortykh, 2023).
Nevertheless, research in this area remains underrepresented in non-anglophone contexts, where the operational dynamics of US-based platforms, such as Facebook, manifest in distinctive ways and exert considerable influence over socio-political phenomena. Illustratively, the UN fact-finding mission stated that Facebook played a “significant” (Human Rights Council, 2018, p. 14) role in the Rohingya genocide in Myanmar in 2017. Sablosky (2021) documented the platform’s development from its initial role as “a tool for democratic liberalisation” (p. 1017) to its current status as “an extraterritorial political authority beset with radical new normative powers” (p. 1021). Similarly, in India (Facebook's largest market by number of users)4, Facebook in 2020 permitted hate speech by politicians of the ruling Hindu nationalist Bharatiya Janata Party (BJP) to remain on its platform, in contravention of its policies, to avoid offending the BJP (Purnell & Horwitz, 2020).
The news media of authoritarian countries can influence foreign audiences with an authoritarian state's vision of domestic and global politics. For instance, in 2014, the Ukrainian government restricted Russian and pro-Russian news outlets due to the concern that their content could impact domestic audiences’ “perceptions of reality” (Szostek & Orlova, 2024, p. 1). Based on survey data from September until October 2021, Szostek & Orlova (2024) found that the use of restricted news outlets was indeed linked to mistaken beliefs about the truthfulness of headlines and lower support for a democratic Ukraine. Nevertheless, at present, there is a paucity of research investigating the role of SNSs in general, and Facebook in particular, in the transnational dissemination of non-Anglophone news controlled by authoritarian countries, such as Russia (for a notable exception related to Russia-based SNS VK, see Golova, 2020).
One of the few studies addressing the transnational reach of Russia’s Russian-language domestic news has demonstrated, based on Facebook’s marketing and follower data, that Facebook served as a transmission channel for state-aligned news to audiences outside of Russia just one month before the start of Russia’s full-scale invasion. This is evidenced by Kling’s (2022) finding that 40% of the 50 top Russian Russian-language domestic news outlets selected for the study (in February 2022) had more than 50% of their audience outside of Russia, with the largest audiences found in former Soviet countries.5 The number of Facebook users among former Soviet countries may differ, because Facebook has been less popular than Russia’s SNSs, VK and Odnoklassniki, in Central Asia (Reyaz, 2020), and some countries took measures to reduce Russia’s informational influence already before Russia’s full-scale invasion (for instance, Ukraine banned Russia’s SNSs in 2017 [Roth, 2017]). Nevertheless, across all former Soviet countries under investigation, Kling (2022) found that Russia’s domestic news outlets uncritical of Russia’s political leadership had larger audiences than critical ones. To contribute to this line of research, this study examines Facebook's role in moderating the transnational dissemination of content from the top Russian-language news outlets in Russia (as of May 2022) six months into Russia’s full-scale war.
Russia’s propaganda on Facebook since the full-scale invasion
Prior to Russia’s full-scale invasion of Ukraine, Facebook in Russia was perceived as a forum for both oppositional and state-controlled entities. Around the time of the failed protests of 2011-2012 in Russia, Western SNSs, and Facebook in particular, were described as a “liberal-oppositional echo chamber” (Bodrunova et al., 2021, p. 2932), which arguably played an important role in nurturing and galvanising the protest (Bodrunova & Litvinenko, 2013) against the falsified parliamentary elections and was linked to an increased awareness of election fraud (Reuter & Szakonyi, 2015). Nevertheless, when the Russian authorities in December 2014 demanded the blocking of a page that allowed supporters of one of the opposition leaders, the late Alexei Navalny, to register for a protest rally against his imminent criminal conviction and receive alerts about the event, Facebook blocked access to this page inside Russia; only after a "huge outcry in the Western media" (Sanovich, 2017, p. 13) did it refuse to block new copies of the page.
Even before it was designated an extremist organisation, Facebook was never the sole platform for news dissemination in Russia. It faced strong competition from domestic platforms such as VK, as well as international services like YouTube and Telegram and also Meta’s own Instagram (due to its visual format). However, Meta stands out because it is the first major tech company to be officially labelled as extremist by the Russian authorities, unlike Google, which continues to operate YouTube in the country at the time of writing (June 2025). This makes Facebook a prime case study for examining how an authoritarian state exerts pressure on SNSs.
Russia’s full-scale invasion of Ukraine led to a number of measures potentially affecting the dissemination of Russia’s news on Facebook until August 2022 (the timeframe this study focuses on) as outlined in the introduction. Applying the DSA’s rules, the European Commission (2023) published a study on the distribution of Kremlin disinformation across major SNSs over almost a year before and after the start of the full-scale invasion of Ukraine (from December 2021 to December 2022). Specifically for Facebook, it finds that even though the reach (subscriber count) of selected pro-Kremlin accounts grew, their publishing activity and user engagement decreased significantly. Similarly, Okholm et al. (2024) show that the sharing of URLs related to Russia’s foreign news outlets RT and Sputnik dropped in Facebook’s fringe communities. Nevertheless, pro-Kremlin content is accessible on Facebook outside the EU and remains accessible even within the EU to some extent, as noted by the European Commission (2023). Also, a platform implementation study conducted by Glazunova et al. (2023), spanning from February to May 2022, revealed that the bans against RT and Sputnik were inconsistently implemented by Facebook across various EU countries, thereby rendering some EU countries' users more susceptible to Russian propaganda than others.
While the European Commission (2023) provides a comprehensive assessment of the effectiveness of EU sanctions against selected Russian news outlets, it does not focus specifically on news outlets but on pre-selected SNS accounts (including some news outlets) distributing pro-Kremlin content. The report thus covers different types of accounts spreading pro-Kremlin propaganda, but it does not consider the dissemination of content from popular Russian Russian-language news domains that omit information about Russia’s intentional attacks on the civilian population or civilian objects in Ukraine, or that disseminate pro-Kremlin propaganda. Moreover, the study does not consider the impact of the other measures highlighted in this study: Facebook’s labelling of specific news outlets and Russia’s legislation towards Meta. To create a differentiated picture of the impact of the different measures on distinct news outlets, as well as to measure their impact on visibility, we ask:
RQ1: Which of Russia’s news outlets’ visibility on Facebook was potentially impacted by which of the highlighted measures following Russia’s full-scale invasion?
Furthermore, the European Commission (2023) did not compare the distribution of war-uncritical with war-critical content. By contrast, in our study, we differentiate between “uncritical” news outlets, defined as those either omitting truthful information about Russia’s war atrocities and/or intentionally spreading pro-Kremlin propaganda, and “critical” news outlets, defined as those criticising Russia’s war. The most obvious example for our categorisation is the Bucha massacre of March 2022. If a news outlet questioned Russia’s army’s involvement in the massacre or omitted any information on it, we classified it as “uncritical”. If an outlet provided information about it, such as evidence of Russia’s army’s involvement or witness statements, we classified it as “critical” (see the Appendix for examples). Therefore, outlets that express some criticism of the Russian army but still deny its war atrocities were classified as “uncritical”. The idea behind this categorisation is that information from “critical” outlets could encourage readers to question the version of events provided by the Russian authorities and propaganda. The comparison of war-uncritical and war-critical content helps to understand what type of content Russian-speaking Facebook users are more likely to be exposed to. We note that we do not consider the potential negative framing of URLs in this study's analysis, as we aim to get a general picture of what type of content is most visibly distributed across the platform. Assuming that predominantly Kremlin-aligned outlets are targeted by the highlighted measures, we hypothesise:
H1a: The visibility of posts including Russia’s uncritical news domains on Facebook decreased after the start of Russia’s full-scale invasion.
H1b: The visibility of posts including Russia’s critical news domains on Facebook did not significantly change after the start of Russia’s full-scale invasion.
Additionally, to understand what content is specifically most visibly spread and what impact the EU sanctions and Meta’s ban inside Russia had with regards to Russia’s news ecosystem six months into Russia’s full-scale war, we ask:
RQ2: What news domains have been most visibly spread on Facebook six months into Russia’s full-scale invasion by criticism level (critical/uncritical)?
Scholars studying the dissemination of dis- and misinformation on SNSs have applied network analysis to better understand the dynamics of information amplification (see e.g., Angus et al., 2023; Gruzd et al., 2023). It can be assumed that the official pages of the selected news outlets were more strongly impacted by the highlighted measures. Thus, to better understand the dynamics of Russia’s informational control outside of official spaces on Facebook, we also utilise a network methodology to examine the dissemination of Russia's news six months into the full-scale invasion of Ukraine specifically in “amplifier spaces” (defined as public spaces that are not the official Facebook news pages of a news outlet). We ask:
RQ3: How are amplifier spaces of different types of Russia’s news (critical/uncritical) on Facebook connected six months into Russia’s full-scale invasion?
Method
Data collection
As a first step, we identified 72 domain names of the most influential (top) news outlets in Russia (in the Russian language) as of May 2022.6 In selecting the news outlets, we followed a systematic procedure developed in a previous study (Kling, 2022) that operationalised “influence” as (a) audience reach within Russia and (b) the number of citations by other news outlets and on SNSs, according to data from the Russian web analytics companies Liveinternet.ru (for audience reach) and Medialogia (for number of citations). For this study, we excluded news outlets focusing exclusively on the local (region or city) context. A list of the selected news domains can be found in Table 1 in the Appendix.
For the indicated domains, we gathered data from the links endpoint of Meta's CrowdTangle API on all posts in public Facebook spaces (public pages, groups and verified profiles) that published URLs with the selected news domains, using the Python library “Crowd”.7 The API provides data only from “public” Facebook accounts, including Facebook pages (with > 25,000 page likes and followers), public groups (with > 95,000 members), US-based public groups (with > 2,000 members) and all verified profiles. Thus, (semi-)private spaces and small public spaces are not included. For a detailed overview of the data collection process via Meta’s CrowdTangle API, see Angus et al. (2023). The data was collected for the period between 24 August 2021 and 24 August 2022. By choosing this timeframe, we aimed to see the direct consequences of the introduced measures and how far into the war pro-Kremlin propaganda continued to spread.
For our study, we flattened the data set to count every unique URL in a row of the column “ExtendedLinkOriginal”, which in some cases included an array of several URLs. We also excluded from our data set posts that did not include a URL (based on the “ExtendedLinkOriginal” column). For details on parameters, see the official CrowdTangle documentation.8 Our initial data set comprised 673,656 unique posts from 15,780 public spaces on Facebook. In the next step, we deleted public spaces that published URLs to the selected news domains in Russia fewer than 10 times within the timeframe (from 24 August 2021 to 24 August 2022). This step restricted the data set exclusively to public spaces that disseminated content from top news domains in Russia beyond a predetermined threshold during the period under investigation. After this cleaning step, the data set was reduced to 646,230 unique posts from 2,791 public spaces containing 72 distinct news domains. This data set was used for the analysis, in which the URL distribution of the period before Russia’s full-scale invasion (until 24 February 2022) was compared to the period after it (after 24 February 2022).
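The flattening and threshold-filtering steps can be sketched as follows (a minimal illustration in Python/pandas; the column name “ExtendedLinkOriginal” follows the CrowdTangle export, while the post identifiers, space identifiers and URLs below are invented toy data):

```python
import pandas as pd

# Toy post-level data; each "ExtendedLinkOriginal" cell may contain
# several URLs, or none (all values here are invented for illustration).
posts = pd.DataFrame({
    "post_id": [1, 2, 3],
    "space_id": ["A", "A", "B"],
    "ExtendedLinkOriginal": [
        ["https://ria.ru/x", "https://tass.ru/y"],
        [],                      # post without a URL: dropped below
        ["https://meduza.io/z"],
    ],
})

# Flatten: one row per URL; empty lists become NaN and are dropped.
flat = (posts.explode("ExtendedLinkOriginal")
             .dropna(subset=["ExtendedLinkOriginal"]))

# Keep only public spaces that shared the selected domains at least
# 10 times (with the toy data no space passes, so `filtered` is empty).
per_space = flat.groupby("space_id")["post_id"].count()
active = per_space[per_space >= 10].index
filtered = flat[flat["space_id"].isin(active)]
```

This is only a sketch of the cleaning logic under stated assumptions, not the study's actual pipeline.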
Data categorisation
We distinguished between three measures to create a differentiated picture of the change in the dissemination of URLs to the selected news domains on Facebook: 1) EU sanctions, 2) Facebook labelling and 3) Russia’s legislation. According to the official documents of the Council of the EU (2024), five news domains were affected by the sanctions. To identify whether a news domain was labelled in a post as “state-controlled”, one of the authors shared every sampled news domain on a private Facebook page in October 2023 and marked the domain as labelled if Facebook’s label showed.9 We identified 15 news domains that were affected by the labelling. To understand whether a news outlet's use of Facebook would be affected by Russia’s legislation (which designated the SNS an “extremist organisation”), the two authors studied which news outlets still operate predominantly from their editorial offices in Russia, based on publicly available information. Those identified as potentially affected (N=65) were anticipated to exhibit reduced activity in publishing content on their official Facebook pages. Notwithstanding the continued permissibility of personal usage of Facebook and the accessibility of the platform through various means, including VPNs, the SNS itself is blocked by Russia's ISPs. Consequently, Russia-based newsrooms will probably encounter technical and ideological challenges in publishing content on Facebook.
Additionally, the news domains were categorised based on their level of criticism towards Russia’s war against Ukraine (critical/uncritical). The two authors, with native Russian language skills and a thorough understanding of Russia’s news landscape, categorised the news outlets’ content on Facebook for the timeframe 24 February – 24 August 2022. For details on categorisation criteria, keywords and examples, see the theory section above and the Appendix. The two coders reached perfect inter-coder reliability (Krippendorff's α = 1). They categorised 60 news domains as uncritical and 12 as critical (see the Appendix, Table 1 for an overview).
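For reference, Krippendorff's alpha for nominal data can be computed from the coincidence matrix as α = 1 − D_o/D_e; the sketch below is our own minimal implementation (the function name and toy codings are assumptions; in practice, established reliability packages can be used), and it returns 1 for perfect agreement such as that reported here:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data with no missing values.
    `units` is a list of per-unit value lists (one value per coder)."""
    o = Counter()  # coincidence matrix o[(c, k)]
    for values in units:
        m = len(values)
        for c, k in permutations(values, 2):  # ordered pairs within a unit
            o[(c, k)] += 1 / (m - 1)
    n_c = Counter()                           # marginal totals per category
    for (c, _), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    d_obs = sum(w for (c, k), w in o.items() if c != k)   # observed disagreement
    d_exp = sum(n_c[c] * n_c[k]                            # expected disagreement
                for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 if d_exp == 0 else 1 - d_obs / d_exp

# Two coders agreeing on every unit yields alpha = 1.
alpha = krippendorff_alpha_nominal([["u", "u"], ["c", "c"], ["u", "u"]])
```

The toy codings ("u"/"c" for uncritical/critical) are illustrative only.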
Data analysis
We calculated three additional measurements to compare the distribution of URLs on Facebook’s public spaces: 1) total number of posts by Facebook space (page, group, public profile), 2) total number of posts by news domain and 3) average engagement by public space. For details on the measurement calculations, see the Appendix.
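These three measurements can be illustrated with a short pandas sketch (the column names and toy rows are assumptions for demonstration, not the actual CrowdTangle fields):

```python
import pandas as pd

# Toy post-level data; "space_id", "domain" and "engagement" are assumed names.
df = pd.DataFrame({
    "space_id": ["A", "A", "B"],
    "domain": ["ria.ru", "meduza.io", "ria.ru"],
    "engagement": [10, 30, 20],
})

# 1) total number of posts by Facebook space
posts_by_space = df.groupby("space_id").size()
# 2) total number of posts by news domain
posts_by_domain = df.groupby("domain").size()
# 3) average engagement by public space
avg_engagement_by_space = df.groupby("space_id")["engagement"].mean()
```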
For the analysis, we split the data set by the date of the start of Russia’s full-scale invasion of Ukraine (24 February 2022), resulting in a before data set including 439,191 unique posts and 2,629 unique public spaces and an after data set including 207,039 unique posts and 2,497 unique public spaces. While the before data set kept all 72 sampled news domains (uncritical [N=60], critical [N=12]), the after data set included only 71 unique domains (uncritical [N=59], critical [N=12]; ytro.news [an uncritical news outlet] ceased being shared). The descriptives demonstrate that the count of posts containing Russia’s top news domains decreased by approximately 50% after the start of Russia’s full-scale invasion, while the number of unique public spaces decreased only by about 5%. Thus, public spaces that had previously disseminated a considerable volume of URLs to the selected news domains appear to have markedly curtailed their dissemination.
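The before/after split can be sketched as follows (the post dates and column names are invented for illustration; 24 February 2022 is the cut-off date used in the study):

```python
import pandas as pd

# Toy posts around the cut-off date; all values are illustrative assumptions.
posts = pd.DataFrame({
    "post_id": [1, 2, 3],
    "date": pd.to_datetime(["2021-09-01", "2022-01-15", "2022-03-02"]),
})

cutoff = pd.Timestamp("2022-02-24")  # start of Russia's full-scale invasion
before = posts[posts["date"] < cutoff]
after = posts[posts["date"] >= cutoff]
```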
Findings
Change in Russia’s news dissemination
For RQ1, our research reveals a discrepancy in what outlets are affected by the distinct measures linked to the SNS (Facebook) and state actors (the EU, Russia). This discrepancy allowed news outlets whose domains remained unlabelled by Facebook, for example, to continue spreading content that justified Russia's war during its first six months (see Table 1 in the Appendix for an overview). Specifically, the distribution of only three news domains (russian.rt.com, radiosputnik.ria.ru, vesti.ru) was potentially impacted by all three measures. One domain (tvc.ru) was sanctioned by the EU, but not labelled by Facebook. 12 news domains (ukraina.ru, pnp.ru, inosmi.ru, rambler.ru, gazeta.ru, tvzvezda.ru, ria.ru, rg.ru, tass.ru, lenta.ru, 1tv.ru, smotrim.ru) were labelled by Facebook but not sanctioned by the EU during the researched period (until 24 August 2022). Only seven news domains (holod.media, theins.ru, meduza.io, bbc.com/russian, dw.com/ru, svoboda.org, currenttime.tv) were found potentially not to be impacted by Russia’s legislation against Facebook, as most of their employees were located outside of Russia.
Specifically for the group of Russia's uncritical news domains, the study reveals a significant decrease in the average total number of posts per domain after the start of Russia's full-scale invasion (N=59, M=3,555, SD=6,604) compared to the period before it (N=60, M=7,558, SD=11,474), based on Welch's t-test not assuming equal variances (t(117) = 2.3, p = .022), thus supporting H1a with regard to activity. By contrast, the average total number of posts including Russia's critical news domains was not significantly affected, thus also supporting H1b with regard to activity. For engagement, both H1a and H1b were not supported, as no statistically significant difference was found.
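The reported comparison can be reproduced in outline with SciPy's implementation of Welch's t-test; the synthetic per-domain post counts below are invented and only mimic the direction of the reported difference, not the actual data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic per-domain post counts (invented), mimicking 60 uncritical
# domains before and 59 after the start of the invasion.
posts_before = rng.normal(loc=7558, scale=1000, size=60)
posts_after = rng.normal(loc=3555, scale=1000, size=59)

# Welch's t-test: equal_var=False drops the equal-variances assumption.
t_stat, p_value = stats.ttest_ind(posts_before, posts_after, equal_var=False)
```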
Visibility of Russia’s uncritical news on Facebook
To analyse Russia's news ecosystem on Facebook six months into Russia’s full-scale war (RQ2), we filtered the after data set once again, keeping only unique public spaces publishing posts containing URLs with Russia’s news domains more than 10 times. This step restricted the data set exclusively to public spaces with activity beyond a predetermined threshold for the investigated period. The reduced after data set contained 1,385 public spaces, 202,531 unique posts and 71 domains.
Although fewer uncritical domains overall are spread across the platform after the start of Russia’s full-scale invasion, we find that they are still more visibly distributed based on the total number of unique posts containing those domains (157,641) and the average subscriber count per space (484,023) compared to critical news domains (45,090 total posts; 162,707 average subscriber count). However, posts containing critical news domains on average received more total engagement (142) than posts containing uncritical domains (27). See the Appendix, Figure 1 for a visual comparison of these measurements and Figure 3 for the top news domains by publishing activity and engagement.
Amplifier spaces of Russia’s news domains
Finally, we studied specifically the amplifier spaces of Russia’s news domains, defined as public spaces publishing posts containing URLs with Russia’s news domains that are not linked to one of the selected news domains' official Facebook pages (those that almost exclusively [with at most one other source] publish URLs to their own news domain and are regularly verified and/or named the same as the outlet). We excluded 63 official news pages, including several verified and unverified pages of one news outlet.10 Also, we excluded 67 regional and/or topic-focused Facebook pages linked to our sample of selected news outlets, such as Komsomol’skaya pravda Irkutsk. The final amplifier data set included 1,266 unique public spaces (91% of the initial after data set), 67,362 unique posts (33% of the initial after data set) and 70 (out of 71) news domains included in these posts. The descriptives show that the official Facebook pages accounted for the majority of posts (77%) in the after data set.
A detailed content analysis of the amplifier spaces is beyond the scope of this study and will be implemented in follow-up research. However, it is worth mentioning that the amplifier spaces include various types of Facebook groups and pages, for example, the personal pages of famous people, such as Mikhail Khodorkovsky (a former Russian oil tycoon and now exiled Putin opponent), groups of Russian-speaking communities in various countries, local communist groups, groups of Russia sympathisers, and pro-Ukrainian groups. To get a detailed picture of the largest amplifier spaces and how they are connected (RQ3), we first divided the amplifier spaces sharing both uncritical and critical news domains into those sharing uncritical news domains more than 50% of the time and those sharing critical news domains more than 50% of the time. Next, we created a bipartite network connecting public spaces and URLs including the domains. To spatialise the network, we used Gephi’s ForceAtlas 2. Figure 1 shows the network coloured by the types of domains the amplifier spaces shared (uncritical, > 50% uncritical, > 50% critical, critical).

Note. The circles represent the Facebook public spaces; the larger the circle, the more URLs to Russia’s news domains were shared. Spaces that shared exclusively uncritical domains are coloured violet, > 50% uncritical green, > 50% critical orange, and those sharing exclusively critical domains blue. Network filtered for degree ≥ 5.
The figure shows that the majority of the amplifier spaces shared both uncritical and critical news domains in their posts (662 unique public spaces, 42,703 unique posts), followed by those sharing exclusively uncritical news domains (565 public spaces, 23,365 unique posts). The number of spaces sharing exclusively critical domains is the smallest (39 public spaces, 1,294 unique posts). Thus, we may conclude that uncritical news domains were overall more visibly shared across the amplifier spaces on Facebook six months into Russia’s full-scale war. However, we also find that the amplifier spaces sharing exclusively critical news domains attracted the highest average engagement (1.17 average total engagement) and subscriber count (85,790 average subscribers). See Figure 2 in the Appendix for an overview of the descriptives.
In Figure 1, public spaces clustered closely together represent instances where the same domains were frequently shared between the spaces. If a space lies farther away, it has few sharing connections to the other clusters of spaces. Even though the critical and uncritical spaces appear connected through the spaces sharing both domain types, the figure highlights that there are still distinct communities of spaces sharing exclusively critical news domains and of spaces sharing exclusively uncritical ones. The largest circles are found in the centre of the network, highlighting that amplifier spaces sharing both uncritical and critical news domains shared the largest number of news URLs during the first six months of the full-scale war. The 10 largest amplifier spaces by number of unique posts are plotted in Figure 4 of the Appendix.
Conclusion
Negotiating content moderation on Facebook
This study aims to contribute to the research literature on the complexity of platform governance (Gorwa, 2019), platform power (Naughton, 2018) across different national contexts (Ahn et al., 2023), and the governing of content by SNSs (DeVito, 2017; Gillespie, 2018). Specifically, it investigates the role the US-based SNS Facebook plays as an information intermediary of Russia’s Russian-language news content, thus moving beyond the Anglophone context (Abhishek, 2021; Sablosky, 2021). The findings demonstrate how SNSs facilitate authoritarian regimes’ control (in this case, Russia’s) over information flows reaching audiences beyond their borders (Golova, 2020; Kling, 2022). Even after the implementation of several measures to reduce the authoritarian country’s informational influence on the platform, this control persists.
Specifically, we find that the measures introduced by Facebook and state actors (the EU and Russia) since the start of Russia’s full-scale war led to a significant decrease in the average number of posts containing Russia’s uncritical news domains (H1a). However, this impact is best explained by the implementation of a mixture of all highlighted measures rather than solely the EU’s or Facebook’s measures, as we find a discrepancy between what is labelled by Facebook and what is sanctioned by the EU (RQ1). Facebook's definition of "state-controlled" is based on an undisclosed procedure that is implemented globally, as are its other means of internal content moderation (Ahn et al., 2023). However, Russia’s propaganda outlets are owned not only by the state but also by corporations and oligarchs close to Vladimir Putin (Schimpfössl & Yablokov, 2017). These outlets also shape an uncritical view of Russia’s war atrocities by either omitting information or actively propagating pro-Kremlin narratives.
Contrary to Facebook’s official statements about the war (Meta, 2022), our findings suggest that its efforts to demote Russian-language uncritical news content are limited. Based on the number of posts, uncritical news domains were still shared approximately three times more often than critical ones during the first six months of the war (RQ2). Hence, Facebook users could still be exposed to pro-Kremlin content. One reason for this limited involvement might be the lower priority given to the Russian language and Facebook’s lack of engagement with regional experts. According to Facebook whistleblower Frances Haugen (Popli, 2021), the company has historically underinvested in non-English languages, which receive “a tiny fraction” (n.p.) of the security systems that English receives. As for the engagement with regional experts, Facebook claims that it has been “working with third-party fact-checkers in the region” (Meta, 2022, n.p.) without disclosing who these fact-checkers are. Facebook’s statement claimed that the company had expanded “third-party fact-checking capacity in Russian and Ukrainian languages across the region” (Meta, 2022, n.p.). However, the statement does not clarify whether third-party fact-checkers were hired in Russia and Ukraine or outside the region, and whether the Ukrainian fact-checkers were also responsible for fact-checking (media) texts in Russian. As of June 2025, Facebook had stopped its third-party fact-checking programme in the US (Meta, 2025) and had never updated its statement about the ongoing war (Meta, 2022). Another reason might be the unique situation of Facebook losing the Russian market. This situation is incomparable to Facebook's efforts to censor propaganda in the Global South, where some of its largest markets are located (Abhishek, 2021). The company may not feel compelled to decrease Russian-language content, even if it contains propaganda, as the market is no longer profitable.
Consequently, we conclude that at least six months into Russia’s full-scale war, Facebook continued to enable Russia’s informational control on the platform.
Visibility of uncritical news on Facebook
With regards to Russia’s news ecosystem on Facebook, we find that even though fewer posts containing uncritical news domains circulated across the platform after the start of the full-scale war, these domains are, as a whole, still more visibly disseminated than critical ones (RQ2). This is in line with pre-invasion findings on the dominance of uncritical publics in the Russian media landscape, noted by Litvinenko and Toepfl (2019) and by Kling (2022) just one month before Russia’s full-scale invasion. Our finding may also be interpreted as a measure of the "success" of the new legislation in Russia aimed at suppressing critical viewpoints by criminalising dissent regarding the war (McCarthy et al., 2023).
Even though the amplifier spaces publishing exclusively critical and uncritical news domains are divided (RQ3), they also seem connected through the public spaces sharing both domain types. During the first six months of the full-scale war, uncritical news domains were not just published by exclusively “uncritical” amplifier spaces, but also served as a reference to events happening in Russia and the territories of Ukraine occupied by Russia for “critical” spaces. This could show that the Russian state is gradually dominating access to information about its armed forces, as war-critical outlets cannot send their journalists to the Russian frontlines, so they must rely either on data from uncritical media or on open-source data. In other words, while the situation on the Ukrainian frontline is reported not only by Ukrainian media but also by accredited foreign journalists, the majority of information from the Russian frontlines comes from Russia’s uncritical outlets. This could be an explanation for the dominance of amplification spaces publishing both critical and uncritical news domains during the first six months of the war.
Nevertheless, the amplifier spaces sharing mostly (more than 50%) critical news domains and those sharing exclusively critical domains are fewer overall (318 unique spaces) than their counterparts sharing predominantly or exclusively uncritical news domains (949 unique spaces). This contradicts an earlier characterisation of Russia’s Facebook as a “liberal-oppositional echo chamber” (Bodrunova et al., 2021, p. 2932). As we show in Figure 1, critical amplification spaces are either not isolated from uncritical news or are marginal, which is in line with Bruns’ (2021) argument that the “echo chamber” is an “ill-defined metaphor” (p. 34) for increasing polarisation in society and politics. Since the number of posts containing critical news domains, some of which could be aligned with the liberal-oppositional category of 2011 (Bodrunova et al., 2021), did not drop significantly after the start of Russia’s full-scale invasion, war-critical outlets were not affected by the measures following the invasion (H1b). Consequently, there is no reason to doubt that their influence on Facebook has persisted since the start of the full-scale invasion. This is also evidenced by the higher average engagement (0.77 average engagement by amplifier space) and subscriber counts (52,976 average subscribers by amplifier space) of critical and majority-critical amplifier spaces compared to uncritical and majority-uncritical ones (0.31 average engagement by amplifier space, 25,677 average subscribers by amplifier space).
Limitations and future research
This study focused exclusively on the largest public spaces sharing Russia’s top Russian-language news domains on Facebook during a specific timeframe (from 24 August 2021 to 24 August 2022), based on CrowdTangle data. As noted in the Method section, the collected data is limited to Facebook’s public spaces of a specific follower size; the study’s conclusions are, thus, not necessarily applicable to Facebook’s private environments (e.g., closed groups, private profiles and messaging), to smaller public spaces, or to other periods. Classifying amplifier spaces by their thematic and regional focus is beyond the scope of this study and will be addressed in follow-up research.
Additionally, our research focuses on the environments emerging from the sharing of posts containing Russia’s top Russian-language news domains and discusses the visibility of the specific domain types (war-critical/war-uncritical) that circulate across Facebook. As a result, this study does not explicitly address counter-attitudinal sharing, that is, posts that include domains not in a supportive but in a critical way. We can assume that the uncritical domains shared by public spaces in which more than 50% of the shared domains were critical are instances of such counter-attitudinal sharing. However, we argue that by sharing these uncritical news domains, these spaces still contribute to the visibility of uncritical news on Facebook.
Overall, this study provides insights into the role of SNSs as mediators of authoritarian state-controlled information environments during a military conflict, highlighting the intertwined platform governance shared between SNSs and national authorities. Our findings may be partly transferable to other SNSs that have their own interests and content moderation policies (e.g., China’s TikTok and Elon Musk’s X), as they highlight the complexities that arise as a result of policies aimed at reducing an authoritarian state's influence. Consequently, our methodology can be adapted to other languages, particularly in instances where Facebook is no longer able to monetise the national market.
References
Abhishek, A. (2021). Overlooking the political economy in the research on propaganda. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-61
Aguerri, J., Santisteban, M., & Miró-Llinares, F. (2022). The fight against disinformation and its consequences: Measuring the impact of “Russia state-affiliated media” on Twitter. SocArXiv. https://doi.org/10.31235/osf.io/b4qxt
Ahn, S., Baik, J. (Sophia), & Krause, C. S. (2023). Splintering and centralizing platform governance: How Facebook adapted its content moderation practices to the political and legal contexts in the United States, Germany, and South Korea. Information, Communication & Society, 26(14), 2843–2862. https://doi.org/10.1080/1369118X.2022.2113817
Angus, D., Bruns, A., Hurcombe, E., Harrington, S., & Tan, X. Y. (Jane). (2023). Computational communication methods for examining problematic news-sharing practices on Facebook at scale. Social Media + Society, 9(3), 20563051231196880. https://doi.org/10.1177/20563051231196880
Associated Press. (2022, March 22). Report says Facebook fails to detect hate against Rohingya. Voice of America. https://www.voanews.com/a/report-says-facebook-fails-to-detect-hate-against-rohingya/6495791.html
Bastos, M., & Farkas, J. (2019). “Donald Trump is my president!”: The internet research agency propaganda machine. Social Media + Society, 5(3), 2056305119865466. https://doi.org/10.1177/2056305119865466
Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122–139. https://doi.org/10.1177/0267323118760317
Bodrunova, S. S., & Litvinenko, A. A. (2013). New media and political protest: The formation of a public counter-sphere in Russia, 2008–12. In Russia’s changing economic and political regimes. Routledge.
Bodrunova, S. S., Litvinenko, A., & Nigmatullina, K. (2021). Who is the censor? Self-censorship of Russian journalists in professional routines and social networking. Journalism, 22(12), 2919–2937. https://doi.org/10.1177/1464884920941965
Brown, A. (2022, March 11). Russia’s Instagram, Facebook bans will cost Meta nearly $2 Billion in revenue. Forbes. https://www.forbes.com/sites/abrambrown/2022/03/11/instagram-facebook-bans-will-cost-meta-nearly-2-billion-in-revenue/
Bruns, A. (2021). Echo chambers? Filter bubbles? The misleading metaphors that obscure the real problem. In Hate speech and polarization in participatory society (pp. 33–48). Routledge.
Council of the European Union. (2024). EU sanctions against Russia. Consilium. https://www.consilium.europa.eu/en/policies/sanctions-against-russia/
DeVito, M. A. (2017). From editors to algorithms: A values-based approach to understanding story selection in the Facebook news feed. Digital Journalism, 5(6), 753–773. https://doi.org/10.1080/21670811.2016.1178592
DiResta, R., Grossman, S., & Siegel, A. (2022). In-house vs. outsourced trolls: How digital mercenaries shape state influence strategies. Political Communication, 39(2), 222–253. https://doi.org/10.1080/10584609.2021.1994065
Doroshenko, L., & Lukito, J. (2021). Trollfare: Russia’s disinformation campaign during military conflict in Ukraine. International Journal of Communication, 15(0). https://ijoc.org/index.php/ijoc/article/view/16895
European Commission. Directorate-General for Communications Networks, Content and Technology. (2023). Digital Services Act: Application of the risk management framework to Russian disinformation campaigns. Publications Office. https://data.europa.eu/doi/10.2759/764631
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
Glazunova, S., Ryzhova, A., Bruns, A., Montaña-Niño, S. X., Beseler, A., & Dehghan, E. (2023). A platform policy implementation audit of actions against Russia’s state-controlled media. Internet Policy Review, 12(2). https://doi.org/10.14763/2023.2.1711
Golova, T. (2020). Post-Soviet migrants in Germany, transnational public spheres and Russian soft power. Journal of Information Technology & Politics, 17(3), 249–267. https://doi.org/10.1080/19331681.2020.1742265
Golovchenko, Y., Buntain, C., Eady, G., Brown, M. A., & Tucker, J. A. (2020). Cross-platform state propaganda: Russian trolls on Twitter and YouTube during the 2016 U.S. presidential election. The International Journal of Press/Politics, 25(3), 357–389. https://doi.org/10.1177/1940161220912682
Gorwa, R. (2019). What is platform governance? Information, Communication & Society, 22(6), 854–871. https://doi.org/10.1080/1369118X.2019.1573914
Gruzd, A., Abul-Fottouh, D., Song, M. Y., & Saiphoo, A. (2023). From Facebook to YouTube: The potential exposure to COVID-19 anti-vaccine videos on social media. Social Media + Society, 9(1), 20563051221150403. https://doi.org/10.1177/20563051221150403
Human Rights Council. (2018). Report of the independent international fact-finding mission on Myanmar. Office of the High Commissioner for Human Rights. https://ap.ohchr.org/documents/dpage_e.aspx?si=A/HRC/39/64
Kling, J. (2022). Mapping the global audiences of Russia’s domestic news: How social networks function as transmitters of authoritarian news to foreign audiences. International Journal of Communication, 16. https://ijoc.org/index.php/ijoc/article/view/19073
Kuznetsova, E., & Makhortykh, M. (2023). Blame it on the algorithm? Russian government-sponsored media and algorithmic curation of political information on Facebook. International Journal of Communication, 17. https://ijoc.org/index.php/ijoc/article/view/18687
Litvinenko, A., & Toepfl, F. (2019). The “gardening” of an authoritarian public at large: How Russia’s ruling elites transformed the country’s media landscape after the 2011/12 protests “for fair elections”. Publizistik, 64(2), 225–240. https://doi.org/10.1007/s11616-019-00486-2
McCarthy, L. A., Rice, D., & Lokhmutov, A. (2023). Four months of “discrediting the military”: Repressive law in wartime Russia. Demokratizatsiya: The Journal of Post-Soviet Democratization, 31(2), 125–160.
Meta. (2020, June 4). Labeling state-controlled media on Facebook. Meta. https://about.fb.com/news/2020/06/labeling-state-controlled-media/
Meta. (2022, February 26). Meta’s ongoing efforts regarding Russia’s invasion of Ukraine. Meta. https://about.fb.com/news/2022/02/metas-ongoing-efforts-regarding-russias-invasion-of-ukraine/
Meta. (2025, January 7). More speech and fewer mistakes. Meta. https://about.fb.com/news/2025/01/meta-more-speech-fewer-mistakes/
Naughton, J. (2018). Platform power and responsibility in the attention economy. In M. Moore & D. Tambini (Eds.), Digital dominance: The power of Google, Amazon, Facebook and Apple. Oxford University Press.
Nechushtai, E., & Lewis, S. C. (2019). What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior, 90, 298–307. https://doi.org/10.1016/j.chb.2018.07.043
Okholm, C. S., Fard, A. E., & Thij, M. ten. (2024). Blocking the information war? Testing the effectiveness of the EU’s censorship of Russian state propaganda among the fringe communities of Western Europe. Internet Policy Review, 13(3). https://policyreview.info/articles/analysis/blocking-information-war-testing-effectiveness-eu-censorship
Popli, N. (2021). The 5 most important revelations from the “Facebook papers". TIME. https://time.com/6110234/facebook-papers-testimony-explained/
Purnell, N., & Horwitz, J. (2020). Facebook’s hate-speech rules collide with Indian politics. Wall Street Journal. https://www.wsj.com/articles/facebook-hate-speech-india-politics-muslim-hindu-modi-zuckerberg-11597423346
Reuter, O. J., & Szakonyi, D. (2015). Online social media and political awareness in authoritarian regimes. British Journal of Political Science, 45(1), 29–51. https://doi.org/10.1017/S0007123413000203
Reyaz, M. (2020). Cyberspace in the Post-Soviet states: Assessing the role of new media in Central Asia. Jadavpur Journal of International Relations, 24(1), 7–27. https://doi.org/10.1177/0973598419875266
Rodgers, J., & Lanoszka, A. (2023). Russia’s rising military and communication power: From Chechnya to Crimea. Media, War & Conflict, 16(2), 135–152. https://doi.org/10.1177/17506352211027084
Rotaru, V. (2018). Forced attraction?: How Russia is instrumentalizing its soft power sources in the “near abroad”. Problems of Post-Communism, 65(1), 37–48. https://doi.org/10.1080/10758216.2016.1276400
Roth, A. (2017). In new sanctions list, Ukraine targets Russian social-media sites. The Washington Post. https://www.washingtonpost.com/world/in-new-sanctions-list-ukraine-blocksrussian-social-media-sites/2017/05/16/a982ab4e-3a16-11e7-9e48-c4f199710b69_story.html
Saari, S. (2014). Russia’s Post-Orange Revolution strategies to increase its influence in former Soviet Republics: Public diplomacy po russkii. Europe-Asia Studies, 66(1), 50–66. https://doi.org/10.1080/09668136.2013.864109
Sablosky, J. (2021). “Dangerous organizations: Facebook’s content moderation decisions and ethnic visibility in Myanmar”. Media, Culture & Society, 43(6), 1017–1042. https://doi.org/10.1177/0163443720987751
Sanovich, S. (2017). Computational propaganda in Russia: The origins of digital misinformation. In S. Woolley & P. Howard (Eds.), Computational propaganda worldwide (pp. 1–25). https://ora.ox.ac.uk/objects/uuid:555c1e20-60d0-4a20-8837-c68868cc0c96
Schimpfössl, E., & Yablokov, I. (2017). Media elites in Post-Soviet Russia and their strategies for success. Russian Politics, 2(1), 32–53. https://doi.org/10.1163/2451-8921-00201003
Szostek, J., & Orlova, D. (2024). Free speech versus defence of the nation? The media as sources of national insecurity in Ukraine. European Security, 33(1), 82–106. https://doi.org/10.1080/09662839.2023.2231369
Vihalemm, T., Juzefovičs, J., & Leppik, M. (2019). Identity and media-use strategies of the Estonian and Latvian Russian-speaking populations amid political crisis. Europe-Asia Studies, 71(1), 48–70. https://doi.org/10.1080/09668136.2018.1533916
Appendix
Additional information on coding public spaces as (un-) critical
Below is an overview of the detailed criteria based on which the two authors of the article categorised the domains as critical or uncritical towards Russia’s war against Ukraine.
Critical:
Criticism of Russia’s political leadership, its war and war atrocities against Ukraine.
Keywords: Bucha, v Ukraine, war
Example posts with Bucha in post message:

Example 1. Post from critical news outlet Mediazona in which the killing of residents in Bucha is reported. Translation: "They said, let him lie, they did not bury their own": testimonies of one execution in Bucha. THIS MESSAGE (MATERIAL) WAS CREATED AND (OR) DISTRIBUTED BY A FOREIGN MASS MEDIA ACTING AS A FOREIGN AGENT, AND (OR) BY A RUSSIAN LEGAL ENTITY ACTING AS A FOREIGN AGENT. Bucha is a small city in the Kyiv region, which has already become a symbol of war atrocities committed by the Russian army in Ukraine. Hundreds of Bucha residents were killed in a month, some of them were tortured. Publisher of "Mediazona" Petr Verzilov went to the city and recorded the testimonies of Alexander Melnichuk's neighbour, who became one of the victims of the Russian military. Last accessed (3 May 2024): https://www.facebook.com/100064284724047/posts/4938797682824327/

Example 2. Post from critical news outlet TJournal (TJ) which reports on mass killings in Bucha. Translation: Mass murders in the city of Bucha are the main topic of recent days around the world and, apparently, a turning point in the conflict between Russia and Ukraine. On TJ, the main thing is about what happened: how a city with 36 thousand inhabitants ended up at the centre of world events, what the media discovered after the departure of Russian troops and how this threatens Russia (by court and sanctions). The Ministry of Defense of the Russian Federation responds to numerous testimonies of bodies on the streets, including hands tied behind their backs with accusations of "staging". And the IC is already preparing to punish based on photos and videos under the "law of fakes". Also within the recommendation, what else can you read or watch about the events in Bucha. Last accessed (3 May 2024): https://www.facebook.com/100079982975369/posts/5009565955799534/

Example 3. Post from critical news outlet Meduza which reports on killings of civilians in Bucha. Translation: Ukrainian units and journalists entered the city of Bucha in the Kyiv region on 2 April. The evening before, the first videos of the bodies of civilians lying on Yablonskaya Street (former Kirov Street) on the outskirts of the city appeared in Telegram channels. Testimonies from Bucha, which Russian troops left shortly before, shocked the world. Ukrainian President Vladimir Zelensky called what was happening there a genocide, US President Joe Biden – war crimes. Moscow offered several contradictory versions of what happened, none of which admits that the Russian military is responsible for mass murders. “Meduza” collected and analysed all available data on the events in Bucha. Last accessed (3 May 2024): https://www.facebook.com/100064582336117/posts/2273634886124587/
Uncritical:
No criticism of Russia’s political leadership, its war and war atrocities against Ukraine, omitting information on Bucha and/or advocating pro-Kremlin propaganda narratives.
Keywords: Na Ukraine, special military operation, denazification of Ukraine, z propaganda

Example 1. Post from the uncritical news outlet Kommersant’, which reports on the “situation” (instead of the killings of civilians) in Bucha and calls it a fake. Translation: Russia's Permanent Mission to the UN will today, 4 April, hold a press conference where it will present materials on the situation in the Ukrainian city of Bucha to journalists, said Foreign Minister Sergey Lavrov. According to him, Western—especially British—diplomats intend to “melt away” this “fake” about the killings in Bucha in “demagogic reflections”. He also commented on the statement of the US president, who called his Russian colleague Vladimir Putin a "war criminal" in connection with what happened in that city. Last accessed (3 May 2024): https://www.facebook.com/100064810507665/posts/10158302902266857/

Example 2. Post from uncritical news outlet Vedomosti which reports on the “situation” (instead of killings of civilians) in and “materials” (instead of killings of civilians) from Bucha and calls it “staged”. Translation: Russian Foreign Minister Sergey Lavrov considers that the materials that emerged from the city of Bucha in the Kyiv region after the departure of Russian troops are staged, reports TASS. He recalled that Moscow demands from Britain, the president of the UN Security Council, to hold a meeting on the situation in Bucha. Last accessed (3 May 2024): https://www.facebook.com/100064848123542/posts/10159788654643908/

Example 3. Post from uncritical news outlet Telekanal Zvezda which reports that the Russian Ministry of Defence denied the killings of civilians in Bucha. Translation: The Ministry of Defense of the Russian Federation denied Kyiv's accusations of the alleged murder of civilians in the settlement of Bucha: Last accessed (3 May 2024): https://www.facebook.com/100076236910680/posts/5318618408188147/
Additional measurement calculations
Three additional measurements were calculated: the total number of posts by Facebook space (1), the total number of posts by news domain (2), and the average engagement by public space (3). Below are explanations of the calculations.
For the total number of posts by Facebook space (1), we relied on the count of posts containing Russia’s news domains by Facebook space (based on the unique “AccountUrl”). This measurement shows how many posts containing the selected news domains were published by an individual public space and helps us identify the largest URL amplification spaces. For the number of posts by news domain (2), we relied on the count of posts containing a specific domain (based on the unique “ExtendedLinkOriginal”). This measurement shows how widely the news domain was shared across all public spaces on the platform. For the average engagement by public space (3), we first calculated the total engagement value drawing on the definition of Arora et al. (2019):
Reactions = like + angry + care + love + haha + wow + sad + thankful
Total Engagement = reactions + comments + shares
To do so, we relied on “actual” engagement counts as opposed to “expected” ones: as defined by CrowdTangle, “actual” represents “the actual metrics of the post, e.g., likeCount or commentCount”, and “expected” represents what the post's metrics were expected to be given the post's properties, as calculated by CrowdTangle.11 In order to obtain a comparable engagement measurement at the level of individual public spaces (specifically for RQ3), we then calculated the average engagement by public amplifier space, also based on Arora et al. (2019):
Engagement per account = Total Engagement / Number of Posts by public space
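The engagement calculation above can be sketched, for example, with pandas. This is an illustrative sketch: the data and column names are assumptions and do not mirror CrowdTangle's exact field names.

```python
import pandas as pd

# Hypothetical per-post "actual" counts in CrowdTangle style
# (column names illustrative, not the exact API field names).
posts = pd.DataFrame({
    "account_url": ["fb.com/a", "fb.com/a", "fb.com/b"],
    "likes": [10, 2, 5], "angry": [1, 0, 0], "care": [0, 0, 1],
    "love": [3, 1, 0], "haha": [0, 0, 2], "wow": [1, 0, 0],
    "sad": [0, 1, 0], "thankful": [0, 0, 0],
    "comments": [4, 2, 3], "shares": [6, 1, 2],
})

# Reactions = like + angry + care + love + haha + wow + sad + thankful
reaction_cols = ["likes", "angry", "care", "love", "haha", "wow", "sad", "thankful"]
posts["reactions"] = posts[reaction_cols].sum(axis=1)

# Total engagement per post = reactions + comments + shares
posts["total_engagement"] = posts["reactions"] + posts["comments"] + posts["shares"]

# Engagement per account = total engagement / number of posts by public space,
# i.e. the mean total engagement per space.
per_space = posts.groupby("account_url")["total_engagement"].mean()
print(per_space)
```

Averaging per space rather than summing keeps spaces with many low-engagement posts comparable to spaces with few high-engagement ones, which matters for the RQ3 comparison of amplifier spaces.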
Additional tables and figures

Comparison of the number of unique posts and public spaces, average engagement and account subscriber count by account criticism group (critical, majority critical, majority uncritical, uncritical) after the start of Russia’s full-scale invasion



Overview of sampled Russia’s top news domains by content moderation measures and level of criticism (critical/ uncritical)
| News domain (N = 72) | EU sanctions (until August 2022, n = 4) | “Russia state-sponsored” label on Facebook (as of October 2023, n = 15) | Labelled by Facebook as “extremist organisation” (since March 2024, n = 64) |
|---|---|---|---|
| osnmedia.ru | X | ||
| solenka.info | X | ||
| sm.news | X | ||
| novostivl.ru | X | ||
| ng.ru | X | ||
| dni.ru | X | ||
| eadaily.com | X | ||
| argumenti.ru | X | ||
| topcor.ru | X | ||
| profile.ru | X | ||
| radiokp.ru | X | ||
| zavtra.ru | X | ||
| ukraina.ru | X | X | |
| regnum.ru | X | ||
| og.ru* | X | ||
| otr-online.ru | X | ||
| bfm.ru | X | ||
| vz.ru | X | ||
| ridus.ru | X | ||
| mk.ru | X | ||
| interfax.ru | X | ||
| pnp.ru | X | X | |
| ytro.news | X | ||
| business-gazeta.ru | X | ||
| topwar.ru | X | ||
| news.ru | X | ||
| ura.news | X | ||
| svpressa.ru | X | ||
| inosmi.ru | X | X | |
| rambler.ru | X | X | |
| pravda.ru | X | ||
| aif.ru | X | ||
| gazeta.ru | X | X | |
| tvzvezda.ru | X | X | |
| ria.ru | X | X | |
| 5-tv.ru | X | ||
| 360tv.ru | X | ||
| rg.ru | X | X | |
| tass.ru | X | X | |
| rbc.ru | X | ||
| kommersant.ru | X | ||
| readovka.news | X | ||
| iz.ru | X | ||
| life.ru | X | ||
| kp.ru | X | ||
| vedomosti.ru | X | ||
| tjournal.ru* | X | ||
| lenta.ru | X | X | |
| holod.media* | |||
| theins.ru* | |||
| zona.media* | X | ||
| the-village.ru* | X | ||
| meduza.io* | |||
| tayga.info* | X | ||
| bbc.com/russian* | |||
| dw.com/ru* | |||
| svoboda.org* | |||
| currenttime.tv* | |||
| forbes.ru | X | ||
| riafan.ru** | X | ||
| nation-news.ru** | X | ||
| russian.rt.com | X | X | X |
| radiosputnik.ria.ru | X | X | X |
| radiosputnik.ru | X | X | |
| 1tv.ru | X | X | |
| vesti.ru | X | X | X |
| smotrim.ru | X | X | |
| tvc.ru | X | X | |
| ntv.ru | X | ||
| ren.tv | X | ||
| tsargrad.tv | X | ||
| mosregtoday.ru | X |
Note. News domains categorised as critical are marked with * (n = 12). News domains that had already been banned by Facebook before the start of Russia’s full-scale invasion of Ukraine are marked with **.
For the news outlet Sputnik, we considered radiosputnik.ria.ru and radiosputnik.ru as separate domains, both of which should be EU-sanctioned. Interestingly, while radiosputnik.ria.ru was found to be labelled on Facebook and was not accessible in Germany as of October 2023, radiosputnik.ru was not labelled and was still accessible in Germany at that time.
Footnotes
1. Throughout 2022, other Russian-language domestic news outlets were also blocked (in December 2022, 1tv.ru, ntv.ru and ren.tv were sanctioned by the EU). However, these blockages were not considered in this study as they occurred after August 2022.
2. By platforms, we understand the “online, data-driven apps and services (e.g., Facebook Messenger, Google Search, YouTube)” (Gorwa, 2019, p. 856) of the companies that deploy such services (e.g., Meta and Alphabet).
3. By social networking sites, we understand exclusively the social media platforms that are built on social networks (e.g., Facebook, Instagram, Russia’s VK).
4. According to: https://www.statista.com/statistics/268136/top-15-countries-based-on-number-of-facebook-users/
5. Despite exhibiting linguistic, cultural, and political diversity, some of the countries that emerged following the dissolution of the Soviet Union continued to be influenced politically, economically, or culturally by Russia. For example, Russia has heavily invested in the reach of its media in the former Soviet countries (Rotaru, 2018; Saari, 2014). Also, several state-aligned news outlets, such as Pervyy Kanal, NTV, Komsomol’skaya Pravda, and Argumenty i Fakty, have local versions in some former Soviet countries (for instance, in Belarus, Kazakhstan, Kyrgyzstan, and Moldova).
6. This sample also includes the Russian-language versions of Western foreign news domains, such as svoboda.org, dw.com/ru, bbc.com/russian, currenttime.tv, as well as Russian foreign news domains rt.com/russian, radiosputnik.ria.ru, and radiosputnik.ru. We included these foreign news domains, as they address a Russian-speaking audience both inside and outside Russia.
7. Code for the “Crowd” library: https://github.com/qut-dmrc/Crowd
8. The full list of parameters is available on the CrowdTangle Wiki https://github.com/CrowdTangle/API/wiki/Search
9. The sharing was conducted in Germany. Please be aware that the labelling/“flagging” of posts may vary depending on the geolocation, as observed by Glazunova et al. (2023). However, this aspect does not affect the study results, as the labelling of domains specifically is global (Meta, 2020).
10. See, e.g., TASS: https://www.facebook.com/144698628932572, https://www.facebook.com/221338351211505
11. For details, see Crowdtangle Wiki: https://github.com/CrowdTangle/API/wiki/Post