Research Articles

Transparency and content moderation are becoming increasingly interconnected within legislation. It is time for tech companies to recognise this in the context of borderline terrorist and violent extremist content moderation.

Regulating pressing systemic risks – but not too soon?

Defne Halil, Maastricht University
Konrad Kollnig, Maastricht University
Aurelia Tamò-Larrieux, University of Lausanne
PUBLISHED ON: 25 Jun 2025 DOI: 10.14763/2025.2.2010

The EU’s 2022 Digital Services Act mandates data access for researchers to study platform risks, but delays and diverging opinions among authorities are holding back the DSA’s practical implementation.

Introduction to the special issue on content moderation on digital platforms

Romain Badouard, Paris-Panthéon-Assas University
Anne Bellon, University of Technology of Compiègne
PUBLISHED ON: 31 Mar 2025 DOI: 10.14763/2025.1.2005

Content moderation encompasses a great diversity of actors who develop specific practices. Their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains little studied. This special issue seeks to remedy that.

News and Opinion Pieces

Infrastructural power: State strategies for internet control

Juan Ortiz Freuler, University of Southern California
PUBLISHED ON: 20 May 2025 DOI: 10.14763/2025.2.2009

This article explores how governments, shifting away from an anti-interventionist stance and from views of regulatory impossibility, are embracing an infrastructural turn through strategies that include hijacking and localising the "points of control" of the internet.

From threat to opportunity: Gaming the algorithmic system as a service

Marijn Sax, University of Amsterdam
Hao Wang, Wageningen University & Research
PUBLISHED ON: 6 May 2025 DOI: 10.14763/2025.2.2007

Gaming the system is often portrayed as a threat to platforms and services, but this alleged threat can also be turned into a commercial service, which raises ethical questions about exploitative digital environments, equality, and the erosion of democratic values.

Safer spaces by design? Federated socio-technical architectures in content moderation

Ksenia Ermoshina, National Centre for Scientific Research (CNRS)
Francesca Musiani, National Centre for Scientific Research (CNRS)
PUBLISHED ON: 31 Mar 2025 DOI: 10.14763/2025.1.1827

This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes.

Despite their active and growing involvement in monitoring the implementation of the “Code of Conduct on countering illegal hate speech online”, civil society organisations have been barred from translating this expanded role into greater influence at the policy-making level.

This article compares the Stop Hate for Profit campaign and the Global Alliance for Responsible Media to evaluate efforts that leverage advertisers’ financial power to challenge platform content moderation.

Framing the role of experts in platform governance: Negotiating the code of practice on disinformation as a case study

Kateryna Chystoforova, European University Institute
Urbano Reviglio, European University Institute
PUBLISHED ON: 31 Mar 2025 DOI: 10.14763/2025.1.1823

This study examines experts' role within the EU's Code of Practice on Disinformation, highlighting challenges in co-regulatory processes and platform governance.