To understand the sociodemographic origins of the content moderation norms enacted by internet users, it is essential to view moderation as a socially determined and spatialised practice.
Research articles on governance
Content moderation encompasses a great diversity of actors who develop specific practices. Their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains little studied. This special issue attempts to remedy this gap.
Ideological differences, financial precarity, and tensions within the milieu of digital rights civil society organisations involved in platform governance advocacy can undermine these organisations’ ability to advocate for reform at all, let alone engage in a radical redefinition of the terms under which (platform) governance takes place.
Trusted flaggers under the DSA have sparked public debate; this article explains how we can safeguard freedom of expression while enabling trusted flaggers to effectively target illegal content.
This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes.
Despite their active and growing involvement in monitoring the implementation of the “Code of Conduct on countering illegal hate speech online”, civil society organisations have been prevented from translating this expanded role into enhanced influence at the policy-making level.
Although commercial social media platforms provide few formal channels for participation in platform governance, creators aspire to influence decisions and policies through expressive forms of civic engagement that ultimately legitimate platforms as arbiters of public discourse.
Decentralising platform regulation: How does the design of regulatory intermediaries in the EU’s DSA and Brazil’s proposed platform regulation bill impact content moderation?
This study examines experts' role within the EU's Code of Practice on Disinformation, highlighting challenges in co-regulatory processes and platform governance.
This paper analyses how the platform policies and interfaces of TikTok, YouTube, Snap, and Instagram shape influencers’ commercial content and their legal duty to disclose such content under European consumer law.