Ideological differences, financial precarity, and internal tensions within the milieu of digital rights civil society organisations engaged in platform governance advocacy can undermine these organisations' ability to advocate for reform at all, let alone to pursue a radical redefinition of the terms under which (platform) governance takes place.
Research Articles
Content moderation involves a great diversity of actors, each developing specific practices. Their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains understudied. This special issue attempts to remedy this.
This article compares the Stop Hate for Profit campaign and the Global Alliance for Responsible Media to evaluate efforts that leverage advertisers’ financial power to challenge platform content moderation.
News and Opinion Pieces
This op-ed argues that tech regulation should focus on platform design, not just content moderation, and advocates prosocial design that promotes healthier online spaces and strengthens societal cohesion.
Drawing upon the work of Ranking Digital Rights, this op-ed explores a civil society perspective on the relationship between corporate governance mechanisms and content moderation on digital platforms.
In this op-ed, the author argues that the AI Act overlooks the challenges posed by the use of generative AI in the literary industry, and calls for European legislation that takes into account the specific conditions and cultural value of original literary production.
This study examines experts' role within the EU's Code of Practice on Disinformation, highlighting challenges in co-regulatory processes and platform governance.
Despite their active and growing involvement in monitoring the implementation of the “Code of Conduct on countering illegal hate speech online”, civil society organisations have been barred from translating this expanded role into enhanced influence at the policy-making level.
This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes.
Although commercial social media platforms provide few formal channels for participation in platform governance, creators aspire to influence decisions and policies through expressive forms of civic engagement that ultimately legitimate platforms as arbiters of public discourse.
In order to understand the sociodemographic origins of content moderation norms enacted by internet users, it is essential to view moderation as a socially determined and spatialised practice.
Trusted flaggers under the DSA have sparked public debate; this article explains how to safeguard freedom of expression while enabling trusted flaggers to target illegal content effectively.
This article explores how civil society can contribute to constitutionalising social media global content governance by bridging international human rights law with platform policies.