Gaming the system is often portrayed as a threat to platforms and services, but this alleged threat can also be turned into a commercial service, raising ethical questions about exploitative digital environments, equality, and the erosion of democratic values.
Research Articles
Content moderation encompasses a great diversity of actors who develop specific practices. Their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains understudied. This special issue is an attempt to remedy this.
This article explores how governments, shifting away from an anti-interventionist stance and from views of regulatory impossibility, are embracing an infrastructural turn through strategies that include hijacking and localising the "points of control" of the internet.
News and Opinion Pieces
Employing a scenario-based method, the authors of this op-ed find that generative AI is not a solution, but a symptom of an overburdened and crisis-ridden academic system. The choice ahead, they argue, is clear.
This op-ed argues that tech regulations need to focus on platform design, not just content moderation, advocating for prosocial tech design that promotes healthier online spaces and strengthens societal cohesion.
Drawing upon the work of Ranking Digital Rights, this op-ed explores a civil society perspective on the relationship between corporate governance mechanisms and content moderation on digital platforms.
This study examines experts' role within the EU's Code of Practice on Disinformation, highlighting challenges in co-regulatory processes and platform governance.
This article examines how the design of regulatory intermediaries in the EU’s DSA and Brazil’s proposed platform regulation bill impacts content moderation, and what this means for decentralising platform regulation.
This article compares the Stop Hate for Profit campaign and the Global Alliance for Responsible Media to evaluate efforts that leverage advertisers’ financial power to challenge platform content moderation.
Despite their active and growing involvement in monitoring the implementation of the “Code of Conduct on countering illegal hate speech online”, civil society organisations have so far been unable to translate this expanded role into enhanced influence at the policy-making level.
This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes.
Trusted flaggers under the DSA have sparked public debate; this article explains how we can safeguard freedom of expression while enabling trusted flaggers to effectively target illegal content.
This article explores how civil society can contribute to constitutionalising social media global content governance by bridging international human rights law with platform policies.
Although commercial social media platforms provide few formal channels for participation in platform governance, creators aspire to influence decisions and policies through expressive forms of civic engagement that ultimately legitimate platforms as arbiters of public discourse.