Transparency and content moderation are becoming increasingly interconnected within legislation. It is time for tech companies to recognise this in the context of borderline terrorist and violent extremist content moderation.
Research Articles
The EU’s 2022 Digital Services Act mandates data access for researchers to study platform risks, but delays and diverging opinions among authorities hold back the DSA’s practical implementation.
Content moderation encompasses a great diversity of actors who develop specific practices. Their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains little studied. This special issue seeks to remedy that gap.
News and Opinion Pieces
European AI regulations often overlook the experiences of marginalised communities disproportionately impacted by algorithmic biases. This op-ed explores how AI-driven tools exacerbate discrimination against immigrants and minority groups, calling for more inclusive policy frameworks.
Employing a scenario-based method, the authors of this op-ed find that generative AI is not a solution, but a symptom of an overburdened and crisis-ridden academic system. The choice ahead, they argue, is clear.
Drawing upon the work of Ranking Digital Rights, this op-ed explores a civil society perspective on the relationship between corporate governance mechanisms and content moderation on digital platforms.
The bankrupt US company Celsius pursues aggressive clawbacks against European customers—can EU consumer laws defeat this cross-border legal attack?
This article explores how governments, shifting away from an anti-interventionist stance and views of regulatory impossibility, are embracing an infrastructural turn through strategies that include hijacking and localising the "points of control" of the internet.
Gaming the system is often portrayed as a threat to platforms and services, but this alleged threat can also be turned into a commercial service, raising ethical questions about exploitative digital environments, equality, and the erosion of democratic values.
This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes.
Despite their active and growing involvement in monitoring the implementation of the “Code of Conduct on countering illegal hate speech online”, civil society organisations have been barred from translating this expanded role into enhanced influence at the policy-making level.
This article compares the Stop Hate for Profit campaign and the Global Alliance for Responsible Media to evaluate efforts that leverage advertisers’ financial power to challenge platform content moderation.
Decentralising platform regulation: How does the design of regulatory intermediaries in the EU’s DSA and Brazil’s proposed platform regulation bill impact content moderation?
This study examines experts' role within the EU's Code of Practice on Disinformation, highlighting challenges in co-regulatory processes and platform governance.