Newly released data reveal how a no-compromise approach to AI-generated and other fictional sexual content depicting children has diverted resources away from prosecuting real child sexual abuse material (CSAM).
This paper explores how the Digital Services Act’s Transparency Database enables platform observability, revealing critical insights into content moderation practices across the EU.