Transparency and content moderation are becoming increasingly interconnected in legislation. It is time for tech companies to recognise this in their moderation of borderline terrorist and violent extremist content.
This systematic review examined 23 studies to assess whether the YouTube recommender system facilitates pathways to problematic content.
The potential for recommendation algorithms to amplify extremist content has become a policy concern in recent years. We conduct a novel empirical experiment on three platforms (YouTube, Reddit, and Gab) to test this phenomenon. We find that YouTube’s “Recommended for you” system does promote extreme content. We situate these findings within the policy debate and argue that co-regulation may provide some solutions.