The potential of recommendation algorithms to amplify extremist content has become a policy concern in recent years. We conduct a novel empirical experiment on three platforms (YouTube, Reddit, and Gab) to test this phenomenon. We find that YouTube’s “Recommended for you” system does promote extreme content. We situate the findings within the policy debate and argue that co-regulation may provide some solutions.
Concepts such as the ‘filter bubble’ enjoy considerable popularity in scholarly as well as mainstream debates, but are rarely defined with any rigour. This has led to highly contradictory research findings. This article provides a critical review of the ‘filter bubble’ idea and concludes that its persistence has served only to distract scholarly attention from far more pressing areas of enquiry.
Stefania Milan and Claudio Agosti present the Algorithms Exposed (ALEX) project as well as the browser extension fbtrex.
Personalised news websites can have serious implications for democracy, yet little is known about the extent of such personalisation or its effects.