News and research articles on the filter bubble

Recommender systems and the amplification of extremist content

Joe Whittaker, Swansea University
Seán Looney, Swansea University
Alastair Reed, Swansea University
Fabio Votta, University of Amsterdam
PUBLISHED ON: 30 Jun 2021 DOI: 10.14763/2021.2.1565

The potential for recommendation algorithms to amplify extremist content has become a policy concern in recent years. We conduct a novel empirical experiment on three platforms (YouTube, Reddit, and Gab) to test this phenomenon. We find that YouTube’s “Recommended for you” system does promote extreme content. We situate these findings within the policy debate and argue that co-regulation may provide some solutions.

Filter bubble

Axel Bruns, Queensland University of Technology
PUBLISHED ON: 29 Nov 2019 DOI: 10.14763/2019.4.1426

Concepts such as the ‘filter bubble’ enjoy considerable popularity in scholarly as well as mainstream debates, but are rarely defined with any rigour, which has led to highly contradictory research findings. This article provides a critical review of the ‘filter bubble’ idea and concludes that its persistence has served only to distract scholarly attention from far more pressing areas of enquiry.

Should we worry about filter bubbles?

Frederik J. Zuiderveen Borgesius, University of Amsterdam
Damian Trilling, University of Amsterdam
Judith Möller, University of Amsterdam
Balázs Bodó, University of Amsterdam
Claes de Vreese, University of Amsterdam
Natali Helberger, University of Amsterdam
PUBLISHED ON: 31 Mar 2016 DOI: 10.14763/2016.1.401

Personalised news websites can have serious implications for democracy, but little is known about the extent and effects of personalisation.