Regulation of news recommenders in the Digital Services Act: empowering David against the Very Large Online Goliath

Natali Helberger, Institute for Information Law (IViR), University of Amsterdam, Amsterdam, Netherlands
Max van Drunen, Institute for Information Law (IViR), University of Amsterdam, Netherlands, M.Z.vanDrunen@uva.nl
Sanne Vrijenhoek, Faculty of Law, University of Amsterdam, Netherlands
Judith Möller, Department of Communication Science, University of Amsterdam, Amsterdam, Netherlands

PUBLISHED ON: 26 Feb 2021

Nowadays it is difficult to imagine the online world without recommendation algorithms. They filter and classify the growing abundance of information, prioritising content according to predefined ranking criteria. The result of that process is a recommendation to users on what content best matches their interests or personal profile. We encounter recommendation algorithms on a daily basis. They suggest products and services on e-commerce websites such as Amazon, help you find the love of your life on a dating platform, help you discover music and films on services such as Spotify or Netflix, give personalised recommendations on news websites, and match you with content on social media platforms such as YouTube and Facebook. As such, recommender algorithms are the engines behind the internet’s knowledge infrastructure.

Because of their importance for the way users find and access information online, recommender algorithms are a source of great power in the algorithmic society—a power that comes with responsibilities, and systemic risks that worry policymakers and academics alike, particularly if these algorithms are in the hands of a few internet giants. Concerns range from the potential polarising effect and the ability to create filter bubbles of selective exposure to information (Möller et al., 2018; Pariser, 2011; Zuiderveen Borgesius et al., 2016) to the potential abuse of the enormous commercial and political gatekeeper powers that control over recommender algorithms entails (Helberger, 2020; Napoli, 2019), or the potentially invasive or even manipulative effect that data-driven recommendations can have on users’ privacy, autonomy and informational self-determination (Eskens, 2020). As such, it is to be welcomed that the European Commission, in its proposal for a Digital Services Act (DSA), has devoted considerable attention to news recommenders and even included a specialised provision for them: Art. 29 of the DSA specifically addresses the recommender systems [1] used by what are referred to as ‘Very Large Online Platforms’ (VLOPs) [2] and focuses on the information and options users have to influence recommendations by online giants such as Facebook and Google. The following commentary offers a number of critical reflections on the goals behind Art. 29 DSA and the likelihood that it will realise those goals, but also on what a more ambitious vision of news recommender regulation could have looked like. We conclude with a number of concrete suggestions for more effectively addressing not only the risks but also the opportunities that news recommenders present.

The intended goal of the provision

Art. 29 of the draft DSA should be read in the context of recital 62, which explains that the provision is a response to the significant impact that recommenders can have on the behaviour of individuals and their ability to retrieve and interact with information. Finding ways of dealing not only with the economic power of platforms but also with their power over the way people form opinions is a challenge that has occupied academics and policymakers for the past few years (Moore & Tambini, 2018). What, then, is the solution the Commission suggests to tame the internet Goliaths of this world?

The solution the Commission suggests is to empower users—we call them David—with information. Art. 29 of the draft DSA requires Very Large Online Platforms to explain in their terms and conditions both what the main parameters of their recommender system are and the options for users to modify or influence those parameters, including the possibility, where available, to choose an option not based on profiling (Art. 29 (1) DSA). Where platforms do allow users to choose, they must provide an “easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them” (Art. 29 (2) DSA).

As such, the planned Art. 29 is a rather fundamental step beyond the present focus of the General Data Protection Regulation (GDPR) on users’ ability to exercise control over their data. The DSA seeks to extend users’ control over the recommendation metrics that the algorithm is optimised for. One may wonder, as the European Data Protection Supervisor (EDPS) does, whether informing consumers by adding a couple of lines to the terms and conditions is the most effective way of telling David (who is notoriously unwilling to read terms and conditions) how to stand up to Goliath (EDPS, 2021). Instead, the EDPS’s suggestion to offer that information on a prominent part of the website does indeed seem to be the better way forward.

Information as a means of exercising power and influence is meaningless without choice, and from that perspective it is to be welcomed that the provision speaks of David’s ability to choose between different metrics—provided platforms are willing to offer that choice to consumers. What the draft Art. 29 of the DSA does not do is oblige platforms to offer users the possibility to choose between, modify or influence parameters, including the ability to choose an option not based on profiling. One may wonder exactly what incentives VLOPs would have to offer users the ability to choose between metrics, particularly if one of these options is not to be based on profiling, which is the core element of the business model of many of these platforms. The lack of incentives is especially problematic because Art. 29 only applies to very large online platforms, which are also the least sensitive to mere calls from users for more choice. To conclude, David is on his own here. But should one of the platforms decide to provide him with a slingshot, he will be the first to know!

Real and fake choices

As enticing as the idea of democratising algorithmic recommendations is, it is questionable whether offering users some choice to modify or influence a recommender’s parameters is really likely to make the difference between digital hegemony and a society that takes back control. A recommender engine bases its decisions on an extremely large set of data, including not only everything that the user has done and liked in the past but also all the clicks made by the user’s peers and other users of the platform (collaborative filtering) and a whole range of metadata (content-based filtering) (Karimi, Jannach, & Jugovac, 2018). The few parameters that users will be able to influence are only a selection from an enormous pool of parameters that shape the ultimate recommendations, including many that are far beyond users’ control (other parameters include, for example, general popularity, paid content and date of publication; see Covington et al., 2016; DeVito, 2017). [3] In most state-of-the-art recommender systems, which are usually some form of machine learning model, it is not even entirely clear what the ‘main parameters’ in the model are and what their effects are. Rather than empowering users, the DSA is far more likely to create a false sense of transparency. Or, to stick with our metaphor: empowering David through Art. 29 of the DSA is a bit like handing him the slingshot and the stone but forgetting to mention that the slingshot is unfortunately lacking the sling.
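
To make this concrete, consider a minimal, hypothetical sketch of how a platform recommender blends many signals into a single ranking score. All names, signals and weights below are our own illustration, not any platform’s actual model; the point is that a single user-adjustable ‘main parameter’ touches only a fraction of what determines the final ranking.

```python
# Hypothetical sketch: a ranking score as a weighted blend of many signals.
# Signal names and weights are illustrative assumptions, not a real system.
from dataclasses import dataclass

@dataclass
class Item:
    collaborative_score: float  # derived from peers' clicks (collaborative filtering)
    content_match: float        # metadata similarity to the user profile (content-based)
    popularity: float           # platform-wide engagement
    recency: float              # freshness of the item
    sponsored: float            # paid placement boost

def rank_score(item: Item, personalisation_weight: float = 1.0) -> float:
    """Combine signals into one score.

    `personalisation_weight` stands in for the one 'main parameter' a user
    might be allowed to modify under Art. 29; the remaining weights are set
    by the platform and stay entirely beyond the user's control.
    """
    personal = personalisation_weight * (
        0.6 * item.collaborative_score + 0.4 * item.content_match
    )
    non_personal = 0.3 * item.popularity + 0.2 * item.recency + 0.5 * item.sponsored
    return personal + non_personal

# Even with personalisation switched off entirely, the platform-controlled
# signals still decide the ranking:
item = Item(0.8, 0.7, 0.9, 0.5, 1.0)
print(rank_score(item, personalisation_weight=0.0))
```

In this toy setup, turning the user-facing dial to zero changes the score but leaves the platform’s own popularity, recency and sponsorship weights fully in charge—which is precisely the sling that is missing from David’s slingshot.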

Let’s talk about personalisation

The draft Art. 29 of the DSA emphasises the importance of options, but it is surprisingly vague regarding the question of what those options actually should be, or how they could align with public values and fundamental rights. The only guidance the provision offers is that those options should relate to a user’s ‘preferred option’ … that determines the relative order of information presented (Art. 29 (2)) and that one of those options could be not based on profiling (Art. 29 (1)). This formulation of Art. 29 is problematic for at least three reasons.

First, the provision fails to engage with the way users can be enabled to realise public values on platforms. Simply offering users the possibility of choosing their preferred option for ranking content does not address the underlying polarisation and power issues in Very Large Online Platforms’ recommenders. At worst, the provision reinforces precisely the kind of filter bubbles that have given rise to concerns about recommenders in the first place, by optimising for personal relevance. This optimisation for personal relevance and satisfaction is precisely what a growing body of research into recommendation systems decries as overly simplistic (Bernstein et al., 2020; Nechushtai & Lewis, 2019). In a more optimistic scenario, users’ preferences do align with normative goals such as diversity in recommenders (Bodó et al., 2019; Harambam et al., 2019). Effectively involving users in platform governance, however, requires a nuanced insight into how their personal goals relate to public values, and into how control mechanisms can be designed in such a way that they will be used productively. The DSA avoids these issues by leaving it up to platforms to decide what options, if any, users are given.

Secondly, Art. 29 (1) of the DSA chimes with the growing level of criticism regarding data-driven profiling. The EDPS, in its opinion on the DSA, is even more explicit, suggesting an outright ban on certain forms of profiling (EDPS, 2021). But is profiling always bad? It is true that profiling can be a basis for a range of digital malpractices online, but profiling users and offering personalised services can also be a way to help users find their way through the digital abundance of online information and, as such, is considered by users to be a potentially very useful tool. Studies have even demonstrated that, under certain conditions, users appreciate algorithmic recommendations as much as, or even more than, recommendations from, for example, news editors (Thurman et al., 2018). Also, recommenders come in very different shapes and sizes, and on very different kinds of platforms. Profiling music or movie lovers and issuing personalised recommendations for all the songs they might also like could have very different implications for society than profiling news users.

The point is this: recommender systems can fulfil a crucial role in democratic society, and can not only endanger but also contribute to the realisation of fundamental rights and public values; this nuance, however, has been completely omitted from the DSA. Also symptomatic is the framing of Art. 26 of the DSA (risk assessment), which urges platforms to assess only the risks to, and potential negative effects on, fundamental rights and public values. Instead, the DSA should urge platforms to also consider how their recommendation algorithms could create opportunities for making platforms better, more democratic and fairer places.

This leads us to our third point: Art. 29 of the DSA is characterised by a complete lack of vision and regard for broader public and societal values. It does nothing to create incentives for platforms (and other services using recommendations) to invest in developing richer parameters that optimise for the realisation of public values and medium-term goals. Until now, the most frequently used Key Performance Indicators (KPIs) have assessed short-term user engagement, such as clicks or time spent on a page. These KPIs are often inspired in turn by technological and business demands rather than the societal and democratic mission of the media. News recommender systems, however, should also be oriented towards public values such as media diversity, inclusivity and fostering tolerance. Nor does Art. 29 do anything to encourage alternative providers of recommendation algorithms and logics (e.g. news media) to challenge the dominant recommendation algorithms on platforms and offer users a real choice. Why not include an obligation for platforms to enable users to choose between different recommendation algorithms, including some from third parties? That would be a true step towards curbing the central communication power of Very Large Online Goliaths.
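
What optimising beyond short-term engagement could look like is not hard to sketch. The following hypothetical example shows a ranking objective that blends a conventional engagement KPI with a simple public-value KPI (source diversity). The function names and weighting scheme are our own assumptions, intended only to illustrate that such richer objectives are technically straightforward.

```python
# Hypothetical sketch: a ranking objective that rewards diversity as well as
# clicks. Names and weights are illustrative assumptions, not a real system.
from typing import List

def source_diversity(sources: List[str]) -> float:
    """Share of distinct outlets in a recommendation list (0..1)."""
    return len(set(sources)) / len(sources) if sources else 0.0

def objective(predicted_clicks: List[float], sources: List[str],
              diversity_weight: float = 0.5) -> float:
    """Blend a short-term engagement KPI with a public-value KPI.

    With diversity_weight = 0, this collapses to the click-optimising status
    quo; raising it rewards lists that span more outlets.
    """
    engagement = sum(predicted_clicks) / len(predicted_clicks)
    return (1 - diversity_weight) * engagement + diversity_weight * source_diversity(sources)

# Two candidate lists with equal predicted engagement: the diverse one scores higher.
print(objective([0.9, 0.9, 0.9], ["OutletA", "OutletA", "OutletA"]))  # 0.62
print(objective([0.9, 0.9, 0.9], ["OutletA", "OutletB", "OutletC"]))  # 0.95
```

The design question a more ambitious Art. 29 could have posed is who sets the equivalent of `diversity_weight`: the platform alone, the user, or alternative (third-party) providers of recommendation logics competing for the user’s choice.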

Conclusions

The DSA takes a fundamental step by moving beyond empowering users to control their data to enabling them to control the recommendation logic. Unfortunately, the proposed Art. 29 of the DSA lacks any vision of how empowering users to exercise control over the recommendation logic can contribute to realising public values or mitigate the potential risks of recommenders to fundamental rights and the public sphere. Recommender systems are essentially framed as threats that must have an off switch, and the provision follows the long-standing and long-criticised tradition of empowering users through terms and conditions, while relying on platforms to voluntarily offer users the means to translate information into concrete choices.

This is a missed opportunity. A more effective and more forward-looking version of Art. 29 of the DSA would create incentives for platforms to build recommenders that not only optimise for short-term clicks and immediate user satisfaction but also contribute in the longer term to the realisation of public values such as media diversity. A more effective version of Art. 29 would equip users with more than just some information hidden away in the terms of use, and would oblige platforms to provide users with a real choice.

References

Bernstein, A., de Vreese, C., Helberger, N., Schulz, W., & Zweig, K. A. (2020). Diversity, Fairness, and Data-Driven Personalization in (News) Recommender System. Schloss Dagstuhl – Leibniz-Zentrum für Informatik. https://drops.dagstuhl.de/opus/volltexte/2020/11986/

Bodó, B., Helberger, N., Eskens, S., & Möller, J. (2019). Interested in Diversity. Digital Journalism, 7(2), 206–229. https://doi.org/10.1080/21670811.2018.1521292

Covington, P., Adams, J., & Sargin, E. (2016). Deep Neural Networks for YouTube Recommendations. Proceedings of the 10th ACM Conference on Recommender Systems (RecSys ’16), 191–198. https://doi.org/10.1145/2959100.2959190

DeVito, M. A. (2017). From Editors to Algorithms: A Values-Based Approach to Understanding Story Selection in the Facebook News Feed. Digital Journalism, 5(6), 753–773. https://doi.org/10.1080/21670811.2016.1178592

EDPS. (2021). Opinion 1/2021 on the Proposal for a Digital Services Act. Brussels, 10 February 2021.

Eskens, S. (2020). The personal information sphere: An integral approach to privacy and related information and communication rights. Journal of the Association for Information Science and Technology, 71(9), 1116–1128. https://doi.org/10.1002/asi.24354

Harambam, J., Bountouridis, D., Makhortykh, M., & van Hoboken, J. (2019). Designing for the better by taking users into account: A qualitative evaluation of user control mechanisms in (news) recommender systems. Proceedings of the 13th ACM Conference on Recommender Systems - RecSys ’19, 69–77. https://doi.org/10.1145/3298689.3347014

Helberger, N. (2020). The Political Power of Platforms: How Current Attempts to Regulate Misinformation Amplify Opinion Power. Digital Journalism, 1–13. https://doi.org/10.1080/21670811.2020.1773888

Karimi, M., Jannach, D., & Jugovac, M. (2018). News recommender systems – Survey and roads ahead. Information Processing & Management, 54(6), 1203–1227. https://doi.org/10.1016/j.ipm.2018.04.008

Möller, J., Trilling, D., Helberger, N., & Es, B. van. (2018). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication & Society, 21(7), 959–977. https://doi.org/10.1080/1369118X.2018.1444076

Moore, M., & Tambini, D. (Eds.). (2018). Digital Dominance: The Power of Google, Amazon, Facebook, and Apple. Oxford University Press.

Napoli, P. M. (2019). Social Media and the Public Interest: Media Regulation in the Disinformation Age. Columbia University Press.

Nechushtai, E., & Lewis, S. C. (2019). What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior, 90, 298–307. https://doi.org/10.1016/j.chb.2018.07.043

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.

Thurman, N., Moeller, J., Helberger, N., & Trilling, D. (2018). My friends, editors, algorithms, and I: Examining audience attitudes to news selection. Digital Journalism, 7(4). https://doi.org/10.1080/21670811.2018.1493936

Zuiderveen Borgesius, F. J., Trilling, D., Moeller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should We Worry About Filter Bubbles? Internet Policy Review, 5(1). https://doi.org/10.14763/2016.1.401


Footnotes

1. According to Art. 2 (o) of the draft DSA, recommenders are defined as follows: “‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed”.

2. Art. 25 (1) of the draft DSA defines Very Large Online Platforms as “online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.”

3. See for some more information from Facebook on how the News Feed algorithm works: https://about.fb.com/news/2021/01/how-does-news-feed-predict-what-you-want-to-see/
