The quantified consumer: blind, non-informed and manipulated?

Stefan Larsson, Department of Technology and Society, Lund University, Sweden

PUBLISHED ON: 02 Jul 2017

Personal data lies at the core of the digital economy, and from a consumer perspective it is increasingly difficult to assess when it is collected, how it is utilised or traded, and by what third parties it is handled over time. As Rhoen (2016) argues, a broader application of consumer protection regulation to user agreements may increase accountability for operators who collect and manage personal data and, by extension, lead to increased codetermination for consumers. This is one way of stating that privacy and the management of user data are, in short, a consumer issue, one that raises at least three normative challenges of particular relevance: 1) how to handle (the lack of) transparency in a datafied and personalised market; 2) the shortcomings of a regulatory model based on (“blind, non-informed”) consent; and 3) the privacy paradox, that is, the gap between what consumers say when asked in surveys and how they actually behave when using privacy-intrusive services.

Consumers as data sources in an “ecosystemic” digital economy

First, one has to acknowledge that the essence of the internet has changed drastically over the last decade or two. It used to be understood as a cyberspace, an information highway, or a platform for peers to meet, share knowledge and better their arguments (Larsson, 2017a). Now, after a number of transitions, we have arrived at a digitised essence that revolves around the data collected on its users, its “consumers”, in order to tailor applications and to serve as the actual raw material for the growing “ecosystem” that constitutes the digital economy.

While a few companies absolutely central to the data-driven markets have assumed core positions – the “big five” of Google, Apple, Microsoft, Amazon and Facebook were the five companies with the highest market capitalisation in the world in the second quarter of 2017 – there is a complex, interconnected “ecosystem” under development around them. This is, for example, emphasised in a recently published Cracked Labs report, which describes a “vast landscape” consisting of “thousands of other companies from various industries that collect, analyze, acquire, share, trade, and utilize data on billions of people” (Christl, 2017, p. 4). An essential part of consumer protection is to govern the relationship between consumers and other, often stronger, players on the market, and to empower the weaker consumers where necessary. This ecosystem makes such governance increasingly complex.

Data as currency: the market sees

Much of what used to be qualitative, unrecorded and non-monitored is now quantified, measured and recorded: what you search for; the websites you visit; the videos you watch; the ads you click on or tap; your location; device information; your IP address and cookie data. Combined with demographics and with what you buy and when, this makes for very potent profiling. In short, and as recently put by Marion Fourcade and Kieran Healy (2017): the market sees. The consumers, on the other hand, seem increasingly blind in this regard; the more data is collected and the further it travels, the greater the information asymmetries become.
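To make this linkage concrete, the minimal Python sketch below illustrates the basic operation behind such profiling: two independently collected datasets joined on a shared identifier yield a profile richer than either source alone. All identifiers, field names and values here are hypothetical, chosen purely for illustration; real data brokers operate at vastly larger scale and with far more sophisticated matching techniques.

```python
# A purely illustrative sketch (hypothetical identifiers and values):
# how two independently collected datasets can be linked on a shared
# device identifier into a single, richer consumer profile.
from dataclasses import dataclass, field


@dataclass
class ConsumerProfile:
    device_id: str
    searches: list = field(default_factory=list)
    sites_visited: list = field(default_factory=list)
    locations: list = field(default_factory=list)
    purchases: list = field(default_factory=list)

    def merge(self, other: "ConsumerProfile") -> None:
        """Join records from another source sharing the same device ID."""
        assert self.device_id == other.device_id
        self.searches += other.searches
        self.sites_visited += other.sites_visited
        self.locations += other.locations
        self.purchases += other.purchases


# Two collectors independently observe the same (hypothetical) device...
ad_network = ConsumerProfile(
    "device-123",
    searches=["running shoes"],
    sites_visited=["news.example", "shop.example"],
)
retailer = ConsumerProfile(
    "device-123",
    purchases=["running shoes, size 42"],
    locations=[(55.7047, 13.1910)],  # approximate coordinates of Lund
)

# ...and a third party acquiring both datasets can join them on the
# shared identifier, producing a profile richer than either source alone.
ad_network.merge(retailer)
print(ad_network)
```

The point of the sketch is simply that linkage, not any single data point, drives the asymmetry: each additional source sharing an identifier compounds what the market “sees”.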

Personal data has become a sort of currency for digitised services that can be provided for free – monetarily speaking – at the user level. At the aggregated level, however, consumer data truly constitutes wealth in the new economy. Personal data is becoming a traded resource in more and more markets: in the financial industries, including credit card issuers, insurance brokers and lenders; in retail, including consumer goods and services, linked to loyalty cards or to attempts to “seamlessly” bridge online and offline commerce around the individual; and in media and publishing, for things like programmatic ad exchanges (cf. Christl, 2017; Larsson & Ledendal, 2017).

Three normative challenges

Current developments in the role of consumer data can be problematised in a number of ways, not least as hinted at by Christl’s term “pervasive consumer surveillance” (2017). The following three, however, are legally oriented challenges of particular interest (cf. Larsson, 2017b):

1. Lack of transparency: information asymmetry

When data collection becomes the dominant feature of a digitised society, there is no longer any clear way of knowing who is collecting what, in the first stage, and no realistic way for individuals to see where the information travels, in the second stage. Frank Pasquale (2015) and others describe this in terms of a black box society. This lack of transparency constitutes an information asymmetry from which a number of problems emerge from a consumer perspective: the targeting of vulnerable groups; the transition from influencing consumers to manipulating them; the question of accountability when automated services discriminate or are otherwise illegal or unethical; and flawed outcomes produced by erroneous data.

2. Blind, non-informed consent

From a regulatory point of view, many protective measures focus on the individual consumer’s awareness and options to make informed choices, which leads to a broad use of consent-driven data collection regulated through user agreements. However, given human cognitive limitations and time constraints, the choices we actually make concerning the hundreds of user agreements we consent to in our daily use of digital services often represent what digital sociologist Anja Bechmann (2014) has termed “blind non-informed consent”. Consequently, privacy scholar Daniel Solove (2013) argues that this kind of “privacy self-management” does not provide any meaningful control and that there is a need to move beyond relying too heavily on it. Quite simply, it is a regulatory model that does not really work. For the consumers, that is.

3. The privacy paradox

Several surveys carried out in recent years have revealed that users are increasingly concerned about their lack of control over the use and dissemination of their personal data (Lilley et al., 2012; Pew, 2014). Some are particularly worried about having no control over their internet-generated personal data and the possibility of it being used in ways other than those they originally intended when sharing it (Kshetri, 2014; Narayanaswamy & McGrath, 2014). Yet research shows that many users continue to use services that can be very intrusive, while at the same time stating that they are concerned about data being collected from their use of online products and services (Bechmann, 2014; Light & McGrath, 2010).

This can be seen as a sort of paradox between behavioural practice and stated social norms, and one that we as a research community need to understand better. Is it the lack of transparency that tricks users into sharing data, or is our understanding of privacy being renegotiated in tandem with new services? In their consumer research, Turow et al. (2015) point to consumer resignation as one reason for the discrepancy between stated norms and behaviour.

Conclusion: personal data as a consumer issue

In conclusion, transparency is one of the keys to ensuring trust in a datafied environment, and it could open up more competition if consumers could choose services that better represent their expressed values. However, this does not mean that the question of how to manage transparency is easily solved – clearly, there is a limit to how much information any consumer can handle when the regulatory approach leads to information overload in the form of hundreds of lengthy user agreements for an average digital lifestyle.

Consumer protection agencies need to become more active in this area and truly acknowledge that data protection, and the role user data plays in the digital economy, is not “merely” a privacy issue: it also pertains to how we balance market interests that unavoidably involve the consumer perspective. Reframing the privacy issue as a consumer issue has the benefit that consumer protection agencies can be addressed more directly, yielding more tools for a better-balanced approach to current data-driven developments. Rhoen (2016, p. 8), too, points out the importance of a pragmatic application of consumer protection legislation, thereby clearly addressing the supervisory authorities concerned.

References

Bechmann, A. (2014). Non-informed consent cultures: Privacy policies and app contracts on Facebook. Journal of Media Business Studies, 11(1), 21-38. doi:10.1080/16522354.2014.11073574

Christl, W. (2017). Corporate Surveillance in Everyday Life. How companies collect, combine, analyze, trade, and use personal data on billions. Vienna: Cracked Labs.

Fourcade, M., & Healy, K. (2017). Seeing like a market. Socio-Economic Review, 15(1), 9-29. doi:10.1093/ser/mww033

Kshetri, N. (2014). Big data’s impact on privacy, security and consumer welfare. Telecommunications Policy, 38(11). doi:10.1016/j.telpol.2014.10.002

Larsson, S. (2017a). Conceptions in the Code: How Metaphors Explain Legal Challenges in Digital Times. Oxford University Press.

Larsson, S. (2017b). Sustaining Legitimacy and Trust in a Data-driven Society. Ericsson Technology Review, 94(1), 40-49. Available at: http://lup.lub.lu.se/record/75b9d975-1a58-4145-85c4-efde2e46aa14

Larsson, S., & Ledendal, J. (2017). Personuppgifter som betalningsmedel [Personal data as means of payment] (4th ed.). Karlstad: Konsumentverket.

Light, B., & McGrath, K. (2010). Ethics and social networking sites: a disclosive analysis of Facebook. Information Technology & People, 23(4), 290-311. doi:10.1108/09593841011087770

Lilley, S., Grodzinsky, F. S., & Gumbus, A. (2012). Revealing the commercialized and compliant Facebook user. Journal of Information, Communication and Ethics in Society, 10(2), 82-92. doi:10.1108/14779961211226994

Narayanaswamy, R., & McGrath, L. (2014). A Holistic Study of Privacy in Social Networking Sites. Academy of Information and Management Sciences Journal, 17(1), 71-85.

Pasquale, F. (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press.

Pew (2014). Public Perceptions of Privacy and Security in the Post-Snowden Era. Pew Research Center.

Rhoen, M. (2016). Beyond consent: improving data protection through consumer protection law. Internet Policy Review, 5(1). doi:10.14763/2016.1.404

Solove, D. J. (2013). Privacy Self-Management and the Consent Dilemma. Harvard Law Review, 126(7), 1880-1903. Available at: https://harvardlawreview.org/wp-content/uploads/pdfs/vol126_solove.pdf

Turow, J., Hennessy, M., & Draper, N. (2015). The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation. Research report, Annenberg School for Communication, University of Pennsylvania.

