Special issue on 'feminist data protection'

Friday, 26 June 2020, 23:30

CFP: Special issue of Internet Policy Review on

Feminist data protection

Topic and relevance

The notion of data protection is now an integral part of legal and political discourse in Europe, as exemplified by the inclusion of a right to data protection in the EU’s Charter of Fundamental Rights. Yet there has been relatively little engagement with thinking about and framing data protection from an explicitly feminist perspective. This stands in stark contrast to the notion of privacy, with which data protection is often conflated and which has been the subject of extensive feminist critique and exploration, particularly insofar as it relates to the distinction between public and private spheres (e.g., Allen, 1988; MacKinnon, 1989; Bhattacharjee, 1997; DeCew, 2015; Weinberg, 2017). The starting point of this Special Issue is that the notion of data protection, once disentangled from privacy (González Fuster, 2014), warrants further examination from a perspective of intersectional (Crenshaw, 1991) feminism.

Data protection may be understood by considering the power imbalance with which individuals are confronted when data about them are processed (de Hingh, 2018): public and private entities can collect data without the individuals’ knowledge, and it is hardly possible for individuals to control their data once collected (Steinmüller et al., 1971). The processing of data thereby creates inherent risks for individuals – particularly for those already marginalised or subject to discrimination (e.g., Noble, 2018; Chander, 2020; Gandy, 2010; Guzik, 2009; de Vries, 2010) – and may further skew the distribution of power in our societies. Thus, data protection, like feminism, aims at theorising and changing structural inequalities and power relations.

Scope of the Special Issue

We wish to discuss these structural issues as well as potential answers through the lens of emancipatory approaches, such as feminist, queer, Marxist, post-colonial, critical race or disability studies perspectives, from all relevant disciplines. Contributions focussing on the intersection of different oppressive structures (e.g., not only gender but also racialisation, class, marginalisation of religious minorities, etc.) are particularly welcome.

We invite submissions on the topic of feminist and other emancipatory approaches to data protection addressing the power imbalance faced by individuals, especially with regard to discrimination, marginalisation and oppression. We are interested in a wide variety of perspectives on the intersections between feminism and data protection, both in Europe and beyond, whether they are focused on mutual critique or on how either can benefit from the other and what their common or related aims could be (Peña & Varon, 2019). Topics of interest include, but are not limited to:

  • Data protection and privacy: How should we analyse the relation between these “positive” notions in one discourse and the negative image of private spaces whose “legal vacuum” facilitates the exploitation of structural inequalities? How can these notions be brought into a dialogue, and which lessons can be learnt from history (Lake, 2016)?

  • Data activism, data justice, digital rights, and feminism: Around which issues are European and worldwide feminist initiatives focusing on data processing practices emerging? Which are the tensions or intersections between such initiatives and data protection?

  • Countering illegitimate data processing: How are women and marginalised groups targeted in the political economy of data gathering (McEwen, 2018)? How can they profit from the network effects of social networks in order to organise while being protected from the fallout inherent in their capitalist business models, i.e. tracking and profiling?

  • Surveillance: How is technology developed and used to oppress certain groups (Browne, 2015)? What are the dangers disproportionately affecting women, especially women of colour, in the context of surveillance (Gandy, 2010; Guzik, 2009; Lyon, 2003)? How could or should surveillance be avoided, subverted or countered?

  • Artificial intelligence (AI) and ‘big data’: Should these practices be conceived of as a form of automated and inherent discrimination or as tools for visualising and countering existent discrimination? What biases are built into them, and what are their regulatory effects (Buolamwini & Gebru, 2018)? And how does data protection fit into proposed ways forward (e.g., ‘AI ethics’)?

  • Online gender ascription: How is information about gender being collected and processed (Bivens, 2017)? Which parameters determine gender ascription in so-called Automated Gender Recognition (AGR) technologies? Is the gender identity of individuals (including non-binary persons) respected, and how could data protection law further this cause?

  • Practices of categorisation and (mis)representation: How are gendered categories constructed, by which actors, and what is their impact, particularly for oppressed groups such as women or trans and non-binary people (Keyes, 2019)? How are biases and stereotypes built into data systems, and how should we respond to this? How can algorithms and protocols not only be designed but also used in line with principles of fairness and non-discrimination?

  • Data processing and identity formation: What role do notions such as the (male) gaze, visibility, hiding, deception, outing, and performativity play in the context of data processing and reproduction of gender norms and gendered identities (Abu-Laban, 2015; González Fuster et al., 2015; Beauchamp, 2019)? Can and should data protection intervene in such processes?

  • Data subjects and rights: Can we rethink notions of data protection law in ways which go beyond the neoliberal focus on the ostensibly gender-neutral, self-determining individual? How can data subject rights be effectively complemented (e.g., by group rights or other languages of resistance) without falling into identity traps?

Special Issue Editors

Regina Ammicht Quinn (regina.ammicht-quinn@uni-tuebingen.de)
Spokesperson
International Center for Ethics in the Sciences and Humanities (IZEW) and Center for Gender and Diversity Research (ZGD), University of Tübingen
Tübingen, Germany

Andreas Baur (a.baur@uni-tuebingen.de)
Research associate
International Center for Ethics in the Sciences and Humanities (IZEW), University of Tübingen
Tübingen, Germany 

Felix Bieker (fbieker@datenschutzzentrum.de)
Legal researcher
Office of the Data Protection Commissioner of Schleswig-Holstein
Kiel, Germany

Gloria González Fuster (gloria.gonzalez.fuster@vub.be)
Co-director
Law, Science, Technology and Society (LSTS) Research Group, Vrije Universiteit Brussel (VUB)
Brussels, Belgium

Marit Hansen (marit.hansen@datenschutzzentrum.de)
Data Protection Commissioner of Schleswig-Holstein
Kiel, Germany

Jens T. Theilen (jtheilen@hsu-hh.de)
Research associate
Helmut Schmidt University
Hamburg, Germany

For any editorial inquiry, please email us at: feministdataprotection@izew.uni-tuebingen.de

This publication project is funded by the Bundesministerium für Bildung und Forschung (German Federal Ministry of Education and Research) for the project Forum Privatheit – Selbstbestimmtes Leben in der Digitalen Welt (Privacy-Forum).

Important Dates

  • Release of the call for papers: 11 May 2020
  • Deadline for abstract submissions (500-750 words), to be sent to feministdataprotection@izew.uni-tuebingen.de (now closed): 26 June 2020
  • Invitation to submit full papers: 18 July 2020
  • Full paper submission deadline: 27 October 2020
  • Peer review process: November 2020 - January 2021
  • Final submission of papers following review: March 2021
  • Preparation for publication: April 2021
  • Publication: May 2021

References

Abu-Laban, Y. (2015). Gendering Surveillance Studies: The Empirical and Normative Promise of Feminist Methodology, Surveillance & Society, 13(1), 44-56.

Allen, A. L. (1988). Uneasy Access: Privacy for Women in a Free Society. Totowa, New Jersey: Rowman & Littlefield.

Beauchamp, T. (2019). Going Stealth. Transgender Politics and U.S. Surveillance Practices. Durham and London: Duke University Press.

Bhattacharjee, A. (1997). The Public/Private Mirage: Mapping Homes and Undomesticating Violence Work in the South Asian Immigrant Community. In Alexander and Mohanty (eds.), Feminist Genealogies, Colonial Legacies, Democratic Futures. New York: Routledge, pp. 308-329.

Bivens, R. (2017). The gender binary will not be deprogrammed: Ten years of coding gender on Facebook, New Media & Society 19(6), 880-898.

Browne, S. (2015). Dark Matters. On the Surveillance of Blackness. Durham and London: Duke University Press.

Buolamwini, J. and Gebru, T. (2018). Gender Shades. Intersectional Accuracy Disparities in Commercial Gender Classification, Proceedings of the 1st Conference on Fairness, Accountability and Transparency (PMLR), 81, 77-91.

Chander, S. (2020). Data Racism: a New Frontier, European Network Against Racism Blog, available at: https://www.enar-eu.org/Data-racism-a-new-frontier.

Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color, Stanford Law Review 43(6), 1241-1299.

DeCew, J.W. (2015). The feminist critique of privacy: past arguments and new social understandings. In Roessler and Mokrosinska (eds.), Social Dimensions of Privacy. Interdisciplinary Perspectives. Cambridge: Cambridge University Press, pp. 85-103.

Gandy O.H. (2010). Engaging Rational Discrimination: Exploring Reasons for Placing Regulatory Constraints on Decision Support Systems, Ethics and Information Technology, 12(1), 29–42.

González Fuster, G. (2014). The Emergence of Personal Data Protection as a Fundamental Right of the EU, Cham and others: Springer.

González Fuster, G., Bellanova, R. and Gellert, R. (2015). Nurturing Ob-Scene Politics: Surveillance Between In/Visibility and Dis-Appearance, Surveillance & Society, 13(3/4), 512-527.

Guzik K. (2009). Discrimination by Design: Predictive Data Mining as Security Practice in the United States' 'War on Terror', Surveillance & Society, 7(1), 3–20.

de Hingh, A. (2018). Some Reflections on Dignity as an Alternative Legal Concept in Data Protection Regulation, German Law Journal 19, 1269-1290.

Keyes, O. (2019). The Body Instrumental, Logic 9: Nature, available at: https://logicmag.io/nature/the-body-instrumental/

Lake, J. (2016). The Face That Launched a Thousand Lawsuits: The American Women who Forged a Right to Privacy. New Haven, Connecticut: Yale University Press.

Lyon D. (ed.) (2003). Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination, London and New York: Routledge.

MacKinnon, C. (1989). Toward a feminist theory of the state. Cambridge, Mass.: Harvard University Press.

McEwen, K.D. (2018). Self-Tracking Practices and Digital (Re)Productive Labour, Philosophy & Technology 31, 235-251.

Noble, S.U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.

Peña, P. and Varon, J. (2019). Consent to our Data Bodies. Lessons from feminist theories to enforce data protection. Developed by Coding Rights. Available at: https://codingrights.org/docs/ConsentToOurDataBodies.pdf.

Steinmüller, W., Lutterbeck, B., Mallmann, C., Harbort, U., Kolb, G., Schneider, J. (1971). Grundfragen des Datenschutzes [Fundamental Questions of Data Protection], Bundestags-Drucksache VI/3826, Anlage 1; available at: https://dipbt.bundestag.de/doc/btd/06/038/0603826.pdf

de Vries, K. (2010). Identity, Profiling Algorithms and a World of Ambient Intelligence, Ethics and Information Technology, 12(1), 71–85.

Weinberg, L. (2017). Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden. Westminster Papers in Communication and Culture, 12(3), 5-20. DOI: http://doi.org/10.16997/wpcc.258