Bleeding data: the case of fertility and menstruation tracking apps

Abstract: Journalists, non-profits and consumer organisations, as well as the authors' first-hand review of relevant privacy policies, reveal that fertility and menstruation tracking apps (FMTs) collect and share an excessive array of data. Through doctrinal legal research, we evaluate this data processing in light of data and consumer protection law but find the commonly invoked concepts of 'vulnerability', 'consent' and 'transparency' insufficient to alleviate power imbalances. Instead, drawing on a feminist understanding of work and the autonomist 'social factory', we argue that users perform unpaid, even gendered, consumer labour in the digital realm and explore the potential of a demand for wages.


Introduction
Tracking one's menstrual cycle is an ancient practice (Dean-Jones, 1989), which remained largely confined to the analogue world until the emergence of female technology (femtech) products and services. As the femtech market grew, fertility and menstruation tracking apps (FMTs) proliferated. Compared to regular offline or online calendars, which might be visible to others (e.g., their cohabitants or employers), users choose FMTs as a way to conceal menstruation and fertility tracking.1

1. We acknowledge that not all users of FMTs are women. As not all those who menstruate identify as women and not all those who identify as women menstruate, we mainly refer to 'users' of FMTs or 'consumers', except where being a woman is particularly relevant to our analysis.
• Technical and device data: identifier, Crash information, Browser, Browser settings, Manufacturer, Media Access Control address, Time-stamped logs of messages sent and received, Network status, Screen information, Mobile service provider, Installed app version, Applications installed in a mobile device.
• App usage data: Frequency of use, General activity (e.g., visited areas/features, tabs opened, links/ buttons clicked, sessions information, use patterns), Engagement with particular features/services, Acquisition channel / exit URLs, Authentication, Notification activity.
The list above indicates that fertility and menstruation tracking app users 'bleed' an excessive array of data, especially if we consider that, beyond what their privacy policies disclose, investigations have revealed data sharing for profiling (Forbrukerrådet, 2020) and with research institutions (Palus, 2019). These revelations suggest that third-party sharing, especially for marketing and advertising purposes, is an integral part of FMTs' business model rather than a one-off incident.
Nonetheless, compared to health and wellness applications in general (European Commission, 2014a, 2014b), legal research has paid little attention to the challenges of FMTs (e.g., Rosas, 2019) and even less to their particularly gendered dimensions. Our contribution attempts to bridge this gap by evaluating FMTs through the combined lens of feminism and law. In what follows, we assess the extent to which data protection and consumer protection law could tackle third-party sharing practices such as those mentioned above and we suggest a more radical approach based on feminist theory and political economy.

Section 2: The use of FMTs through the lens of data and consumer protection
The excessive data processing and third-party sharing in the context of FMTs illustrated in the previous section raise legal questions regarding data protection as well as consumer protection.3 Given that, as seen above, FMTs users are often unaware of how their data are (further) processed, the requirements chosen from both bodies of law were identified following a common theme, i.e., the means to counteract information and control asymmetries between FMTs users and providers. Hence, the following subsections investigate the notion of 'vulnerability' in privacy and consumer law, illustrate the requirements relevant to the mitigation of information imbalance, and present key limitations of these requirements.

Imbalance of power and vulnerability: a data subject's perspective
Where there is an imbalance of power, users' autonomy may be at risk. For instance, FMTs users who are uninformed of the modalities of data processing will likely be unaware of the effects of their actions in the app's environment. Likewise, not knowing who is further processing their personal data may lead users to lose control thereof. This eventuality becomes more evident if data concerning health or mood are used to profile users and nudge them into buying products when, according to the app's predictive analysis, they might be more vulnerable.
Privacy and data protection legislation is a 'manifestation of the idea that all individuals are vulnerable to the power imbalances created by data-driven technologies' (Malgieri & Niklas, 2020, p. 2). The rationale for privacy protection is precisely addressing individuals' vulnerability (Calo, 2018), since data subjects, here FMTs users, are unable to easily oppose the processing of their data or exercise their rights. In any case in which an imbalance exists in the relationship between the data subject and the controller, also 'in terms of possible impacts on fundamental rights and freedoms, significant information asymmetry based on predictive analytics' (Malgieri & Niklas, 2020, p. 6), vulnerable data subjects can be identified (WP29, 2017).

3. Due to space limitations, we address here a selection and brief overview of issues: vulnerability, consent and information asymmetry (see infra). Beyond data protection and consumer protection law, issues pertaining to content moderation and publishers' liability would also be suitable for inclusion, given that some FMTs moderate, recommend and curate content produced by users and/or themselves. However, as such content moderation aspects are not central in FMTs, they are not examined in this article.

Transparency and control, information and consent
Privacy and data protection laws furnish means useful for implementing fair, lawful and transparent personal data processing. The primary legal instrument for privacy and protection of personal data in the EU is the GDPR. The Regulation sets high standards for companies both inside and outside the EU, insofar as EU residents are being monitored. Following prior data protection laws, the GDPR maintains overarching principles to which data controllers must adhere, including transparency, purpose limitation, data minimisation, security and confidentiality, and storage limitation. Amongst others, GDPR tools that may strengthen users' autonomy are information transparency, which should help FMTs users become aware of data processing, and the legal bases for processing, including consent, which should enhance users' control over personal data.
Concerning information transparency, Articles 13 and 14 of the GDPR identify a list of elements that should be included when informing data subjects about the processing of their data. As Forbrukerrådet (2020) noted, the comprehensive data collection and sharing by such apps (often referred to as adtech) appears to be in conflict with the principle of transparency. Although promising, these requirements imply challenges in practice. For instance, controllers do not always diligently map and record the third parties involved in the processing activity. On account of these challenges, subjects' information rights are not always respected: for instance, the sub-processors of personal data are not disclosed, leaving subjects in obscurity with regard to third-party data outsourcing (Forbrukerrådet, 2020). Thus, notwithstanding the existing rules and guidance at national and EU level, it remains questionable whether such information criteria are properly operationalised in practice.
As regards the legal basis of FMTs' data processing activities, some observations on the nature of personal data are needed. Menstruation data, users' symptoms, moods, computation and evaluation thereof in the context of FMTs are ascribable to the GDPR's 'special categories of personal data' (Article 9 GDPR), as they allow controllers to infer subjects' health status (European Data Protection Supervisor [EDPS], 2015) or because they concern 'a natural person's sex life or sexual orientation' (Article 9 GDPR). The processing of personal data and special categories thereof requires a legal basis, foreseen respectively in Articles 6 and 9 GDPR. For special categories of personal data notably, among the closed list of Article 9 GDPR, consent appears the most suitable legal basis for FMTs.

FMTs through the lens of consumer protection laws
Imbalance of power and vulnerability: a consumer's perspective

Imbalance of power and individual vulnerability is a consumer law concern, too.
Over the last years, the advent of data-driven services online has fostered the relevance of consumer laws to digital content and services. In terms of consumer law, such vulnerability might also be related to gender. A European Commission (2016a, p. 168) report explains that gender is strongly linked with the dimension of consumer vulnerability. Other sources maintain that men are often less likely to be vulnerable than women on a number of indicators (Nardo et al., 2011, p. 12) and that pregnancy or significant life changes that women experience can be sources of vulnerability (The VOICE Group, 2010, p. 180).

Transparency and omission of information in misleading commercial practices
The UCPD prohibits commercial practices that are unfair, such as misleading or aggressive commercial practices (Articles 5-9 UCPD). Commercial practices are considered unfair when they materially distort or are likely to materially distort consumers' 'economic behaviour' (Article 5(2) UCPD). Examples of commercial practices that are considered per se unfair are provided in Annex I of the UCPD. A commercial practice is misleading if it contains false information, is untruthful, deceives or is likely to deceive the average consumer (Article 6 UCPD). A misleading practice could also occur through the omission of material information that consumers need in order to take an informed transactional decision that they would otherwise not have taken (Article 7 UCPD). Transparency in data sharing is key in that regard. For example, providing an FMT without disclosing full information about the further sharing of personal data may constitute a misleading commercial practice. This is because, as the European Commission (2016b) observed, not informing the consumer that the data will be used for commercial purposes might constitute a misleading omission of material information, following Article 7(2) and No 22 of Annex I UCPD. In order not to be misled, consumers need to have, among other types, information on the extent of personal data processing necessary for service provision as well as information on any monetisation or third-party sharing of the data (Helberger et al., 2017).

Evaluative remarks on data protection and consumer protection law
However appealing the notions of vulnerability, consent and transparency prevalent in data and consumer protection, they come with their own limitations, which are broached below.

The limitations of vulnerability: entrenching problematic assumptions
Emphasising vulnerability entails the risk of implying that there is a normative subject, that is, the average or reasonable data subject or consumer, from which the vulnerable individual deviates. Indeed, Recital 18 of the UCPD states that 'this Directive takes as a benchmark the average consumer, who is reasonably well-informed and reasonably observant and circumspect'.

Most importantly, vulnerability is a concept that accepts different interpretations.
Article 5(3) of the UCPD adopts an understanding that ascribes vulnerability status to individuals or groups based on their characteristics rather than the situation(s) in which they find themselves. When vulnerability is thereby linked to consumers' specific traits, the law can empower the individual to be more resilient. However, as long as the focus remains on specific individuals or groups and their labelling as vulnerable, the causes of possible harm, including structural injustices, remain intact and the social reasons or factors contributing to this vulnerability (such as market-related ones) remain unexplored. Hence, in line with Fineman, we should instead turn our attention to the unequal ways in which 'systems of power and privilege [...] interact to produce webs of advantages and disadvantages' (Fineman, 2008, p. 16).

The limitations of consent: how genuine is consent vis-à-vis power imbalance?
Although individuals appear concerned by the insufficient protection of their personal data, they tend to keep disclosing them in a seemingly careless way, giving rise to the so-called 'privacy paradox' (Barnes, 2006). Suárez-Gonzalo (2019, p. 176) attributes this paradox to power inequalities between individuals and those who exploit their data. At the same time, users are expected to resort to some sort of 'privacy self-management'. While it appears as a promising solution, relying on individuals' own capacity to evaluate the collection, use or disclosure of their information, privacy self-management is no longer feasible (Solove, 2013) and could, among other practical problems, result in 'consent fatigue'.
According to the GDPR, consent shall be freely given, specific, informed and unambiguous; for special categories of data, it must moreover be explicit.
'Freely given' means that data subjects should have real choice and control: '[i]f the data subject has no real choice, feels compelled to consent or will endure negative consequences if they do not consent, the consent will not be valid' (EDPB, 2020). However, when it comes to the role of consent as a means to balance control asymmetry, the GDPR's conceptual framework is criticised as unable to protect users from data domination (Suárez-Gonzalo, 2019, p. 176). Suárez-Gonzalo (2019, p. 179) notes that consent cannot be guaranteed to always be specific and informed. Particularly in the case of FMTs examined in Section 1, specificity emerges as a challenge, given that their privacy policies mention the data processed only indicatively. More generally, the 'freely given' criterion is increasingly harder to meet if we consider users' ongoing manipulation through behavioural advertising.
Furthermore, from a feminist orientation, consent serves as a 'function of power' (Marling, 2017). As Peña and Varon (2019, p. 27) note, '[r]ushly clicking a button to express accordance with all the conditions we listed above in a situation of power imbalance and practically no other option means we are currently deprived of no'.
Conversely, through the lens of a feminist ethics of care, consent to privacy policies should be approached as an ongoing and mutually advantageous relationship between two equal parties rather than a box which needs to be ticked once by the user (Wittkower, 2016). In light of these concerns, 'we should work towards new informed consent models in forms that consent can be genuine' (Verhenneman, 2020, p. 305) or move beyond consent altogether.

The limitations of transparency: overlooking collective forms of data governance
Transparency requirements seem to assume that providing more information to users suffices to improve their position. However, as data processing and aggregation become more complex, ubiquitous and opaque, information asymmetries likewise grow (Ausloos, 2018, p. 14). Contrary to those data collectors who are equipped with data mining capabilities, even if app users are granted access to their own data, they remain deprived of the (computational, but not only) power to process and meaningfully leverage these data; this recurrent state of powerlessness is described by Andrejevic (2014) as the 'big data divide'. Even further, it is argued that the current data protection legal framework and transparency requirements fall short of ensuring effective answers to large parts of the social challenges implied in a machine learning big data context, especially when the data concern not only individuals but also groups (Vedder, 2018). Data governance, then, should be considered political and not confined to the realm of the household or the private sphere. Instead, data should be seen as 'a collective resource that necessitates far more democratic, as opposed to personal, forms of institutional governance' (Viljoen, 2020, p. 9).

The limitations of law in general
Apart from the limitations of these specific concepts, appeals to data and consumer protection in general tend to privatise problems and do not lend themselves to collective responses.

Overall, the challenges raised to the notions of vulnerability, consent and transparency as well as to law more broadly point to the need for structural and collective approaches in order to alleviate power imbalances.

Section 3: The use of FMTs through the lens of feminism and political economy
Although data protection and consumer protection law need not be rejected, a more structural and collective approach necessitates an investigation of more radical theories. As hinted in the challenges raised against vulnerability, consent and transparency, feminist theory highlights issues of subordination and power imbalance. Hence, this section relies on socialist feminism as a lens to evaluate the processing of FMT data.

Feminist perspectives to work and the 'social factory'
The folk, as well as the legal, understanding of work focuses on the paid, contractual employment relationship that takes place in the public realm. Highlighting female-dominated activities, feminist theory and practice have challenged this restrictive understanding and the legitimacy of what is at each time perceived as work (Daniels, 1987). Against the traditional public/private demarcation, based on which the (remunerated) activity of male breadwinners occupies the public realm and the non-work activities of female spouses the private realm, feminists have argued that private and unpaid activity can likewise be work (Daniels, 1987). Despite the integral role that such private activities play for workers' (re-)production as well as capital accumulation, the social system in which they take place avoids remunerating them by treating them as 'a natural resource or a personal service' outside capital (Federici, 2014, p. 8; see also Dalla Costa & James, 1975).
Rather than arising naturally, Dalla Costa and James (1975) as well as Federici (2014) have argued that it was the advent of capitalism which caused the division between paid and unpaid labour and their respective association with factory work and housework. This division has further been gendered and hierarchical, with women being relegated to a devalued domesticity (Davis, 1981). Nevertheless, an almost exclusive focus on the struggles of the male working class has overshadowed the unpaid female labour that takes place within the confines of the house (Fortunati, 1996). To counteract this omission, Dalla Costa and James (1975) insisted that housework is productive labour and demanded that it be waged.

Data sharing in FMTs as free, digital labour
As indicated in the introduction, users might be experiencing their interaction with FMTs as leisure; yet this interaction also generates value in several forms. Among them, feminism has drawn attention to the female work of community-making, which, despite preserving society, is relegated to a voluntary activity (Daniels, 1987). A further form is the cultivation of market-aligned subjectivities, which will be conducive to production and consumption, as well as the cultivation of norms and identities that subjects will adopt and reproduce (Charitsis, 2018).

Based on the above, by sharing their data, users create and augment the value of such apps, together with that of component or affiliated products and services, in ways that often exceed their awareness or control. The fruits of their activity become resources that benefit the companies processing these data alongside the latter's business partners; Glow, for example, partners with pharmacies and fertility clinics (Hall, 2017). As FMTs users become active participants in the production process, the demarcation between who produces, consumes and benefits from their data fades: they become 'prosumers' (Toffler, 1980), standing in-between value consumption and production (Charitsis, 2016; Lupton, 2015b). Such prosumption becomes clearer when conceived not only in terms of each user's contribution, but also in terms of the collective value created through their totality. More broadly, even when the value produced is not extracted by the app providers and partners directly, it indirectly reproduces (digital) capitalism, meeting Glazer's broad definition of work from the previous sub-section. Therefore, using an FMT might concurrently be a leisurely, handy or meaningful activity and, insofar as it is not adequately rewarded, an exploited one.

Wages for FMT data?
Could the answer to this generation of value by users be their compensation?
Again, feminist struggles regarding domestic labour, particularly the demand for wages against housework, as articulated in the eponymous manifesto (Federici, 1975), could prove an alternative reaction to 'heteromation'. Indeed, the 'wages for housework' demand illuminates both the potential and the limitations of a corresponding demand for FMT data, examined in turn below.

The potential of the demand for wages
We identify three main advantages in such a demand. In a first instance, visibility lends legitimacy to one's work and is a precondition for successfully resisting or struggling against that work (Federici, 1975). In the same vein as 'wages against housework' and 'wages for Facebook', requesting wages for the data that FMT users share could, therefore, stimulate critical debate. It could garner social and institutional recognition for users' contribution and its quantifiable value, as well as capture users' distinct status as 'prosumers' beyond the data subject/controller or consumer/producer binaries. In addition, as Federici (2012) has noted, '[i]t seems to be a social law that the value of labor is proven and perhaps created by its refusal. This was certainly the case of housework which remained invisible and unvalued until a movement of women emerged who refused to accept reproduction work as their natural destiny' (p. 96). Similarly, demanding wages for FMT data could empower the apps' users to refuse this type of unpaid consumer labour. Most importantly, and in contrast to the legal requirements previously examined, such empowerment would not rely on users' problematic characterisation as vulnerable.
In a second instance, demands for wages come with a provisional recognition of those demanding as workers. As such, they would enhance users' bargaining power and enable them to negotiate the terms and extent of their contribution to such apps, a possibility which is currently precluded by the 'take it or leave it' nature of consent requirements. Moreover, compared to the individualistic nature of data protection and consumer protection laws in general, such a demand would offer the opportunity for collective bargaining and negotiations. It would be a direct demand for power premised upon the shared experience of data production and processing and the equally shared need for better conditions if one is to keep engaging in such production.
Lastly, 'to demand wages for housework is to make it visible that our minds, bodies and emotions have all been distorted for a specific function, in a specific function, and then have been thrown back at us as a model to which we should all conform if we want to be accepted as women in this society' (Federici, 1975, p. 5). Requesting that an activity be recognised as work and therefore compensated serves to recognise that this activity comes with its own standards of mental, physical and emotional excellence. Depending on individuals' adherence to these standards, some will be positioned as good or superior and others as bad or inferior. Similarly, FMTs construct and reinforce norms that 'order and stratify menstruation and menstruating', aptly summarised as 'menstrunormativity' (Persdotter, 2020). Users, in turn, might internalise these norms and position themselves as normal or healthy versus abnormal or unhealthy menstruators. Demanding wages for FMT data would be an alternative way to illuminate users' engagement and even struggle with such norms.

The limitations of the demand for wages
However, such a recognition is not without concerns. First, accepting wages in exchange for data might be received as a sort of ratification of this unpaid user-driven labour, which can backtrack rather than accelerate users' empowerment. If we accept that this relation enters the realm of employment, then the power asymmetries and hierarchies of the workplace might be reproduced. One might wonder, for example: if we recognise data provision as work, does it follow that users should comply with directions from those for whom they work, i.e., the data controllers? Such an outcome seems off-putting.
Second and related, linking data collection with wages might enhance the same logic of accumulation that the feminist, autonomist thinking seeks to criticise. The existence of compensation is likely to act as an incentive for users to give more of their privacy away. This, in turn, could support even more the formation of market-aligned subjectivities and change users' perspective on themselves.
However, neither of these two objections constitutes a fatal blow to the demand for wages. Specifically, if the remuneration of users did not take the form of a wage conditional on the extent of users' contribution but that of an unconditional and fixed Universal Basic Income, which would be funded by the redistribution of the profits generated by those data, it would not be necessary to affirm a working relationship between FMTs users and providers. Also, as the amount received would be fixed, users would not have any particular incentive to share more rather than less data.
Third, demanding wages for data might imply that the data we generate are a piece of property which can be alienated from ourselves, measured, reused and sold in the market. However, it is crucial not to forget that these data are often about female bodies, which feminist thought has recognised as 'the main targets, the privileged sites, for the deployment of power techniques and power-relations' (Federici, 2014, p. 15). Encouraging the commodification of such embodied data and subjecting them to market norms would, therefore, oppose this line of thinking.
As with the previous ones, though, this objection is not absolute. Given that anti-commodification arguments tend to be more strongly raised in relation to women rather than men, possibly owing to a romantic essentialism of the former (Silbaugh, 1997), it should be approached from a critical and balanced perspective. Fourth, and most important, a demand for wages could be characterised as unrealistic or too distant from the current legal landscape, an objection that Federici (1975) already faced with regard to housework. US courts, moreover, have been unreceptive to analogous claims by users of digital services. In the litigation brought against AOL by its volunteer 'community leaders' (Kirchner, 2011), although AOL eventually settled with the plaintiffs, the court did not bring the case to trial (Kirchner, 2011). In Jeung v. Yelp, Inc. the court denied Yelp reviewers the status of employees, and rejected their compensation claims (Goldman, 2015). Similarly, in Rojas-Lozano v. Google Inc, the plaintiff argued that the second of the two letter sequences she had to decipher when signing up for an email account was not necessary for security purposes; rather, it was improving Google's digitisation service which the company offered for profit to third parties (Dinzeo, 2016). However, the court did not uphold the plaintiff's claim that Google was unfairly benefitting from her labour, and maintained that the benefit of accessing free email services was overriding (Dinzeo, 2016). In the EU, meanwhile, the 2015 proposal for what became Directive 2019/770 referred to consumers providing 'counter-performance other than money in the form of personal data or any other data' (European Commission, 2015).4 The scrutiny of Directive 2019/770 opened ways to further conceptualise the provision of personal data as part of a mutual contractual relationship with the provider of a digital service. How should we characterise the relationship between consumers (and thus FMTs users) and the digital service providers to which the former provide their personal data? Will 'data monetisation' be possible?
The first question is far from settled in legal doctrine. Traditionally, some maintain that privacy, as a fundamental right, is undeniable and unavailable for commercialisation ('moral approach'). Others consider data as susceptible to contractual negotiations, given that they incorporate economic value and correspond to a patrimonial interest of the subjects involved ('contractual approach') (D'Ippolito, 2020).

4. However, the actual formulation differs, as the EDPS (2017) was concerned that said proposal would alter the fundamental rights' balance struck in the GDPR. Consequently, the EDPS advised against it.
Some even hypothesise that users should be remunerated by the controller, further to the use of data relating to them (idem). Resta and Zeno-Zencovich (2018) offer such a 'reversing' perspective, observing that 'it is users that provide a service (the data) to certain businesses, which are in turn remunerated with digital services' (p. 417). 5

Conclusion
Notwithstanding their perceived benefits, FMTs gather an extensive array of user data, which their providers reportedly share with third parties for, among others, commercial purposes. This issue falls directly within the scope of data protection and consumer protection legislation. Nonetheless, as selectively illustrated, the concepts of vulnerability, transparency and consent which dominate the relevant legal discourse are inadequate to alleviate the power imbalances observed in the context of FMTs and do not sufficiently address the challenges faced by FMTs users. Rather, the latter's position in the legal landscape appears tainted by legal uncertainty. The law recognises such users as data subjects and consumers; yet, one may wonder if there is room to legally classify them as prosumers. In this direction, Marsden (2018) advocates for a new type of 'prosumer law' that 'draws on competition, technology regulation, fundamental human rights, privacy, and free expression' (pp. 378-379). It might be too early or unrealistic to give definite answers to this question. However, evaluating the contractual nature of the provision of personal data as a service could be an interpretative way forward.

5. Provocatively, the former president of the Italian data protection Supervisory Authority noted '[a] new digital underclass is emerging, a "Fifth Estate" made up of those who are willing to give up their freedom with their data, in exchange for services offered online only apparently "at zero price"' (Soro, 2019, p. 7).
Such an interpretation would also be more aligned with feminist thinking. Drawing on a feminist theory and political economy perspective, and particularly on the broad understanding of work inherent in feminism alongside the autonomist concept of the social factory, we assessed the value generated by FMTs users and appropriated by the apps' providers. Specifically, in sharing their data with FMTs, users produce value that benefits FMTs providers, their partners, and (digital) capitalism more broadly. This value manifests itself in the form of commodification of user data, online community-making, reproductive labour, and the cultivation of market-aligned, algorithmic subjectivities, leading to its characterisation as unpaid, digital and, to a certain degree, gendered consumer labour. As a next step, in accordance with the adopted perspective, the recognition of such labour should take the form of a demand for wages. Yet, such a demand is not meant to be taken literally and has its own limitations, which merit analysis. All things considered, though, it constitutes a demand of high heuristic potential, allowing us to fathom and expose the economic relations sustaining FMTs. It is the functional equivalent of a refusal to perform unpaid consumer labour, an expression of collective solidarity, a way to politicise this condition of exploitation and unsettle what we have come to accept as normal and, consequently, a stimulus for much needed legal reform that would capture these problematic economic relations more accurately.
Finally, while keeping in mind that data should not be viewed solely as a commodity, new perspectives opened by the Directive 2019/770 and by the, even theoretical, discussion of data provision in exchange for services might prove useful in more accurately reflecting the realities of the value-producing relations in FMTs and similar apps. Such a potentially synallagmatic approach could provide new theoretical and practical scaffolding to, on the one hand, recognise the value that users' 'bleeding' data yield, and, on the other, curtail the 'leaky' processing activities that app providers deploy.