Harnessing the collective potential of GDPR access rights: towards an ecology of transparency

René L. P. Mahieu, Delft University of Technology, Netherlands, r.l.p.mahieu@tudelft.nl
Jef Ausloos, Institute for Information Law (IViR), University of Amsterdam, Netherlands

PUBLISHED ON: 06 Jul 2020

The GDPR’s goal of empowering citizens can only be fully realised when the collective dimensions of data subject rights are acknowledged and supported through proper enforcement. At present, however, the collective use of data subject rights is neither acknowledged nor backed by proper enforcement. This is the message we sent to the European Commission in response to its call for feedback on the two-year review of the GDPR. In our submission, entitled Recognising and Enabling the Collective Dimension of the GDPR and the Right of Access – A call to support the governance structure of checks and balances for informational power asymmetries, we demonstrate the collective potential of GDPR access rights with a long list of real-life examples.

According to the European Commission's recently published evaluation, the GDPR is doing well in attaining this goal of empowering citizens. We do not agree with this conclusion. While we share some aspects of the positive evaluation, our research shows that the empowerment provided by the GDPR is severely limited. We cannot ignore the fact that most data protection experts, including regulators, academics, practitioners and NGOs, indicate that in a fair assessment of the GDPR's success the glass is at best half full. Moreover, as shown by the surveys conducted for the Commission, the majority of European citizens already felt, before the introduction of the GDPR, that they did not have control over the personal data they provide online, and the proportion of citizens feeling this way has only grown since [see Eurobarometer 487a, Eurobarometer 431]. The whirlwind of cookie banners, “informed” consent forms and privacy policies triggered by the GDPR has probably contributed to a sense of disempowerment: these mechanisms are very demanding on the individual and may simply have made people more aware of their existing lack of control. We believe the discrepancy between the Commission’s largely positive evaluation and the practical experience of many citizens can be explained by the Commission ignoring the two key elements we highlighted in our submission.

Lack of collective dimension and problems of enforcement

First, the Commission fails to acknowledge the collective dimension at the core of the governance system put in place by the GDPR. In the face of the ever-increasing digitalisation of our society, and the growing informational power asymmetries that accompany this shift, the potential for empowerment through individual rights alone is limited. This fact is recognised in the “architecture of empowerment” provided by the GDPR, which places individual citizens and their rights in a broader infrastructure that also empowers societal organisations and data protection authorities (DPAs). In order for data subject rights, and the right of access in particular, to live up to their potential for empowerment and social justice in a datafied society, we need to recognise and stimulate an ‘ecology of transparency’.

Second, the Commission underestimates the existing problems in compliance and enforcement. While enforcement activity has increased over the past few years, most DPAs are structurally under-resourced and many blatant infringements of the GDPR remain unaddressed. Without proper enforcement there can be no citizen empowerment: the “architecture of empowerment” will inevitably lack the backbone needed for the “ecology of transparency” to realise its full potential.

The ecology of transparency

In order to substantiate our call for more attention and effort to be put into enabling an ecology of transparency, we provided a broad overview of real-life cases where the right of access has been used collectively. The annex to our submission describes around 30 cases in which an engaged civil society (including NGOs, journalists and individuals) used the right of access to achieve collective goals. With this overview, we wish to highlight the collective dimension of access rights under the GDPR, emphasising their potential for social justice and the actions that need to be taken to render them effective. The European Commission, along with the European Data Protection Board (EDPB) and national data protection authorities, has a duty to create an enabling environment for collective access rights.

The ecology of transparency we envisage is constituted by the inter-institutional network of actors, laws, norms and practices in which the right of access is exercised. It is shaped by the interplay between the law, the regulators and the actual practices of civil society. Taking this broader view of the ecosystem of institutions and practices allows us to better identify the social conditions that need to be in place for the right of access to achieve its goal: enabling citizens to assess and contest systems that rely on the processing of personal data.

This ability to scrutinise and challenge digital infrastructures and ecosystems has become even more urgent in the wake of the massive migration of work, education and social life to online services and platforms. GDPR transparency measures – and the right of access in particular – offer a vital legal tool for investigative research into these digital infrastructures: identifying what data is collected, how and why it is processed, with whom it is shared, and how it affects (or is supposed to affect) people.

Apart from the “usual suspects” such as Privacy International, NOYB and Bits of Freedom, we also observed other NGOs, collectives, journalists and motivated individuals capitalising on access rights to achieve goals that reach beyond mere curiosity or self-interest. These range from climate activists fighting corporate surveillance and students uncovering discriminatory admission criteria, to content creators challenging YouTube’s demonetisation and content recommendation practices, gig-economy workers pushing for better working conditions, and a whole range of investigative journalism projects. These examples illustrate a crucial point: data subject rights are not only necessary tools to safeguard ‘privacy’ or ‘data protection’ rights, but are vital to the defence of all fundamental rights.

No empowerment without enforcement

While the real-life cases listed in our submission clearly demonstrate the importance and collective potential of access rights, we observe major failings that obstruct this ecology of transparency, and thereby thwart the emancipatory potential of the GDPR. Meaningful compliance with the right of access is still very low. Many data controllers only grant access to some of the information they are legally required to give, and/or raise many legal and technical obstacles along the way. This has resulted in numerous complaints filed with data protection authorities across the EU. For example, in the last year, almost 40% of the complaints received by the UK Information Commissioner's Office (ICO) concerned access requests, and almost 30% of the complaints received by the Dutch DPA were about data subject rights, with a substantial part relating to the right of access.

Considering this high number of complaints, as well as the seriousness of the alleged infringements, it is harrowing to see how weak enforcement has been over the last two years. The Commission holds the view that “DPAs have made balanced use of their strengthened corrective powers” (p. 5). We forcefully disagree with this position. In our submission to the Commission, we raised four enforcement issues in particular: (a) lack of consistent enforcement across EU member states; (b) apparent low priority of data rights cases; (c) very slow enforcement; and (d) over-tolerant enforcement. DPAs should take data subject (access) rights much more seriously, as they are a crucial tool within the GDPR’s architecture of empowerment. We observe that even where NGOs or academics have filed well-argued and documented complaints about often blatant cases of non-compliance, DPAs have only occasionally taken action, and then very mildly. This stands in sharp contrast to their pivotal role in the ecology of transparency, in which DPAs are explicitly tasked (and given extensive powers) to monitor and enforce the application of the GDPR. A significant increase in resources and know-how is crucial to resolving these issues. In light of this, we welcome the Commission’s acknowledgment that many DPAs lack the required funding. Crucially, both the right of access and DPAs’ duty to verify compliance are explicitly mentioned in the Charter of Fundamental Rights of the European Union. As it stands, there is a strong argument to be made that most member states fail to comply with Article 8(2)-(3) of the Charter.

The effectiveness of the ‘ecology of transparency’ depends on the effectiveness of its individual components, i.e. the network of actors, laws, norms and practices in which the right of access is being exercised – and their ability to mutually reinforce each other. Active citizens, digital rights organisations, the media, academia, but also regulators, data protection authorities and data protection officers interact with each other and function together as a network of checks and balances.

Supporting a thriving European culture of data protection

The severe information and power asymmetries in modern society cannot be addressed effectively by data subjects acting alone. It is in recognition of this reality that the GDPR provides a broader architecture of empowerment. Yet the importance of the collective dimension underlying the GDPR has still not been properly recognised. This is problematic, because collective processes are vital when contesting situations where the current status quo is essentially at odds with fundamental rights, especially in contexts characterised by strong information and power asymmetries. In our submission, we list numerous real-life cases that exemplify this, such as Max Schrems contesting data transfers to the US because its surveillance laws contradict European fundamental rights, or civil society scrutinising the ad personalisation sector. Without immediate action reinforcing the collective dimension of the GDPR, we risk further solidifying the current status quo, in which individuals – and society more broadly – are at the mercy of those operating data infrastructures.

For these reasons, the competent institutions – i.e. the European Commission, the EDPB, DPAs and the European Data Protection Supervisor (EDPS) – should properly consider, value and strengthen the ecology of transparency when interpreting and applying the GDPR. Fully recognising the ecology of transparency is vital to enabling the GDPR to fulfil its function as a baseline framework for a fair data-driven society. Now is the time to invest in collective empowerment so as to nurture a thriving European culture of data protection.

1 Comment

Elena

8 July, 2020 - 07:47

Hello. This is a great article. The paragraph just above the heading 'No empowerment without enforcement' mentions "discriminatory admission criteria" affecting students. To this I would like to add the extremely invasive phenomenon of online proctoring, which has so far impacted millions of students around the world, with online petitions signed by students against such services both in the EU and North America. The phenomenon has started to take on outrageous dimensions, and I can only hope someone will look into it, because what is happening is really heart-breaking. It is relatively new in Europe but has existed in North America for a while, since, as we all know so well, data protection, privacy and the use of emerging digital technologies are legislated and controlled differently on the two sides of the Atlantic.

It is alarming how more and more European educational institutions fail to show any real understanding of students' right to privacy, or any real consideration for Art. 8 of the European Convention on Human Rights (right to respect for private and family life) and the GDPR (Art. 4, Art. 6(1)(f), Art. 6(4), Art. 9, Art. 21(1), Art. 21(4), Art. 82, and Recitals 2, 4, 7 and 58). Over the past year (pre-Covid), a number of European educational institutions started to radically impose online proctoring for online exams (e-assessments), proctored remotely by US-based companies with extremely incongruous and intimidating policy statements and terms of service. Anyone who sat down and analysed their content line by line would likely feel just as alarmed, intimidated and reduced to an experiment as the many, many students out there who wonder what on earth is happening.

These proctoring services video-record students in their own homes, at their training firm's premises (if the firm can provide the student with a private room, uninterrupted for hours), or in a public library (if the student can find one available to book for hours). Each student's private life and home is different: some students live in open-concept homes, others in shared accommodation; some have big families and only a shared computer; others live in environments that are in no way conducive to taking an exam, or in heart-breaking circumstances. Not only do these services infringe on the student's right to respect for private and family life (Art. 8 ECHR) through video footage of them in their home, they also require the student to download invasive remote-surveillance software onto their laptop, to disable VPN and firewall protection, to accept that cookies will be placed on their laptop, and to allow the proctoring company to collect an amount of biometric personal data that literally makes one's mind freeze: a digital image of their face and a digital copy of their government-issued photo ID (passport, driver's licence, etc.). Such IDs display far more personal data, including a signature and, in some EU countries, even the holder's social insurance/security number, than any invigilator in a traditional exam setting would ever be allowed to copy digitally.

Along with all of the above, taken via webcam, the proctoring service ProctorU (speaking as a data subject who had this service imposed on me by the educational institution whose educational services I regrettably paid for) also collects and transfers to the US: the student's name, student number, IP address, email address, contact phone number, country of residence, and the browser and operating system of their laptop. This amount and level of invasion and digital exposure of a person's biometric personal data to cyber risks is utterly shocking, especially since ProctorU clearly states that it assumes "no liability for any loss or damage resulting from the use of or inability to use the services" and that "access to and use of the Services or any Content is at your own risk".

Never in a million years did I reasonably expect that a European educational institution could, just like that, impose a US-based proctoring service on students simply because it is convenient for the institution; and the deal must be financially very convenient indeed, since a number of statements in the institution's documents present it as a cost-cutting measure. What happened to all the other EU competitors in the testing industry that could serve EU students, including locally invigilated halls at large universities and libraries, and local and EU testing centres? Why are they entirely out of the question? Is the US-based proctoring service charging EU educational institutions unrealistically low prices that may drive competitors out of the market? Why are European educational institutions forcing such conditions on their students? What kind of relationship exists between the fee-paying student body and their educational institution? Are students clients? Are they employees? Obviously not. It is in fact very unclear what category students fall under and why they are treated like this. Without students, collectively, such institutions could not exist.

What is even worse, unethical and extremely disproportionate is that the educational institution (and I am speaking about the one whose educational services I paid for this year) even has the audacity to tell students that it does not need their consent, because it believes it has legitimate interests in ensuring academic integrity via the world wide web that override students' right to privacy, their right to respect for private and family life, and their data security concerns, while at the same time telling students that they must agree blindly, without questioning anything, to ProctorU's privacy policy and terms of service. First, under the GDPR one cannot invoke legitimate interests while simultaneously ordering data subjects to consent to a third party's privacy policy and terms of service. Second, this is extremely unethical and against any professional code of ethics: it is riddled with a lack of transparency, excessive disproportionality and inequity, since the burden of exam logistics is shifted overwhelmingly onto the shoulders of students and their training firms (in the case of those who study and train at the same time). They are required to find a place in their own home or training firm that is conducive to taking exams for long hours: a private space that is quiet, well-lit and uninterrupted by family members or colleagues. And all this after students or their training firms have paid hefty tuition and exam fees; as if they had not already paid enough for an educational service, they are presumptuously expected to turn their homes or training firms into exam venues, to be video-recorded there, and to have this whole packet of biometric personal data transferred to the US, all for the purpose of taking exams in the EU. What an infringement of Art. 8 of the European Convention on Human Rights and the right to respect for one's private and family life!

As far as fairness, transparency and proportionality go, many institutions present the imposed use of AI-driven facial recognition and surveillance software in online proctoring in a very biased light, as if AI were flawless and perfect. This creates a distorted, false representation of what AI can and cannot do, and of the risks and liability questions surrounding emerging digital technologies that many researchers and data protection regulators have recently emphasised, along with the need for more scrutiny and more legislation. These educational institutions mention absolutely nothing about this side of reality. If one listens to the discourse of the many educational institutions that impose such third-party proctoring services on students with no opt-out, as if students were collectively some kind of second-class citizens with no right to privacy and no data security concerns, one quickly realises that only one side of the argument is presented, in an attempt to brainwash students: 'oh, online proctoring is good for you, you can do it from the comfort of your home'. Well, the reality is that we call it 'the comfort of our home' precisely because it is private and safe and unexposed to unknown people and unsolicited services.
