Trust needs control

Katarzyna Szymielewicz, Panoptykon Foundation, Warsaw, Poland

PUBLISHED ON: 28 May 2015

The often declared death of privacy in a digital society built on sharing and accessing information is a gross misunderstanding. It is indeed more and more difficult to control our data and prevent unwanted exposure. But this difficulty only makes privacy more valuable, not less relevant, just as living in wartime does not make our lives worth less or make us more willing to give them away. What has certainly changed is the meaning people attribute to privacy: it is no longer about hiding or "being left alone", but about control. The more we share and the more we care about being visible to other members of our online communities, the more we need that control.

Can you imagine social networking applications that pick pieces of information and share them with the whole world on their own initiative? It is not exactly what we asked for. And yet our sensitive information is collected and shared with companies and state authorities, beyond our awareness and control. This is what we call a loss of privacy in the online environment.

People care about their privacy

It is not true that people don't care; rather, many feel they have no choice. The Future of Digital Trust study, involving consumers from France, Poland, Spain and the UK, showed that the great majority of people (82%) feel they have little power to control the way their personal data is used by organisations. As many as 78% stated that it is hard to trust companies when it comes to the way they use consumers' personal data, and the same proportion felt that service providers hold too much information about their behaviour and preferences (Future of Digital Trust, 2014). According to the Eurobarometer, 73% of Europeans would like to give their specific approval before the collection and processing of their personal information, and only 22% have full trust in internet companies such as search engines, social networking sites and email services (Eurobarometer, 2011).

This gloomy picture is not only bad news for us, but also for the market: as long as the digital market resembles a battlefield, where our personal data is under siege and we operate in a defensive mode, there won't be much growth or innovation. For the European Union this is no mystery. In every speech she gave as the Commissioner responsible for the EU data protection reform, Viviane Reding made it clear: "privacy is an integral part of human dignity and personal freedom", but it is also "good for business". Privacy understood as real control over our own data can also mean sharing. Information shared as a result of a conscious choice, rather than coercion or deception, is likely to be more accurate and, therefore, more valuable.

The current President of the European Commission, Jean-Claude Juncker, saw purely economic value in personal data when he made the EU data protection reform part of the Digital Single Market strategy. The official communication reads: "The General Data Protection Regulation will increase trust in digital services, as it should protect individuals with respect to processing of personal data by all companies that offer their services on the European market", and promises to "reinforce trust and security in digital services, notably concerning the handling of personal data". But will it really? Well, it depends.

From citizens to consumers

At the centre of Viviane Reding's idea for the data protection reform was a person whose data is processed by public or private entities and who was promised more control over her data. Every person could use his or her rights in various capacities: as a citizen, defending his or her fundamental right to privacy, and as a consumer, requesting a specific service from a company and making sure that such a service does not come at a hidden price. Both aspects of our life are important and neither can be ignored when it comes to data protection: corporate and state surveillance have become parts of the same ecosystem, feeding on the same resources. Now, the European Commission seems to be forgetting this in its strategy for re-building trust in digital services. Citizens have been replaced by consumers, just as their fundamental right to control data flows has been reduced to an effective complaint mechanism for when a service is not delivered. Is my right to protect my data rendered conditional on my purchasing power and my willingness to participate in the digital market? Reading the Digital Single Market strategy, one might think so.

Data protection: fixed or broken?

Setting political rhetoric aside, let's come back to the negotiating table, where European institutions fight over the final shape of the reform. What is the state of play, weeks before the expected political compromise and the adoption of the so-called general approach in the Council of the European Union? For Viviane Reding it was clear from the beginning that the main challenge of the negotiations was to make sure that the level of data protection in Europe does not fall below the level established by the Directive. The whole point of this process was to move forward, not backward. The EU promised much simpler rules and a level playing field for business, and stronger, technology-proof protection for citizens. One without the other will never bring trust; it could only mean the beginning of a second open season for our personal data, this time inside the EU.

Independent experts from non-governmental organisations, cooperating under the umbrella of European Digital Rights, analysed the first chapters of the General Data Protection Regulation as agreed by the Council (in particular the provisions on general principles, data subjects' rights, remedies and sanctions) and compared them with the two starting points for the negotiations: (i) the draft General Data Protection Regulation put forward by the European Commission in 2012, and (ii) the legislative report adopted by the European Parliament in November 2013. European Digital Rights' assessment was summed up in one phrase: "Data protection broken badly". Without transparency, strong safeguards and effective remedies, there can be no control and no data protection.

It is not the case that every provision concerning citizens and their rights was weakened; there are bright points and even highlights. But the foundation of the whole legal framework has been undermined. In the text, governments agreed that consent to data processing does not have to be explicit. To put it plainly: even if I don't say "yes", a company will be able to assume that this is what I meant. To make things worse, companies would be able to change the original purpose of data processing without even consulting me. For example, my bank or insurance provider might take personal data that I had to disclose when entering into a contract and use it for entirely different purposes, such as predictive profiling or behavioural marketing. Is this a way to restore trust in the digital environment?
