Data protection in times of COVID-19: the risks of surveillance in Brazil

Clara Iglesias Keller, Digital Disinformation Hub, Leibniz-Institute for Media Research/Hans-Bredow-Institut, Hamburg, Germany, c.keller@leibniz-hbi.de
Jane R. G. Pereira, Faculty of Law, Rio de Janeiro State University (UERJ), Brazil

PUBLISHED ON: 02 Apr 2020

The legitimacy of government surveillance measures to fight the spread of COVID-19 has taken over the data protection debate in affected countries. Beginning with China (the first epicentre of the pandemic, known for its institutionalised surveillance), the use of personal data to enforce containment policies (such as tracing contacts and detecting crowds) has also been reported in several other countries, raising concerns in the academic community.

In Brazil, the possibility of implementing similar practices first appeared as a technical discussion, in mass and social media, on the lawfulness of institutionalised surveillance practices designed to prepare and enforce isolation policies. The theoretical controversy has now escalated into concrete measures, as local and federal governments enact data access agreements with telecom companies.

While data protection scholars have already signalled that the use of personal data to fight the pandemic is not necessarily illegitimate, there is a series of issues with the way these processes are being led by state authorities.

Surveillance initiatives in place

For the time being, a few worrying developments have been confirmed. In Rio de Janeiro, the city hall and the telecom operator TIM have signed an agreement that allows local authorities to track the concentration and movement of people in territories affected by the pandemic. The goal is to support pandemic control by allowing government agencies to evaluate the success of measures already implemented and to inform future actions. Even though further details of the agreement have not yet been disclosed, TIM has stated that all customer data is anonymised and used to draw heat maps, crossing information about epidemiological outbreaks with points of high concentration of people. A similar arrangement is in place in the north-eastern city of Recife, where cell phone tracking data associated with at least 700,000 telephone numbers is being used to coordinate actions encouraging social isolation.

On the federal level, the Ministry of Communications announced another partnership with cell phone operators to monitor crowds and to provide personal information on the gender and age of tracked users. These measures are particularly problematic since President Jair Bolsonaro has repeatedly denied the magnitude of the pandemic threat and, at the same time, supported demonstrations against isolation policies. While the president insists on undermining these restrictions, regional and local governments as well as other federal bodies officially support stricter isolation and contingency policies, hitting the population with mixed messages and likely increasing the sense of insecurity and uncertainty during the pandemic. For the data protection debate, this contradiction raises questions about the exact destination and purpose of the data targeted by the federal agreement, as its collection is not consistent with the president’s discourse on the measures the pandemic requires.

Overall, these initiatives are not being met with the level of public accountability required by highly complex policy choices with strong impacts on rights and liberties.

Risks for privacy and accountability concerns

First, there are no participation or disclosure mechanisms in place to ensure transparency over the terms of these agreements or the negotiation processes that led to their adoption. As they have not yet been made available to the public, their content can only be assessed from what the authorities report. Without transparency, there is also no debate over these texts and the extent of their legitimacy (which, in a time-sensitive context, could at least take place at post-implementation stages), contradicting basic principles of fairness in public administration guaranteed by the Brazilian Constitution.

This lack of transparency also reinforces a rhetoric according to which privacy is a right to be relativised or even devalued in the face of promising technological solutions, a mindset that now presents itself in a far more dramatic guise as the world faces a public health emergency. Complex and possibly irreversible initiatives that could take years to be debated and responsively shaped are being implemented overnight.

As duly pointed out by some scholars, Brazil’s GDPR-inspired data protection law does provide for fundamental rights that could help design these policies while still protecting individuals – a task challenged by the lack of a structured data protection authority, but still not impossible. However, even the application of this framework is currently threatened by legislative proposals to postpone the law’s entry into force. Initially scheduled for next August, the law has been the subject of at least one bill under congressional consideration since last year proposing postponement (possibly motivated by difficulties in structuring the data protection authority), a claim that has now been reinforced by the COVID-19 pandemic.

What is next?

The abrupt adoption of these measures – coupled with the possible postponement of the entry into force of the data protection law – is likely to relocate this debate to the courts. As much as the system of checks and balances arms the courts with a legitimate remedy against authoritarian abuse, in the face of a health emergency judges will be pressured to decide provisionally, on short timelines and with limited information. In this context, there is a significant risk of developing a legal and jurisprudential framework based on a fait accompli, with irreversible restrictive effects on freedoms and privacy.

In this challenging landscape, we should not ignore some bitter lessons from the global experience with technology regulation, such as the risk of function creep, so often raised in debates on the repurposing of content filters for questionable ends. Overall, many democratic systems have faced the consequences of unchecked tech power or misguided regulation. At the very least, these experiences lead to the conclusion that depriving these processes of transparency and accountability can impose high costs in the future.
