Solving data protection problems with eCommerce Directive tools

Daphne Keller, The Center for Internet and Society, Stanford University, United States of America

PUBLISHED ON: 12 Nov 2015

This is the fourth in a series of posts about the pending EU General Data Protection Regulation (GDPR), and its consequences for intermediaries and user speech online. In an earlier introduction and FAQ, I discuss the GDPR’s impact on both data protection law and internet intermediary liability law. Developments culminating in the GDPR have put these two very different fields on a collision course – but they lack a common vocabulary and are in many cases animated by different goals. Laws addressing concerns in either field without consideration for the concerns of the other can do real harm to users’ rights to privacy, freedom of expression, and freedom to access information online.

Cross-posted to Stanford Law School’s CIS Blog

Disclosure: I previously worked on "Right to Be Forgotten" issues as Associate General Counsel at Google.

In my previous two posts about the GDPR, I identified serious problems with its notice and takedown process, and resulting threats to internet users’ rights to free expression and access to information. The legal framework of intermediary liability provides a lens for identifying these problems. It also offers a set of ready-made tools to address them. Lawmakers could and should take advantage of these tools to improve the GDPR.

The cleanest and simplest solution to the GDPR’s notice-and-takedown problems comes from existing law under the EU’s eCommerce Directive. That body of law could govern removal of user content by intermediaries, leaving intact the GDPR’s current provisions for deleting the back-end data that companies collect and store about user behaviour. More ambitiously, GDPR drafters could craft a new and better process. European lawmakers could take either approach without undermining other important data protection goals or provisions of the Regulation.

Existing EU legal and policy resources could vastly improve the GDPR’s notice-and-takedown process

Applying existing eCommerce Directive law directly to the GDPR

Existing law under the eCommerce Directive provides the most obvious and simple way to sweep away the problems created by the GDPR’s current takedown process. The GDPR could state clearly that its erasure obligations, for intermediaries processing third-party content, are subject to Articles 12-15 of the eCommerce Directive.1 Those articles cover enumerated activities such as hosting or caching user-generated content. Unlike the current GDPR, they protect online expression by only requiring intermediaries to remove unlawful content once they know about it, typically following notice and takedown processes.

Internet content removal laws under member state legislation or case law implementing the eCommerce Directive were designed for precisely the situation an internet intermediary encounters when faced with a “Right to Be Forgotten” erasure request. Their purpose is to balance rights of aggrieved parties seeking removal of online content or links on the one hand, and the rights of other internet users to share or access information on the other. Of course, existing laws are far from perfect. The widespread over-removal documented in academic studies illustrates the need for improvement in those laws as well. But they are worlds better than the GDPR’s current provisions, and they bring to bear the right set of considerations about rights and responsibilities of multiple parties on the internet.

Invoking the eCommerce Directive within the GDPR is also a clean solution as a drafting matter. With a few new sentences, the GDPR could eliminate the thicket of ill-suited rules for intermediaries, without changing removal processes that work for back-end content collected by internet companies about users.2 As I will discuss below, this can be done without any change to the substantive privacy protections defined by the GDPR in its “Right to Be Forgotten” provisions and elsewhere.

Crafting new GDPR removal processes consistent with intermediary liability principles

Of course, another option is to craft a new process that incorporates proportional protections for online expression. This would be challenging in the time remaining, but if the GDPR’s drafters wanted expert input about notice and takedown, they would not have to look far. The European Commission has developed considerable internal expertise in precisely this area in recent years. As part of the 2012 Notice and Action Initiative, the Commission conducted a lengthy public consultation. The resulting staff working document provides a thorough and nuanced review of notice and takedown law and practice in Europe, and discusses concerns raised by stakeholders including free expression advocates. The Commission is delving into the topic again through the Digital Single Market project.

Europe also has a number of well-established civil society organisations that have thought hard about the nuts-and-bolts procedural aspects of content removal. Article 19 has a concrete and sophisticated model for notice and takedown – which looks nothing like the GDPR. La Quadrature du Net has also published extensive commentary and concrete recommendations for notice and takedown, and in its early responses to the Costeja case called for regulatory limitations to protect free expression. These groups and others could provide thoughtful input.

Improving GDPR notice and takedown to protect online free expression would not harm the privacy rights protected by other parts of the GDPR

Either of these approaches – invoking the eCommerce Directive, or inventing a better removal process – could be carried out without undermining the GDPR’s other achievements for data protection and privacy.

Protecting users’ rights to delete data tracking their online behaviour

First, improved notice and takedown rules need not have any effect on rights or processes for deleting the other kinds of personal data held by internet companies. Much of the GDPR is designed for this important, separate purpose – giving data subjects legal erasure rights with respect to the stored, back-end data that companies hold about their online behaviour. The GDPR’s removal process seems designed for this pure user-to-business, two-party interaction. Applying it to the very different situation that arises when one internet user wants to delete content posted by another is dangerous to online expression, for the reasons I set out in my second post. But using this single set of rules for both situations is a drafting choice, not a necessity. Drafters could invoke eCommerce law or other improved provisions for content notice and takedown without changing provisions for back-end data erasure at all.

Protecting the “Right to Be Forgotten”

Second, content removal process issues can be separated from the substantive scope of the “Right to Be Forgotten”. European lawmakers could decide that this right is very broad, and that most user erasure requests should be granted; or they could decide the opposite. That decision should not affect, or be affected by, the procedural rules for implementing an erasure request. Well-crafted processes remain important to protect whatever content does fall outside of the “Right to Be Forgotten”, and to prevent it from being unfairly targeted and removed from the internet.

Procedural protections are especially important because the rights and remedies created by the GDPR will be around for a long time, and affect a broad and evolving internet ecosystem – not just the large and well-resourced companies that appear in current headlines. Some of those companies, including Google, allocate considerable resources in an effort to avoid over-removal of content under intermediary liability law. Processing requests carefully and rejecting the ones they believe are legally unfounded is, in my opinion, an important service to users. But it is not behaviour that should be taken for granted in crafting laws of general application. The law should not incorporate any assumption that all intermediaries will put effort into avoiding over-removal, or even that the ones doing it now will do so forever.

Companies’ voluntary removals of lawful content

Finally, processes for content removal under the law can, and in this case should, be considered separately from the processes companies use for discretionary content removal under their own community guidelines or policies. The two kinds of content removal pose important and related questions – about rights, procedure, and transparency in particular. Comparison may be fruitful in other contexts. But only one kind of removal, the one compelled by law, is being decided in the next six weeks under the GDPR. The tools to improve protections for lawful online expression are readily available, drawing on existing intermediary liability law and models put forth by civil society groups. Lawmakers should use them.

Conclusion

Public discussion of the GDPR has understandably been dominated by topics more traditionally associated with data protection, such as the data transfer provisions thrown into the spotlight by the recent Schrems case. There has been very little public discussion of the Regulation’s notice and takedown provisions. But principles for notice and takedown have been extensively discussed, debated, and passed into law in the field of intermediary liability. By invoking the protections European laws create in that context, lawmakers can fix these serious problems with the GDPR while still achieving its data protection goals.

Footnotes

1. A possible formulation would be, “where a data subject seeks erasure of personal data under Articles 17 and 19 from a controller that is processing data provided by a third party pursuant to its function as an intermediary protected by Articles 12-15 of the eCommerce Directive, procedures for requesting and carrying out the erasure shall be governed by Member State law implementing those Articles of the eCommerce Directive.” A shortcoming of this formulation is that it leaves intact other nagging problems with treating internet intermediaries as controllers under data protection law. Certain existing data protection obligations, including limitations on the processing of “sensitive” data categories such as health information, would, if truly applied to intermediaries’ processing of user-generated content, effectively make normal operations impossible. The GDPR maintains these, and adds more requirements that sit poorly with the function of internet intermediaries. For example, the requirement that companies notify data subjects at the time of collecting data about them from third parties (Art. 14.3) would be very difficult for intermediaries to comply with, since an intermediary does not know when user-posted content includes personal data about another individual. Other revisions that invoke eCommerce law broadly for intermediaries’ processing of user-generated content could solve this class of problems.

2. As Miquel Peguera discusses masterfully in a forthcoming article, data protection enforcers have themselves wrangled with the peculiarity of treating intermediaries as data controllers or processors under the law. The Article 29 Working Party in 2008 recommended special treatment for search engines for this very reason. It distinguished personal data that a search engine collects from users from personal data included in indexed, third-party content, and said that for the latter, the “formal, legal and practical control the search engine has over the personal data involved is usually limited to the possibility of removing data from its servers.” The CJEU’s Costeja ruling, similarly, identified notice and takedown as the locus of Google’s obligations as an intermediary.
