The Digital Services Act: risk-based regulation of online platforms

Zohar Efroni, Frameworks for Data Markets, Weizenbaum Institute, Berlin, Germany, zohar.efroni@rewi.hu-berlin.de

PUBLISHED ON: 16 Nov 2021

This op-ed is part of a series of opinion pieces edited by Amélie Heldt in the context of a workshop on the Digital Services Act Package hosted by the Weizenbaum Institute for the Networked Society on 15 and 16 November 2021 in Berlin. This workshop brought together legal scholars and social scientists to get a better understanding of the DSA Package, in detail and on a meta level.

I. The DSA as risk-based regulation

The causal connection between the fourth wave of COVID-19 infections that Europe is currently experiencing, the share of (un)vaccinated persons in the population and the influence of social media platforms on that share is a pressing question. The observed impact of social media on critical matters of public health and safety was certainly one of the motivations behind the EU Commission’s attempt to regulate large online platforms, an attempt that attracted much attention even before, and especially after, the publication of the Digital Services Act (DSA) draft legislation in December 2020. In the eyes of many, the DSA represents a milestone in the regulation of online platforms. The legislative proposal is multifaceted, and one facet that has so far received fairly little attention is its risk-based character (one exception is this 2021 IRIS Report, which discusses the DSA’s risk-based approach to disinformation). Risk-oriented regulation is often classified as “principle-based”, as distinguished from “rules-based” regulatory approaches that, for their part, focus on prescribing rules of behaviour in detail.

Risk-based regulation has been discussed especially in the context of environmental regulation, health and food safety, financial markets, banking, insurance and occupational hazards, among other areas. This is changing as the principle of risk assumes an increasingly dominant role in the regulation of digital markets and digital society, for instance, in the area of EU data protection law, or in artificial intelligence regulation with its focus on “high-risk AI systems”. There seems to be a consensus that the rapid digitisation of nearly every aspect of modern life brings with it, alongside enormous benefits, considerable risks. Scholars such as Ulrich Beck and Anthony Giddens have suggested that we are living in a “risk society”, a term implying that human decisions create global risks and could result in disasters of untold dimensions. It is therefore not surprising that in our technological and political climate, the identification and management of risks associated with modern life, and most relevantly with modern digital life, have become a priority for lawmakers and governments.

As scholars such as Bridget Hutter have indicated, risk-based regulation is a term often used to denote frameworks adopted by governments or governmental agencies vested with regulatory powers. The regulator faces the dilemma of how to manage and allocate its resources, such as those required for rule-making, compliance monitoring, supervision and enforcement. The guiding principle that helps the regulator manage its limited resources wisely and efficiently is risk, namely, the risk that certain actors and behaviours pose to the objectives pursued by the law. Risk-based regulatory frameworks often include components of risk identification, risk evaluation, collection of relevant information and impact assessments, as well as monitoring and adjustment schemes. A risk-based framework may contain ex ante risk mitigation measures that are preventive in nature alongside dissuasive measures (e.g., sanctions for noncompliance) intended to deter risk-enhancing behaviours and punish bad actors.

The DSA draft legislation reveals distinct features of risk-based regulation, beginning with its choice to tie compliance obligations proportionately to the type and magnitude of the risks online platforms create. Here, the DSA relies on the following simple, yet fundamental, assumption: the bigger an online platform, the greater its impact and, hence, the higher the risks it poses to individuals and society. Consequently, larger platforms, specifically very large online platforms (VLOPs), must comply with more stringent due diligence, transparency and reporting obligations than smaller platforms. By the same token, the “additional obligations” imposed on online platforms under Chapter III, Section 3 DSA do not apply to micro or small enterprises, unless “their reach and impact is such that they meet the criteria to qualify as very large online platforms under [the DSA]” (Recital 43 DSA). The drafters of the DSA explicitly intend to create a law that is asymmetric by design, and there are strong indications that risk is, among other factors, an important rationale underlying this differentiation between regulated actors. As a principle-based regulation with the notion of risk at its centre, the DSA cares little about specific (liability) rules; this is manifested in the fact that the (un)lawfulness of information disseminated over online platforms is left to be determined by other sources of EU law and the domestic laws of the member states.

II. Specific risks in the DSA

What types of risks does the DSA seek to confront? Although the core mission statement of the DSA does not seem to use explicit risk terminology, a number of specific risks involved in the operation of online platforms are nonetheless mentioned. For one thing, the explanatory text speaks generally of “new risks and challenges, both to society as a whole and individuals using [online] services”. For another, the use of online platforms to disseminate content creates risks to fundamental rights, e.g., in the context of spreading illegal content, imposing limitations on free expression, facilitating discrimination and aggravating harms that emanate from online advertising and profiling. The automated systems online platforms use can be biased, abusive, intentionally manipulated or designed in ways that jeopardise their users, for instance, by amplifying harms to electoral rights and to the democratic process as a whole, privacy harms, and harms caused by fraudulent or deceptive commercial practices.

Another type of risk emanates not from the operation of online platforms per se but from the legal environment in which they operate. Fragmentation, legal uncertainty and coordination problems across EU jurisdictions are risks that the DSA explicitly seeks to address via horizontal regulation and through the structure of national Digital Services Coordinators and the European Board for Digital Services. This brings home another point of which the literature on risk regulation is well aware: the applicable legal framework might have intrinsic characteristics that create unique risks of its own, as the Commission certainly recognises when it warns, in the explanatory text to the DSA, that diversified approaches by member states to the problems of online platforms “put an additional strain on achieving already existing objectives as the increased legal fragmentation shows”. Taking the point one step further, risk-based regulation (like any other regulation) might introduce new risks associated with unintended consequences of the law, including cross-effects between the diverse interests of multiple actors in complex environments.

III. Mitigating systemic risks and other DSA risk-based mechanisms

The DSA applies the concept of “systemic risks” and prescribes a set of obligations imposed on VLOPs in connection with such risks. The proposal does not, however, provide a general definition of systemic risks, a term familiar mostly from the area of financial markets and the regulation of major financial institutions. Instead, the DSA introduces a novel, contextual application of systemic risk to VLOPs mainly by listing three specific systemic risks that such major platforms create: the dissemination of illegal content, negative effects on certain fundamental rights (privacy, freedom of expression, the prohibition on discrimination and the rights of the child), and intentional manipulation of the services (Article 26(1) DSA). It further requires VLOPs to put in place “reasonable, proportionate and effective measures” to mitigate systemic risks, which might include adapting their content moderation or advertising display systems, or operating under codes of conduct and crisis protocols (Article 27(1) DSA).

Beyond the specific context of systemic risks, the DSA proposal abounds with risk regulation mechanisms concerning risk identification, risk management and risk mitigation, information and transparency, external and internal supervision, as well as audit and reporting. To name just a few examples, the DSA anticipates, as part of its “notice and action” mechanisms, a system of “trusted flaggers” (Article 19 DSA). The involvement of such flaggers should support more effective monitoring and reduce reaction time once harmful content surfaces on a platform. Crisis protocols under Article 37 DSA are designed for extraordinary circumstances affecting public security or public health. The assumption is that VLOPs, though not necessarily the source of the hazard, can significantly influence crisis management and the actual impact of extraordinary circumstances on public security and health. Clearly, transparency reporting obligations, especially for VLOPs under Article 33 DSA, play a significant role in identifying risks and assessing their probability and impact.

An additional risk-based layer is the bundle of functions and powers vested in the regulators: the EU Commission, the European Board for Digital Services and the national Digital Services Coordinators. The DSA would create an extensive supervision and compliance apparatus, including the regulatory power to take interim measures under Article 55 DSA in case of “urgency due to the risk of serious damage for the recipients of the service” and to impose fines (Article 59 DSA), mostly vis-à-vis VLOPs, which the DSA regards as high-risk platforms.

IV. Implications

Viewing the DSA through the lens of risk regulation raises the question of what such a perspective implies for implementing the DSA and for achieving its (risk mitigation) goals. I will mention here three possible takeaways:

1. Both regulators and regulated entities need to develop a better and more systematic understanding of the genuinely novel risks associated with the operation of (large) online platforms. The past few years provide plenty of anecdotal examples, such as the Facebook-Cambridge Analytica scandal, allegations that Amazon uses information on third-party vendors when developing private-label merchandise, or the use of social media networks to spread COVID-19 related conspiracy theories. Alarming as such examples are, both regulators and regulated platforms need to develop methodologies, including evidence-based schemes and pragmatic tools, to deal with the tasks of risk identification, risk management and compliance in general. Research on the topic is only beginning to emerge, and warnings about current deficiencies of platform response mechanisms, for instance, should be heard and acted upon.

2. Policymakers, regulators and, ultimately, courts need to bear in mind the following tradeoff: every risk-based regulation creates new risks, which must be factored into, and addressed by, the general legal scheme. Those risks may be structural, such as overcomplexity of the law, legal uncertainty, problems of coordination and uniformity, transaction and compliance costs, or misuse. They can also be substantive, where they present new threats to legally protected interests. The requirement that online platforms implement effective compliance schemes calls to our attention the possibility of undesired risk-enhancing effects of regulation and the mechanisms it creates. Over-enforcement of content moderation, for example, could suppress legitimate speech and have a chilling effect, both on platforms and their users.

3. Risk is a dynamic subject matter. Risk-based regulation should remain flexible and attuned to new insights, methods, socio-economic processes and technological developments that could affect our society in quite opposite ways: by enhancing existing risks or creating new ones on the one hand, and by helping to mitigate the risks the platform economy gives rise to on the other.

V. Conclusion

Debates around the ultimate provisions of the DSA will continue in the coming weeks and months as the legislation assumes its final shape. A risk-based perspective can help cast new light not only on the design of the DSA but also on more general discussions concerning the advantages and disadvantages of risk-based regulatory approaches (as opposed to rules-based approaches, for instance) in addressing the challenges that modern technologies and their applications pose for law and society.
