Germany is amending its online speech act NetzDG... but not only that

Amélie Heldt, Leibniz Institute for Media Research, Hans-Bredow-Institut, Hamburg, Germany

PUBLISHED ON: 06 Apr 2020

Germany is amending its Network Enforcement Act (hereinafter NetzDG) in order to respond to criticism from civil society and to address issues that have emerged over the past two years. This first draft of amendments by the government will next be discussed in parliament before its probable adoption. On paper, the NetzDG did not have the harmful consequences for online speech that many feared: there is, at least, no proof that platforms removed more content than before the NetzDG’s adoption. Nevertheless, the government still overestimates the benefits of such a law.

The German “censorship law”

The NetzDG was adopted as a reaction to a lack of self-regulatory efforts by social media platforms. After Angela Merkel’s 2015 decision to let over a million Syrian asylum seekers into the country, the German government observed a peak of hate speech and disinformation on social media platforms. Since the platforms had, in the government’s view, not adopted sufficient self-regulatory measures for the fast removal of unlawful content (e.g., insults, defamation, incitement to violence), the government drafted, and the parliament adopted, the NetzDG in summer 2017.

The Act obliges social networks with more than two million registered users in Germany to set up user-friendly complaint mechanisms for reporting content, to remove ‘manifestly unlawful content’ within 24 hours, and to deliver transparency reports twice a year. The goal is to prevent the dissemination of offensive and aggressive content falling within the scope of application under sec. 1 (3) NetzDG. In other words, no new criminal offences were created, but the platforms had to take on responsibility for unlawful content.

Hardly anyone thought of the NetzDG as the ideal way forward: it was deemed unconstitutional for many reasons and decried worldwide as a ‘censorship law’. Speech-restricting laws are, in principle, permissible if they meet the requirements set by art. 5 (2) Basic Law and by the Federal Constitutional Court’s jurisprudence. But the main fear was that social media platforms might remove more content than necessary in order to avoid being fined (overblocking). It was also expected that they would align their community guidelines with the strictest law worldwide to avoid the costs of adapting to each country (over-removal). In the First Amendment context especially, scholars described it as the paradigm of ‘new school speech regulation’ [1], that is, controlling speech online via the control of digital networks. Again, there are many reasons to criticise the NetzDG, but what it does, in the end, is increase intermediary liability for failing to react to user notices concerning unlawful content.

Striking a balance

From a global perspective, one might argue that the NetzDG contributed to some sort of race to the bottom when it comes to content control, and that it may ultimately serve as a model for authoritarian regimes. Although it is important to point out that the requirements for speech-restricting laws need to be very high, and that only when meeting them can a law be consistent with democratic principles, the NetzDG is not the condition sine qua non for the way platforms moderate user-generated content and handle user-generated flagging. In sum, the NetzDG has flaws, but there is no empirical proof of over-removal or other harmful effects on online speech due to it. This is partly because the first version of the law did not specify how social media platforms needed to implement the complaint tools, nor how granular their transparency reports should be regarding the reasons for removal. As a result, the complaint numbers published in the reports were not conclusive (more details in this paper). Both aspects have been addressed and improved in the current draft amendment.

What’s new?

According to the draft, sec. 3 (1) sentence 2 NetzDG would read as follows: “The provider must provide users with an easily recognisable, directly accessible, easy-to-use and permanently available procedure, accessible when perceiving the content, for transmitting complaints about illegal content.” (the amendments being “easy-to-use” and “when perceiving the content”). Usually, users flag content as unlawful in the same window that opens when flagging any offensive or otherwise unwanted content. On YouTube, for example, users can flag content as hate speech under the platform’s guidelines and as unlawful under German criminal law with the same complaint tool. On Facebook, by contrast, the complaint tool for unlawful content was somewhat “hidden” next to the company’s legal information; hence Facebook’s low complaint numbers.

Until now, it has been up to the social media platforms to decide on which basis they remove illegal content. Indeed, the NetzDG is meant to introduce an additional possibility for users to complain, not an obligation to complain. In the near future, platforms will most likely have to provide information on the basis of their content moderation decisions. Sec. 2 (2) no. 3 NetzDG, as proposed in the draft, stipulates three levels of description: the “mechanisms for the transmission of complaints”, the “decision criteria for the removal and blocking of illegal content”, and the “examination procedure including the order of examination”.

NetzDG amendment “to play a part in a crime series”

So far so good, but the version adopted by the government on 1 April 2020 is part of a package of ‘measures to counter right-wing extremism and hate crime online’. It involves amending laws other than the NetzDG, namely the Criminal Code, the Code of Criminal Procedure, the Telecommunication Act, and the Law on the Federal Criminal Police Office (BKA). The critical points are actually located there, because the amendments to the latter would extend the responsibilities of the BKA and, among other things, allow access to IP addresses and user passwords under certain conditions. Furthermore, criminal liability for speech-related offences would be increased by criminalising preparatory conduct far in advance, such as aggressive opinions and calls for violence. Adding these changes to the catalogue of ‘unlawful content’ under the NetzDG poses a real challenge.

In a recent case, the district court of Augsburg sentenced a Facebook user for sharing a news video, produced and distributed by the public broadcaster Deutsche Welle, that showed ISIS flags. The Bavarian High Court reversed this decision and held that the district court had not sufficiently considered the defendant’s freedom of opinion, and that sharing media coverage of a terrorist organisation should not be confused with disseminating terrorist propaganda. This case anecdotally shows that even courts struggle to discern criminal actions when they are speech-related. What consequences will it have for commercial content moderation by social media platforms if the laws become more complicated?

Taken together, extending the scope of criminal provisions, or even resurrecting highly restrictive laws that had been abolished, is hardly convincing. The planned changes in the package of laws do not always address the underlying problems, such as right-wing extremism and cyberbullying. The government proposes a legal framework that will surely be effective against hate speech and will facilitate criminal prosecution. It will also force social media platforms to be more transparent where their content moderation rules overlap with the elements of criminal offences. Nevertheless, one should bear in mind potential solutions on the preventive side, such as educational opportunities and programmes for social cohesion. For while jurisprudence serves to standardise norms for life in a society, and thus to control behaviour, it does not provide answers to social problems.

With regard to the NetzDG, the law will increasingly restrict freedom of opinion and information if the definition of ‘manifestly unlawful content’ becomes broader due to changes in the Criminal Code, for instance. In other words, we should worry less about the proposed amendments to the NetzDG and more about the planned amendments to the four other laws.

Footnotes

1. Jack M Balkin, ‘Old School/New School Speech Regulation’ (2014) 127 Harvard Law Review 2296, 2306.
