Back up: can users sue platforms to reinstate deleted content?

Matthias C. Kettemann, Leibniz Institute for Media Research | Hans-Bredow-Institut, Hamburg, Germany, m.kettemann@hans-bredow-institut.de
Anna Sophia Tiedeke, Leibniz Institute for Media Research | Hans-Bredow-Institut, Hamburg, Germany, a.tiedeke@hans-bredow-institut.de

PUBLISHED ON: 04 Jun 2020 DOI: 10.14763/2020.2.1484

Abstract

A private order of public communication has emerged. Today, social network services fulfill important communicative functions. A lot has been written about the failings of companies in deleting problematic content. This paper flips the question and asks under which conditions users can sue to reinstate content and under which circumstances courts have recognised ‘must carry’ obligations for social network services. Our analysis, an initial comparative analysis of case law on the reinstatement of user-generated content, will point to a larger issue of systemic relevance, namely the differences in treatment of states and private companies as threats to and/or guarantors of fundamental rights in the United States and in Germany. It is a contribution to the important debate on the interaction of states and platforms in governing online content.
Citation & publishing information
Received: March 19, 2020 Reviewed: April 20, 2020 Published: June 4, 2020
Licence: Creative Commons Attribution 3.0 Germany
Competing interests: The authors have declared that no competing interests exist that have influenced the text.
Keywords: Platforms, Intermediaries, Private spaces, Courts, Indirect application of human rights
Citation: Kettemann, M. C. & Tiedeke, A. S. (2020). Back up: can users sue platforms to reinstate deleted content?. Internet Policy Review, 9(2). https://doi.org/10.14763/2020.2.1484

Introduction

Platforms and their terms of service have a decisive impact on freedom of expression and communication online (Suzor, 2018). The private power of platforms is unprecedented and sits uneasily with the primary responsibility and ultimate obligation of states to protect human rights and fundamental freedoms in the digital environment. States do not only have the negative obligation to refrain from violating the right to freedom of expression and other human rights in the digital environment but also the positive obligation to protect human rights. Companies, as the (international) law and practice of social responsibility of transnational corporations (Ruggie, 2008; 2011) demonstrate, have a responsibility not to violate human rights and to offer redress mechanisms when they do. This paper asks whether and in what way this responsibility extends in particular to social networks and to the reinstatement of user comments that may have been wrongfully deleted. Put concisely: under what circumstances should platforms be forced by courts to reinstate content? We will address this question by looking at Germany and the United States, two jurisdictions that deal with the issue of ‘must carry’ in very different ways.1

Analysing a selection of US and German court cases on the reinstatement of accounts and the republication of deleted content, we will draw out the differences in constitutional and statutory law and show how they explain some of the divergences. In comparative case studies of US and German courts we will address the following questions: Can users sue platforms to have deleted posts and videos reinstated? Do they have a right to a Facebook or Twitter account? Do platforms have corresponding duties to treat users equally in furnishing these services as long as users do not violate the terms of service or local law? We will also point to a larger issue, namely the differences in the treatment of states and private companies as threats to and/or guarantors of fundamental rights between the jurisdictions under review. We will finally show how public and private judicial and quasi-judicial approaches towards reinstatement can interact (Kadri & Klonick, 2019).

Today, a quickly growing share of communication takes place online. Platforms that are privately owned communication spaces have become systemically important for public discourse, in itself a key element of a free and democratic society (Hölig & Hasebrink, 2019). The internet has heavily influenced our communicative practices (Kettemann, 2018) and will continue to do so. As the European Court of Human Rights noted in 2015, the internet is ‘one of the principal means by which individuals exercise their right to freedom to receive and impart information and ideas, providing [...] essential tools for participation in activities and discussions concerning political issues and issues of general interest’ (Cengiz v. Turkey, 2015). It plays ‘a particularly important role with respect to the right to freedom of expression’ (Council of Europe, CM/Rec(2018)2, 2018). Due to technological innovation, social media platforms are now able to regulate speech de facto (and de jure) in real time. The platforms not only set the rules for communication and judge their application but also moderate, curate, rate and edit content according to these rules. From a constitutional perspective, they combine the tasks of all three separate powers of the state – law-making, adjudication and execution – plus the role of the press (Kadri & Klonick, 2019). As one author put it, ‘platforms [...] engage in intensive legislation, administration of justice and punishment, and develop eclectic governing and legitimation apparatuses consisting of algorithms, proletarian judicial labor and quasi-constitutional governing documents’ (Schwarz, 2019, p. 117).

The major part of the research conducted on this issue so far focuses on the situation in the US (Keller, 2019b; Goldman, 2017). These analyses, however, seem to accept and appreciate the dual systems of remedy (Kadri & Klonick, 2019). In fact, they consider an obligation for platforms to carry all legal speech a potential threat both to free speech and to the economic interests of the platforms (Keller, 2019b). We will show that the key to understanding ‘must carry’ is to put a qualifying asterisk next to the public/private distinction in law. We will also show that ‘must carry’ obligations need to be understood in the context of the impact platforms have as gatekeepers for discourse, at a time when a growing number of societally relevant debates take place online. Recognising this, platforms, we submit, have to implement a transparent and consistent process of balancing the interests at stake. As their quasi-judicial functions grow, they have to become more judicial.

After a brief analysis of the challenges of regulating online speech between state duties and private obligations, the jurisprudence of US and German courts will be presented. On this basis we proceed with a critical assessment of the horizontal or third-party effects of human and fundamental rights on private contracts and draw conclusions.

Private and public freedom of expression governance

In times of digitality, online communicative spaces have enriched and partially replaced public offline spaces, e.g. town squares, as communicative settings where discourse is relevant for democratic decision-making. This is a challenge for states that continue to have the primary responsibility and ultimate obligation to protect human rights and fundamental freedoms, online just as offline. All regulatory frameworks they introduce, including self- or co-regulatory approaches, have to include effective oversight mechanisms over the companies controlling the private communication spaces and be accompanied by appropriate redress opportunities. However, the normativity inherent in the primary responsibility of states to protect human rights is at odds with the facticity of online communicative practices that are prima facie regulated by the rules of intermediaries through their terms of service, their Hausrecht.

The private sector assumes a distinct role that reveals the specificity of the internet: the vast majority of communicative spaces on the internet are privately held and owned. These intermediaries, including social media companies, have today become important normative actors (Zakon, 2017). They have established largely autonomous legal orders (Kettemann & Schulz, 2020), even if they still form part of the normative order of the internet (Kettemann, in press). Network effects and mergers have led to the domination of the market by a relatively small number of key intermediaries.

Social media companies set the rules for the privately owned public online spaces they control. Some do it via Community Standards (Facebook, 2020; Kettemann & Schulz, 2020), others via their terms of service, while in some jurisdictions2 judges have applied the concept of indirect third-party effect of fundamental rights to online spaces. Social media companies remain – for the foreseeable future – the primary or at least prima facie norm-setters for online communicative spaces. Understanding the theory and practice of this private norm-setting process is thus essential. Intermediaries set the rules for communication and thereby define what they understand as ‘desirable communication’. TikTok shows how far this can go. In order – nominally – to avoid cyberbullying, TikTok would flag individuals with specific features such as ‘facial disfigurement, autism, Down syndrome’, and ‘[d]isabled people or people with some facial problems such as birthmark, slight squint (…)’ as vulnerable and would limit the reach of their videos or even block them from appearing in other users’ feeds (Botella, 2019).

Pushing – from the perspective of TikTok – information and communications that are mainstream and monetisable translates into more growth in today's immaterial production environment (Han, 2014, p. 19) – at least in the short run. This creates ‘a cultural system as well as a political system’ (Balkin, 2004, p. 4). The commodity in this system is not just the user, but the content, produced and used (prod-used) by ‘user culture’ (Klonick, 2018, p. 1630; Balkin, 2004, p. 5). This user culture is shaped by the specific rules of the digital platform. These sets of rules have matured and now (frequently) include a specific set of values (Facebook, 2019; Twitter, 2019).

In light of the persuasive power of the UN Guiding Principles on Business and Human Rights and the ‘Protect, Respect and Remedy’ Framework (Ruggie, 2011), intermediaries have started to pledge commitment to human rights-inspired values and principles that have certain self-constitutionalising functions. Facebook’s Oversight Board, for example, will have substantial leeway in framing selected norms that apply to online speech on Facebook’s platform (Facebook, 2019). Facebook has undertaken to implement the Board’s decision ‘to the extent that requests are technically and operationally feasible and consistent with a reasonable allocation of Facebook’s resources’ (Facebook, 2019). Next to authenticity, safety, privacy and dignity, Facebook thus favours voice as the paramount value and states:

The goal of our Community Standards is to create a place for expression and give people voice. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content which would otherwise go against our Community Standards – if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments (Bickert, 2019).

This reliance on ‘newsworthiness’ or ‘public interest’ as criteria to allow content that would otherwise be deleted echoes similar policies at Twitter, which defines the importance of public interest for its network as follows (Twitter, 2019):

Serving the public conversation includes providing the ability for anyone to talk about what matters to them; this can be especially important when engaging with government officials and political figures. By nature of their positions these leaders have outsized influence and sometimes say things that could be considered controversial or invite debate and discussion. A critical function of our service is providing a place where people can openly and publicly respond to their leaders and hold them accountable. With this in mind, there are certain cases where it may be in the public’s interest to have access to certain Tweets, even if they would otherwise be in violation of our rules. (...). We’ll also take steps to make sure the Tweet is not algorithmically elevated on our service, to strike the right balance between enabling free expression, fostering accountability, and reducing the potential harm caused by these Tweets.

However, private (platform) companies still have an overriding interest in creating a hospitable communication environment that fosters and attracts advertisements and business activity (Klonick, 2018, p. 1615). As Hill puts it: ‘social media companies, their transnational nature, and the transnational, risk-averse nature of their advertising stakeholders has created an emphasis on brand safety in media content governance’ (Hill, 2019, p. 2). They depend on the ‘prod-users’ to generate and share information and personal data. On platforms, ‘users are not customers, (…) users are “value creators”’ (Schwarz, 2019, p. 121). Platforms, by offering users social ‘connectedness’ (van Dijck, 2013, p. 13), turn social interaction and attention into data which are captured and sold to advertisers (Schwarz, 2019, p. 121).

In that way the motivation to safeguard the right to free speech differs significantly from the conviction shared by liberal democratic societies. For a liberal democratic state order the right to free speech is ‘absolutely essential (...), for it alone makes possible the constant intellectual confrontation, the clash of opinions, which is its vital element (…)’. In a certain sense it is considered to be the basis of any freedom or, as the German Federal Constitutional Court (BVerfG) put it: ‘the matrix, the indispensable condition of nearly every other form of freedom (Cardozo)’ (Lüth, 1958). Clashing opinions by definition include negative communication, disruption and content that might not be attractive in terms of advertisers’ brand safety (Hill, 2019, pp. 10, 12).

In an environment that promotes and protects speech only to the degree that speech is still good for business (Citron & Norton, 2011, p. 1454), clashes of opinions will not always be desired and protected in the way they would be in liberal democratic societies. For platforms, ultimately, whether to protect voice – meaning ‘desirable communication’ in the view of the social network services – will remain a business decision. By favouring this kind of communication, they have changed the social conditions of regulating quasi-public speech (Balkin, 2004, p. 26).

The statements by Facebook and Twitter cited above matter. They show that social networking services begin to see that evaluating content solely on the basis of their terms of service (and deleting content if it falls foul of a private norm) might lead to unjustified (or unjustifiable) decisions. Taking up the example of the Napalm girl incident: clearly, a picture of an unclothed child violates Facebook’s Community Standards on child nudity. But the picture of one specific unclothed child, namely Phan Thị Kim Phúc, has a special place in history, and deleting it carries a different message. This set of values and the commitment to the Ruggie Principles as a ‘social licence to operate’ (Ruggie, 2008) reinforce an international trend to force platforms to commit to constitutional and human rights principles. Providing access to content – and providing content (including ads) creators with access to customers’ attention – remains the essence of the platforms. Especially in light of potential liability risks, substantiated for example by the fines companies can incur under the German Network Enforcement Act (NetzDG) (Kettemann, 2019) or the EU Code of conduct against illegal hate speech online, which Facebook, Microsoft, Twitter and YouTube adopted as early as 2016, platforms will promote ‘desirable communication’ on the platform and moderate content accordingly. Minimising the risk of being held liable for (potentially) illegal content is one of the strong drivers of how platforms draft their rules and how fast they remove content. This preference for speed risks coming at the expense of a thorough assessment of the legality of the content and thus of the rule of law (Coche, 2018, p. 11). In this regard, the ruling of the Court of Justice of the European Union (CJEU) in Glawischnig-Piesczek v. Facebook Ireland Limited is relevant. The CJEU ruled that EU law does not preclude national courts from ordering social network services to seek, identify and delete comments identical to illegal comments, as well as equivalent comments from the same user – globally (Glawischnig v. Facebook, 2019). Since the CJEU chose to follow Advocate General Szpunar’s Advisory Opinion (AG Opinion Glawischnig v. Facebook, 2019) and ruled that the E-Commerce Directive does not preclude a court of a member state from ‘ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information’ (Glawischnig v. Facebook, 2019, para. 53), negative implications for free speech are not unlikely. Legal speech might be caught like ‘dolphins in the [tuna] net’ (Keller, 2019a).

There is some content that companies want, some content that companies put up with, and some content they a) wish to delete or b) legally have to delete. The question now arises how courts – in the US and in Germany – have dealt with arguments that content platforms want to delete or have deleted should be reinstated as long as it is not illegal. The choice of these two jurisdictions is not a coincidence; rather, it allows us to approach the issue from two very different angles. The United States understands freedom of speech as freedom from interference by the state; the idea of a marketplace of ideas can be regarded as the foundational theoretical basis and rationale of its freedom of speech doctrine (European Parliament Study, 2019, p. 40). In Germany, freedom of expression is regarded as essential for a free and democratic state order (Lüth, 1958) and also needs to be guaranteed by the state (Saunders, 2017, pp. 11, 14).

Through the lens of the ‘must carry’ approach we will now take a closer look at the situation in the United States and Germany and show how ‘must carry’ is sometimes the only way to guarantee effective protection of speech online. We chose the US as the home of the currently leading social networking sites and as the jurisdiction with many judgments regarding freedom of speech in private communication spaces. We selected Germany because of its history of strong regulation of network sites, through for instance the Network Enforcement Act, and the courts’ willingness to consider the application of fundamental rights to platforms. Comparing the US and the German approach to the reinstatement of content allows us to highlight the differences.

The United States: private spaces under private rules

In the US, courts have regularly sided with social networks that have blocked user accounts or deleted tweets (Mezey v. Twitter, Cox v. Twitter, Kimbrell v. Twitter). In the 2018 Twitter v. Superior Court case, for instance, the California Court of Appeal confirmed that a service provider’s decision to restrict or make available certain material is expressly covered by section 230 of the Communications Decency Act (CDA), the clause shielding internet service providers from liability (Twitter v. Superior Court). The court presupposes the existence of ‘must carry’ claims (Keller, 2019b) but shields platforms from them because section 230 CDA (1996) and the Digital Millennium Copyright Act (DMCA, 1998) intend to limit the take-down of legal speech (Keller, 2019b). In light of the potential misuses of section 230 by ‘bad Samaritans’ (Citron & Wittes, 2017, p. 409), scholars have developed nuanced approaches for the law’s reform (Citron & Franks, 2020, pp. 20–25).

The purpose of this grant of immunity was both to encourage platforms to be ‘Good Samaritans’ that take an active role in removing offensive content and to avoid the free speech problems of collateral censorship (Zeran v. America Online Inc.). The courts rejected the claims with reference to section 230 CDA in the majority of cases, for example in Mezey v. Twitter Inc., Twitter Inc. v. The Superior Court ex rel Taylor, Williby v. Zuckerberg, Fyk v. Facebook Inc., Murphy v. Twitter, Inc. and Brittain v. Twitter Inc. Arguments based on grounds other than these two regimes were also rejected in court. To date, there has been no successful ‘must carry’ claim against platforms in the US (Keller, 2019b), in contrast to cases against individuals exercising state functions and controlling subspaces within the platforms, such as the comment section under a tweet (e.g., Knight First Amendment Institute v. Trump).

But US jurisprudence has insights to offer into the relationship between private property and public communication goals. Back in the day, it was booksellers, broadcasters or editors that would put limits on content or speech. According to the United States Supreme Court (SCOTUS), strict liability on their part would lead booksellers ‘to restrict the public’s access to forms of the printed word, which the State could not constitutionally suppress directly’ (Smith v. California; Keller, 2018, p. 17). This argument, too, was therefore rejected in order to protect free speech. In Johnson v. Twitter Inc., the California Superior Court refused to consider Twitter akin to a ‘private shopping mall’ (Pruneyard v. Robins) that was ‘obligated to tolerate protesters’ (Johnson v. Twitter). In Prager v. Google, the Northern California District Court refused to see YouTube as a state actor under the ‘public function’ test, arguing that providing a video sharing platform fulfils neither an exclusive nor a traditional function of the state. The court did not see YouTube as a ‘company town’ (Marsh v. Alabama) either. A claim relying on the ‘company town’ rule, established in the 1946 decision Marsh v. Alabama, would today only succeed if it was brought against a private entity that owns all the property and controls all the functions of an entire (virtual) town (Prager v. Google).

Economic dominance – or dominance in the ‘attention marketplace’ – was not considered to be enough to justify must carry obligations and override the platforms’ own speech rights (First Amendment to the United States Constitution), because the courts do not consider major platforms to control ‘critical pathway[s] of communication’ in the way the cable companies in Turner v. FCC did.

In Manhattan Community Access Corporation (MNN) v. Halleck, the SCOTUS had the chance to weigh in again on the tension between cable operators’ and cable programmers’ First Amendment rights – and, by implication, on the viability of must carry claims for internet platforms. However, in June 2019 the court only ruled on the status of MNN (non-state actor) rather than on whether its actions directly affected free speech. Only the dissenting opinion of Justice Sotomayor in MNN v. Halleck argued that MNN ‘stepped into the City's shoes and thus qualifies as a state actor, subject to the First Amendment like any other.’ Justice Sotomayor also argued that since New York City laws require that public access channels be open to all, MNN took on the responsibility of this law when it took over the public access channels. It does not matter whether the city or a private company runs this public forum, since the city mandated that the channels be open to all.

In fact, US courts have repeatedly held that the platform versus publisher dichotomy is irrelevant in the context of section 230 CDA (Chukwurah v. Google). There is established case law on the notion that immunity under section 230 CDA protects platforms against a variety of claims, as recently confirmed in FAN v. Facebook, Sikhs v. Facebook and Chukwurah v. Google. This includes claims for breach of contract and of the implied covenant of good faith (FAN v. Facebook). Courts in the US have continuously rejected the notion that platforms are public fora (Prager v. Google; Ebeid v. Facebook; Buza v. Yahoo! Inc.; Langdon v. Google). In May 2020, in Freedom Watch, Inc., et al v. Google Inc., et al, the U.S. Court of Appeals for the D.C. Circuit, referring to the 2019 SCOTUS decision in MNN v. Halleck, confirmed that ‘the First Amendment prohibits only governmental abridgment of speech (...)’. The judges rejected the argument brought forward by Freedom Watch and held that ‘a private entity who provides a forum for speech is not transformed by that fact alone into a state actor’ (Freedom Watch, Inc., et al v. Google Inc., 2020, p. 2). Only if a social media account, for instance a Twitter account, is used by a public official ‘as a channel for communicating and interacting with the public about his administration’ and ‘to conduct official business and to interact with the public’ (Knight First Amendment Institute v. Trump) can the interactive space on that account be regarded as a public forum. However, this does not make Twitter itself a public forum. Only a part of Twitter – namely the account that, in the Knight First Amendment Institute case, Donald Trump ‘upon assuming office, repeatedly used (...) as an official vehicle for governance’, with ‘interactive features accessible to the public without limitation’ (ibid.) – can be considered a ‘public forum’, with the clear consequence that exclusion from that space (by blocking users or deleting posts) has to be considered unconstitutional viewpoint discrimination (ibid., p. 23).

Even though there have been more decisions in similar settings supporting this line of argumentation (Morris & Sarapin, 2020, p. 11), Pruneyard v. Robins remains an exception and the closest a US case has come to recognising a third-party effect of fundamental rights. In that case the SCOTUS confirmed the California Supreme Court’s decision and thereby the plaintiffs’ rights under the California Constitution to enter a Silicon Valley shopping mall to distribute leaflets. Plaintiffs suing today’s platforms argue that the platforms fulfill the public forum function at least as much as shopping malls ever did and, in consequence, must tolerate unwanted speech. In Pruneyard v. Robins, the SCOTUS held that a shopping mall owner’s own autonomy and communication power were not undermined by leafleteers’ presence on its premises (Pruneyard v. Robins). In Hurley, by contrast, it held that to ‘require private citizens who organize a parade to include among the marchers a group imparting a message the organizers do not wish to convey [...] violates the First Amendment’ (Hurley v. Irish Am. GLIB Ass., 1995, p. 559). In the US, therefore, what is taken down, stays down. The situation in Germany is different.

Public law in private spaces: German jurisprudence

Since 2018 German civil courts have decided a number of ‘put-back’ cases arising from deletions by social media companies (especially Facebook and Twitter) in favour of the plaintiffs. These judgments are rendered against the background of a specific understanding of the public sphere shaped by Germany’s highest court for constitutional questions, including the protection of fundamental rights: the German Federal Constitutional Court (BVerfG). After taking a closer look at the BVerfG’s past decisions on private gatekeepers, we will examine how this understanding has been transferred into the digital sphere with the preliminary decision the BVerfG delivered in the case Der III. Weg in 2019.

In one of its landmark decisions, Fraport in 2011, the BVerfG considered that, depending on the ‘guaranteed scope [of the fundamental right] (Gewährleistungsinhalt) and the case’, the ‘indirect fundamental rights obligation of private parties (…) can come close or even be close to a fundamental rights obligation of the state’ if the private actor has ‘already taken over the provision of the framework conditions of public communication (…)’ (Fraport, 2011, para. 59). This is a nuancing of the doctrinal concept of the indirect third-party effect of fundamental rights (mittelbare Drittwirkung der Grundrechte), which was developed in 1958 (Lüth, 1958). However, since more than 50% of the shares of Fraport AG were held by public shareholders, the BVerfG found that in this case fundamental rights applied directly. It was left open to what extent the indirect third-party effect of fundamental rights applies ‘to materially private companies that open up public services and thus create places of general communication with regard to freedom of assembly or freedom of expression’ (Fraport, 2011, para. 59). In its 2015 Bierdosen-Flashmob decision, the BVerfG confirmed this reasoning. Three years later, in Stadionverbot, the BVerfG applied the doctrine of indirect third-party effect of fundamental rights (mittelbare Drittwirkung) and found that, according to the principle of equal treatment (Art. 3 Basic Law (GG)), a ban for (suspected) hooligans and other potentially violent soccer fans must ‘not [be] imposed arbitrarily but must be based on an objective reason (...) [and] is associated with procedural requirements (…)’. From this the BVerfG concluded that individuals should not be excluded ‘without objective reason’ and not without ‘compliance with procedural requirements’; otherwise the principle of equal treatment would be violated. What the ruling in Stadionverbot does not tell us, however, is whether these requirements also apply to the protection of Art. 5 (1) (1) Basic Law (GG) and the public sphere in a digital environment or on platforms.

On 22 May 2019, the BVerfG was concerned with a put-back claim for the first time. Since the BVerfG found that the main proceedings were neither manifestly well-founded nor manifestly unfounded, it performed a genuine weighing of the disadvantages for the parties involved. In its preliminary injunction decision, the BVerfG found that the consequences that would occur if the interim injunction was not issued but the main proceedings were later successful would outweigh the disadvantages that would arise if the interim injunction was issued but the main proceedings proved to be unfounded. In Der III. Weg the BVerfG thus ordered Facebook to allow a right-wing party to access its Facebook page and resume posting (Der III. Weg, 2019).

Even though the BVerfG only ordered Facebook to temporarily re-grant the right-wing party Der III. Weg access to its Facebook page and allow it to resume posting (the preliminary injunction decision expires after six months, according to § 32 (6) BVerfGG), we can draw some insight from its decision. The BVerfG argued that, by being excluded from using its Facebook page, the right-wing party was ‘denied an essential opportunity to disseminate its political messages and actively engage in discourse with users of the social network,’ which would ‘significantly impede’ its visibility, especially during the run-up to the European elections (Der III. Weg, 2019).

The circumstances of the case and the reasoning of the BVerfG were very similar to those of the Tribunale di Roma in CasaPound v. Facebook, where another right-wing party had had its account suspended by Facebook. The Tribunale di Roma granted a precautionary measure against the suspension and found that Facebook has reached a level of systemic relevance regarding political participation under Art. 49 of the Italian Constitution. CasaPound’s right to political participation was potentially subject to irreparable damage pending ordinary proceedings (Golia & Behring, 2020).

The BVerfG emphasised inter alia that Facebook has ‘significant market power’ within Germany and that fundamental rights can be effective in disputes between private parties by means of the doctrine of indirect third-party effect of fundamental rights. Therefore, Art. 3 (1) Basic Law (GG) (‘All persons shall be equal before the law’) may have to be interpreted in ‘specific cases’ to force powerful private actors to respect equality of treatment provisions with regard to private contracts (see Maunz, Dürig, & Herdegen, 2019, Art. 1 (3) para. 64). The fact that in Der III. Weg the BVerfG argued that Facebook has to adhere to the principle of equal treatment in its interaction with its users in the same way the state has to adhere to this principle does not mean that this holds true with regard to other fundamental rights, in particular Art. 5 (1) (1) Basic Law (GG). As the BVerfG clarified in its Fraport decision, the scope of the indirect third-party effect of fundamental rights always depends on the ‘guaranteed scope [of the fundamental right]’ and the circumstances of the case (Fraport, 2011). This suggests that not only Art. 5 Basic Law (GG) but all relevant fundamental rights in question need to be considered and balanced in order to determine whether community standards can justify the deletion of a specific statement, even though it would be protected under Art. 5 Basic Law (GG).

With the introduction of the Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act (NetzDG)) in 2018 the issue received a lot of attention (Schulz, 2018; Kettemann, 2018, 2019; Heldt, 2019; Wagner, 2020; Peukert, 2018; Löber & Roßnagel, 2019; Bassini, 2019). Users have become more sensitive to the fact that platforms take down content that is (in most cases) permissible under statutory law. Since 2018, civil courts have decided the first put-back claims. Most of the cases concerned statements which constituted or were deemed to constitute hate speech according to the platform’s definition of hate speech (Facebook, 2020). Only in rare cases is the solution clear cut: only if the statement clearly violates the law, for instance § 130 German Criminal Code (StGB),3 will a put-back claim clearly fail.

This is not the case where the statements that have been taken down do not violate any laws but merely go against the platform’s terms of service or community standards; such statements might be protected under Art. 5 (1) (1) Basic Law (GG) (Maunz, Dürig, & Grabenwarter, 2019, Art. 5 (1) (2) para. 108). Art. 5 (1) (1) Basic Law (GG) protects the right of every person to freely express and disseminate their opinions without hindrance. ‘There shall be no censorship’, the Basic Law (GG) confirms. Still, limitations to freedom of expression do exist ‘in the provisions of general law, in provisions for the protection of young persons and in the right to personal honor’ (Maunz, Dürig, & Grabenwarter, 2019, Art. 5 (1) (2) para. 121, 190, 195). In these cases, there are different ways to argue and the courts – in the end – have the obligation to balance constitutional values.

With reference to Art. 5 (1) (1) Basic Law (GG), German courts found that Facebook is a ‘public marketplace’ for information and opinion-sharing,4 and that it therefore had to ensure – via the doctrine of indirect third-party effect of fundamental rights – that “zulässige Meinungsäußerungen” (admissible, i.e. legal, expressions of opinion) are not deleted.5 German courts concluded that platforms have a ‘substantial indirect meaningful duty’6 to protect the rights under Art. 5 (1) (1) Basic Law (GG). They argued that Facebook had developed a ‘quasi-monopoly’7 and that it is a private company offering a ‘public communicative space’. Therefore platforms8 would generally9 not be allowed to remove ‘admissible expressions of opinion’ and the community standards would not be allowed to exclude such content.10

Such restrictions on the terms of service, however, could only be explained by a direct and state-like duty to guarantee Art. 5 (1) Basic Law (GG), which many courts of instance have rejected so far.11 Their argument is convincing because the indirect binding effect of fundamental rights on private individuals is not about minimising interferences that restrict freedom, but about balancing fundamental rights.12 That is, the legitimate interests of the intermediary in setting its own communication standards – and ruling over its own private space – have to be balanced against the interests (and concomitant communication rights) of the affected user as well as those of other users and their right to information (Spindler, 2019, p. 8, para. 22).

It is in line with that reasoning that a contract between a user and Facebook constitutes a contract ‘sui generis’13 and that Facebook’s Declaration of Rights and Duties forms part of the terms of service (Allgemeine Geschäftsbedingungen, AGB). These were considered to be (partially) invalid insofar as they substantially disadvantage the user contrary to good faith (§ 307 German Civil Code (BGB)). The court found that the provision on the deletion of content and accounts in the terms of service could not survive the ‘disadvantage test’, since the provision restricted the reviewability of any decision to delete.

To put it concisely: social networks can prohibit hate speech that does not yet amount to criminally punishable content pursuant to § 1 (3) NetzDG, but only as long as deletion is not performed arbitrarily and users are not barred from the service without recourse. A private company, the court continued, that ‘takes over from the state the framework of public communication to such a degree’ must also have the ‘concomitant duties the state as a provider of essential services used to have’ (‘Aufgaben der Daseinsvorsorge’). Intermediaries have a right to police their platforms (‘virtuelles Hausrecht’14) and must have the right to delete uploaded content in order to avoid liability (Kettemann, 2019). But opinions that are protected under Art. 5 (1) (1) Basic Law (GG) enjoy a higher level of protection (from deletion by a private actor) than other forms of expression. The generic terms in the BGB allow for and demand an interpretation that ensures that constitutional guarantees are observed in contractual relations and by private actors. Thus, a violation of the terms of service does not always suffice to justify the deletion of a statement if that statement is protected under Art. 5 (1) (1) Basic Law (GG); this, in turn, restricts the rights of Facebook under Artt. 2, 12, 14 Basic Law (GG) (Maunz, Dürig, & Grabenwarter, 2019, Art. 5 (1) (2) para. 106, 143).

Conclusion: integrating public values into private contracts

The comparative analysis of German and US case law shows us that there is not just ‘one’ answer to the question of whether social network services incur ‘must carry’ obligations: it depends on the jurisdiction. In both the US and Germany, social network services may restrict content on their platform via terms of service. However, depending on the importance of the communication made (user-side) and the ‘significant market power’ (intermediary-side), social network services in Germany face restrictions in limiting access to the platform by suspending users or cancelling profile access contracts, via the concept of indirect third-party effect of fundamental rights.

This may include restrictions regarding the design of the terms of service (§§ 307, 305c BGB15), the interpretation of the terms of service in light of the Basic Law, and obligations that companies have to take into account under §§ 241 (2) and 242 BGB (good faith). There might even be grounds to argue for an exclusion of the ordinary right of termination for particularly important networks, due to the adverse effects that exclusion from the platform has on the fundamental rights of individual users; considering the self-defined general or issue-specific role of the platform (Twitter, 2020), such a network might even have an obligation to contract (§ 242 BGB). Whether the indirect third-party effect of fundamental rights that has been accepted for Art. 3 Basic Law (GG) is transferable to Art. 5 (1) (1) Basic Law (GG) is not clear yet. The German line of cases following the adoption of the Network Enforcement Act confirms that certain intermediaries – namely those with a key role for public communication – have duties towards private users under fundamental rights law, namely a duty to respect the equality principle.

On the other hand, US law and jurisprudence are in general reluctant to recognise fundamental rights-based duties for private intermediaries. As Knight v. Trump and other cases thoroughly analysed by Morris and Sarapin show, only parts of privately owned online communication spaces can be regarded as public fora, and only if they are used by government officials as such (Morris & Sarapin, 2020). In an offline setting, such as in Pruneyard v. Robins, the US Supreme Court, which has traditionally been very reluctant to apply fundamental rights obligations to private actors, acknowledged that under certain circumstances, and only if a private entity fulfills a public-forum function, it must tolerate unwanted speech. This jurisprudence has not impacted intermediaries and ‘must carry’ cases so far, mainly because Pruneyard v. Robins was decided against the backdrop of the California Constitution (Keller, 2019b). Rather, the courts continue to reject the argument that platforms are generally subject to constitutional speech guarantees (e.g., FAN v. Facebook). This is also because US law is very sensitive to interferences with free speech by the government. This becomes clear when looking at the First Amendment argument invoked by private companies against ‘must carry’ claims. The freedom from interference by the government goes further and protects private companies from being forced to restore and put back speech that they do not want to host on their platforms (negative free speech). The understanding of speech is much broader than what German jurisprudence would comfortably interpret as falling under Art. 5 (1) (1) Basic Law (GG). However, this leaves citizens less protected against interferences with their right to free speech by private actors. This is unfortunate, as private actors have become important providers of online communicative spaces.

In order to meet fundamental rights guarantees (applied horizontally), content-related standards need to be (and by now usually are) published, enshrined in terms of service that meet fundamental rights standards, formulated as general rules that are applied non-arbitrarily, and complemented by effective recourse against deletions and suspensions, as foreseen, for example, by the Council of Europe Recommendation on Intermediaries (Council of Europe, CM/Rec(2018)2, 2018). With Facebook’s introduction of revised values and a charter for an Oversight Board, content governance is progressively ‘constitutionalised’. But the scope of review is limited to ‘single-user content’ that was taken down: it does not cover ranking decisions or ‘shadow banning’ (Menegus, 2019) and, most importantly, does not include the possibility of reviewing Facebook’s algorithms (Douek, 2019).

We expect other platforms to watch this development closely and potentially follow. The step of implementing values can be considered a reaction by Facebook to the demand of a number of scholars (Kadri & Klonick, 2019) for ‘constitution-building’ within the platform (Kettemann, in press). What they are trying to do is to implement self-regulation first, before governments force them to implement regulation which might be difficult to enforce or bad for business. Further, the values and the Oversight Board can be a vehicle for Facebook to add legitimacy to its actions and to outsource controversy while achieving a higher level of actual compliance with its policies (Douek, 2019). In that way, platforms’ anticipatory normative action spares governments the need to enact (and enforce) actual laws – and at the same time makes it more difficult for affected users to challenge takedowns in courts (Bassini, 2019, p. 186), especially in the US (Keller, 2019b). This is why the horizontal application of fundamental rights is so important as a concept.

We argue that insofar as platforms serve as (quasi-)public fora for communication, this influences the ‘normative order’ in which they operate (Kettemann, in press). The German approach to this question offers elements worth considering. The reasoning of the BVerfG’s Stadionverbot decision, transferred to the digital sphere, can already be regarded as a ‘must carry’ obligation on the internet with regard to the access-related dimension of online content. It is very likely that, after the preliminary injunction decision in Der III. Weg, the BVerfG will extend its reasoning on the indirect third-party effect of fundamental rights on platforms from the principle of equal treatment to Art. 5 (1) (1) Basic Law (GG). This is appropriate and will facilitate a transparent process of balancing the fundamental rights in conflict.

The reasoning behind the third-party effect of fundamental rights is not confined to one or two European jurisdictions. The Court of Justice of the European Union confirmed the indirect third-party obligation of fundamental rights when assessing how search engines have to balance fundamental rights in the context of de-referencing decisions:

(…) It is thus for the operator of a search engine to assess, (…) in the light of all the circumstances of the case, (…) if (…) the inclusion of the link in question is strictly necessary for reconciling the data subject’s rights to privacy and protection of personal data with the freedom of information of potentially interested internet users (…) (CNIL, 2019).

Drittwirkung by another name – the horizontal application of fundamental rights – is thus a common theme of CJEU jurisprudence as well.16 But even acknowledging that platforms have a ‘must carry’ obligation does not mean they ‘have to carry’ any and all content. As the CJEU confirms, they can still restrict content in specific cases after balancing the fundamental rights at stake.

This holistic approach to the normative order of online speech is less concerned with public versus private ownership of the communicative space and focuses instead on the function of online speech. We conclude that this approach makes much sense in times of divergence of online actors and redistribution of responsibilities for governing the public sphere. It is thus time to – figuratively – back up and consider the potential impact of the horizontal application of human rights on the normative order of private-public interaction on the internet as a whole, including governance by algorithms and governance by affordance, which influences the way speech is communicated and received. ‘Must carry’ cases and put-back attempts draw our attention – with much potential gain – to clashes between private and public orders, between public law and private law.

References

Balkin, J. M. (2004). Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society. New York University Law Review, 79(1), 1–58. https://www.nyulawreview.org/wp-content/uploads/2018/08/NYULawReview-79-1-Balkin.pdf

Bassini, M. (2019). Fundamental rights and private enforcement in the digital age. European Law Journal, 25(2), 182–197. https://doi.org/10.1111/eulj.12310

Botella, E. (2019). TikTok Admits It Suppressed Videos by Disabled, Queer and Fat Creators. Slate. https://slate.com/technology/2019/12/tiktok-disabled-users-videos-suppressed.html

Bickert, M. (2019, September 12). Updating the Values That Inform Our Community Standards [Press release]. https://newsroom.fb.com/news/2019/09/updating-the-values-that-inform-our-community-standards

Coche, E. (2018). Privatised enforcement and the right to freedom of expression in a world confronted with terrorist propaganda online. Internet Policy Review, 7(4). https://doi.org/10.14763/2018.4.1382

Council of Europe, Committee of Ministers to Member States. (2018, March 7). Recommendation CM/Rec(2018)2, On the roles and responsibilities of internet intermediaries. Council of Europe. https://search.coe.int/cm/Pages/result_details.aspx?ObjectID=0900001680790e14

Citron, D. K., & Norton, H. (2011). Intermediaries and Hate Speech: Fostering Digital Citizenship for Our Information Age. Boston University Law Review, 91, 1435–1484. https://scholar.law.colorado.edu/articles/178/

Citron, D. K., & Wittes, B. (2017). The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity. Fordham Law Review, 86(2), 401–423. https://ir.lawnet.fordham.edu/flr/vol86/iss2/3

Citron, D. K., & Franks, M. A. (2020). The Internet as a Speech Machine and Other Myths Confounding Section 230 Reform (Public Law & Legal Theory Paper No. 20-8). Boston University School of Law. https://scholarship.law.bu.edu/cgi/viewcontent.cgi?article=1833&context=faculty_scholarship

Douek, E. (2019). ‘Facebook’s “Oversight Board”: Move fast with stable infrastructure and humility’. North Carolina Journal of Law & Technology, 21(1). http://ncjolt.org/wp-content/uploads/2019/10/DouekIssue1_Final_.pdf

European Parliament. (2019). Freedom of expression, a comparative law perspective. The United States (Comparative Law Library Unit Study No. PE 642.246). European Parliamentary Research Service. https://www.europarl.europa.eu/RegData/etudes/STUD/2019/642246/EPRS_STU(2019)642246_EN.pdf

European Union Code of Conduct against illegal hate speech online. (2016). https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en

Facebook. (2019, September 17). Establishing Structure and Governance for an Independent Oversight Board [Press release]. https://newsroom.fb.com/news/2019/09/oversight-board-structure

Facebook. (2020). Community Standards. https://m.facebook.com/communitystandards

Facebook. (2020). Give people the power to build community and bring the world closer together. https://www.facebook.com/pg/facebook/about

Goldman, E. (2017). The Ten Most Important Section 230 Rulings. Tulane Journal of Technology and Intellectual Property, 20. https://journals.tulane.edu/TIP/article/view/2676/

Golia, A., Jr., & Behring, R. (2020, February 18). Private (Transnational) Power without Authority: Online fascist propaganda and political participation in CasaPound v. Facebook. Verfassungsblog. https://doi.org/10.17176/20200218-164225-0

Han, B.-C. (2014). Krise der Freiheit. In Psychopolitik. Neoliberalismus und die neuen Machttechniken (pp. 9–24). Fischer Verlag.

Heldt, A. P. (2019). Reading between the lines and the numbers: an analysis of the first NetzDG reports. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1398

Hill, S. (2019). Empire and the megamachine: comparing two controversies over social media content. Internet Policy Review, 8(1). https://doi.org/10.14763/2019.1.1393

Hölig, S., & Hasebrink, U. (2019). Reuters Institute Digital News Report 2019: Ergebnisse für Deutschland (HBI Working Paper No. 47). Leibniz-Institut für Medienforschung, Hans-Bredow-Institut. https://www.hans-bredow-institut.de/uploads/media/default/cms/media/os943xm_AP47_RDNR19_Deutschland.pdf

Kadri, T., & Klonick, K. (2019). Facebook v. Sullivan: Public Figures and Newsworthiness in Online Speech (Legal Studies Research Paper No. 19-0020). St. John’s University School of Law. https://doi.org/10.2139/ssrn.3332530

Keller, D. (2019a). Dolphins in the Net: Internet Content Filters and the Advocate General’s Glawischnig-Piesczek v. Facebook Ireland Opinion. Stanford Center for Internet and Society. https://cyberlaw.stanford.edu/files/Dolphins-in-the-Net-AG-Analysis.pdf

Keller, D. (2019b). Who do you sue? State and platform hybrid power over online speech (Aegis Series Paper No. 1902). Hoover Institution. https://www.hoover.org/research/who-do-you-sue

Keller, D. (2018). Internet Platforms observations on speech, danger, and money (Aegis series paper No. 1807). Hoover Institution. https://www.hoover.org/research/internet-platforms-observations-speech-danger-and-money

Kettemann, M. C. (2019). Stellungnahme als Sachverständiger für die öffentliche Anhörung zum Netzwerkdurchsetzungsgesetz auf Einladung des Ausschusses für Recht und Verbraucherschutz des Deutschen Bundestags. https://www.hans-bredow-institut.de/uploads/media/default/cms/media/up8o1iq_NetzDG-Stellungnahme-Kettemann190515.pdf

Kettemann, M. C., & Schulz, W. (2020). Setting Rules for 2.7 Billion. A (First) Look into Facebook’s Norm-Making System: Results of a Pilot Study (Working Papers: Works in Progress No. 1). Hans-Bredow-Institut. https://leibniz-hbi.de/uploads/media/Publikationen/cms/media/5pz9hwo_AP_WiP001InsideFacebook.pdf

Kettemann, M. C. (In press). The Normative Order of the Internet. Oxford University Press.

Klonick, K. (2018). The New Governors: The People, Rules, and Processes Governing Online Speech. Harvard Law Review, 131(6), 1598–1670. https://harvardlawreview.org/2018/04/the-new-governors-the-people-rules-and-processes-governing-online-speech

Löber, L. I., & Roßnagel, A. (2019). Das Netzwerkdurchsetzungsgesetz in der Umsetzung – Bilanz nach den ersten Transparenzberichten. MMR, 22(2), 71–75.

Maunz, T., Dürig, G., Grabenwarter, C., & Herdegen, M. (2019). Kommentar zum Grundgesetz (GG) Teil B, (89th ed). C.H. Beck.

Menegus, B. (2019). Facebook patents shadowbanning. Gizmodo. https://gizmodo.com/facebook-patents-shadowbanning-1836411346

Morris, P. L. & Sarapin, S. H. (2020). You can’t block me: When social media spaces are public forums. First Amendment Studies. https://doi.org/10.1080/21689725.2020.1742760

Peukert, A. (2018). Gewährleistung der Meinungs- und Informationsfreiheit in sozialen Netzwerken - Vorschlag für eine Ergänzung des NetzDG um sog. Put-back-Verfahren. MMR, 21(9), 572–578.

Ruggie, J. (2008, April 7). Human Rights Council. Protect, Respect and Remedy: a Framework for Business and Human Rights. Report of the Special Representative of the Secretary-General on the issue of human rights and transnational corporations and other business enterprises [Report No. A/HRC/8/5]. United Nations. https://www.business-humanrights.org/sites/default/files/reports-and-materials/Ruggie-report-7-Apr-2008.pdf

Ruggie, J. (2011, March 21) United Nations Guiding Principles on Business and Human Rights: Implementing the United Nations ‘Protect, Respect and Remedy’ Framework. Report of the Special Representative of the Secretary-General on the issue of human rights and transnational corporations and other business enterprises (UN Doc. A/HRC/17/31). United Nations.

Saunders, K.W. (2017). Free Expression and Democracy. A comparative analysis. Cambridge University Press. https://doi.org/10.1017/9781316771129

Schulz, W. (2018). Regulating Intermediaries to Protect Privacy Online – the Case of the German NetzDG (HIIG Discussion Paper Series No. 2018-01). Alexander von Humboldt Institute for Internet and Society. https://www.hiig.de/wp-content/uploads/2018/07/SSRN-id3216572.pdf

Schwarz, O. (2019). Facebook Rules: Structures of governance in digital capitalism and the control of generalized social capital. Theory, Culture & Society, 36(4). https://doi.org/10.1177/0263276419826249

Spindler, G. (2019). Löschung und Sperrung von Inhalten aufgrund von Teilnahmebedingungen sozialer Netzwerke. Computer und Recht, 35(4), 238–247. https://doi.org/10.9785/cr-2019-350411

Suzor, N. (2018). Digital Constitutionalism: Using the Rule of Law to Evaluate the Legitimacy of Governance by Platforms. Social Media + Society, 4(3). https://doi.org/10.1177/2056305118787812

Twitter (2019, June 27). Defining Public Interest on Twitter [Blog post]. https://blog.twitter.com/en_us/topics/company/2019/publicinterest.html

Twitter (2020). We believe in free expression and think every voice has the power to impact the world. Retrieved from https://about.twitter.com/en_us/values.html

van Dijck, J. (2013). The Culture of Connectivity: A Critical History of Social Media. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199970773.001.0001

Wagner, G. (2020). Haftung von Plattformen für Rechtsverletzungen (Teil 1). GRUR, 122(4), 329–337.

Zakon, R. (2017). Hobbes’ Internet Timeline 10.2. https://www.zakon.org/robert/internet/timeline

Cases

Bierdosen-Flashmob (2015). Decision of BVerfG - 1 BvQ 25/15. BVerfGE 139, 378.

Brittain v. Twitter, Inc. (2018). Northern California District Court, CV-18-01714-PHX-DGC.

Buza v. Yahoo!, Inc. (2011). WL 5041174, 1.

CasaPound v. Facebook, Inc. (2019). R. G. 59264/2019.

Cengiz and Others v. Turkey. (2015). Applications nos. 48226/10 and 14027/11.

Chukwurah v. Google, Inc. (2020). No. 8:2019cv00782.

Cox v. Twitter, Inc. (2019). 2:18-2573-DCN-BM (D.S.C.).

Cyber Promotions, Inc. v. America Online, 948 F. Supp. 436 (E.D. Pa. 1996).

Der III. Weg. (2019). 1 BvQ 42/19. NJW 2019, 1935.

Ebeid v. Facebook, Inc. (2019). WL 2059662, 6.

FAN v. Facebook, Inc. (2019). Case No. 18-CV-07041-LHK.

Fraport (2011). Decision of BVerfG - 1 BvR 699/06. BVerfGE 128, 226 – 278.

Freedom Watch, Inc., individually and on behalf of those similarly situated, and Laura Loomer, individually and on behalf of those similarly situated, Palm Beach, Florida v. Google Inc., et al. (2020). Appeal from the United States District Court for the District of Columbia (No. 1:18-cv-02030).

Fyk v. Facebook, Inc. (2019). Northern California District Court, C 18-05159 JSW.

GC and Others v. Commission nationale de l’informatique et des libertés (CNIL). (2019). Case no. C-136/17. Digital reports, ECLI:EU:C:2019:773.

Glawischnig-Piesczek v. Facebook Ireland Limited. (2019). Case no. C-18/18, ECLI:EU:C:2019:821.

John J. Hurley and South Boston Allied War Veterans Council v. Irish-American Gay, Lesbian and Bisexual Group of Boston et al. (1995). 515 U.S. 557.

Johnson v. Twitter Inc. (2018). California Superior Court, 18CECG00078.

Kimbrell v. Twitter Inc. Northern California District Court, 18-cv-04144-PJH.

Knight First Amendment Inst. at Columbia Univ. v. Trump. (2017). No. 1:17-cv-5205 (S.D.N.Y.), No. 18-1691 (2d Cir.).

Langdon v. Google, Inc. (2007). 474 F. Supp. 2d 622, 632.

Manhattan Community Access Corporation v. Halleck (2019). No. 17-702, reviewing 882 F. 3d 300 (2d Cir. 2018).

Marsh v. Alabama. (1946). 326 U.S. 501.

Mezey v. Twitter Inc. (2018). Florida Southern District Court, 1:18-CV-21069.

Murphy v. Twitter Inc. (2019). San Francisco Superior Court, CGC-19-573712.

Prager University v. Google. (2018). WL 1471939, 8.

Pruneyard Shopping Center v. Robins. (1980). 447 U.S. 74.

Smith v. California. (1959). 361 U.S. 147.

Stadionverbot (2018). Decision of BVerfG - 1 BvR 3080/09. BVerfGE 148, 267 – 290.

Turner Broad. Sys. v. FCC. (1994). 512 U.S. 622, 629 at 657.

Turner Broadcasting System, Inc. v. FCC. (1997). 520 U.S. 180.

Twitter Inc. v. The Superior Court for the City and County of San Francisco. (2018). California Court of Appeal, A154973.

Williby v. Zuckerberg. (2019). Northern California District Court, 18-cv-06295-JD.

Zeran v. America Online, Inc. (1997). 129 F.3d 327, 330 (4th Cir. 1997).

Footnotes

1. ‘Must carry’ from a US perspective originated in a set of rules instituted by the Federal Communications Commission (FCC) in 1965. Originally, ‘must carry’ obliged cable television networks to carry particular (local) programmes, with the aim of ‘preserv[ing] a multiplicity of broadcasters’ (Turner II v. FCC, 1997). In the context of communications law, ‘must carry’ is not alien to the European and, in particular, the German legal order: it forms part of the statutory broadcasting obligations that apply under the German Interstate Broadcasting Treaty (RStV) to so-called platform providers, in particular cable network operators. The context in which ‘must carry’ arguments are put forward has, however, expanded in recent years. Speakers on online platforms in the US have tried to use the idea behind ‘must carry’ in a different setting: they seek to force privately owned (social media) platforms to carry (i.e., publish) their speech, arguing that ‘a private entity becomes a state actor through its operation’ of the private property as ‘a public forum for speech’ (Cyber Promotions v. America Online, 1996). The approach to the obligations of private digital platforms regarding speech to be carried differs significantly between the US and Germany.

2. E.g. Der III. Weg (2019) 1 BvQ 42/19. NJW 2019, 1935; Regional Court Berlin (LG Berlin) (2018) 31 O 21/18; Regional Court Offenburg (LG Offenburg) (2018) 2 O 310/18; Higher Regional Court Munich (OLG München) (2018) 18 W 1294/18; District Court Tübingen (AG Tübingen) (2018) 3 C 26/18; Regional Court Bamberg (LG Bamberg) (2018) 2 O 248/18.

3. OLG Stuttgart (2018) 4 W 63/18; ‘Drecksvolk’ (2018), 1 OLG 21 Ss 772/17.

4. Higher Regional Court Frankfurt/Main (OLG Frankfurt/Main) (2017) 16 U 255/16 at 28.

5. Similarly, Higher Regional Court Munich (OLG München) (2018) 18 W 858/18 (LG München I).

6. Higher Regional Court Stuttgart (OLG Stuttgart) (2018) 4 W 63/18 at 73.

7. Higher Regional Court Dresden (OLG Dresden) (2018) 4 W 577/18.

8. Higher Regional Court Berlin (KG Berlin) (2019) 10 W 172/18 at 17.

9. Higher Regional Court Munich (OLG München) (2018) 18 W 1955/18 at 19 et seq. (possible exception for subforums).

10. Higher Regional Court Munich (OLG München) (2018) 18 W 858/18 at 30; 18 W 1873/18 at 21; 18 W 1383/18 at 20 et seq.; 18 W 1294/18 at 28; Regional Court Karlsruhe (LG Karlsruhe) (2018) 11 O 54/18 at 12; Regional Court Frankfurt/Main (LG Frankfurt/Main) (2018) 2-03 O 182/18 at 16; Regional Court Bamberg (LG Bamberg) (2018) 2 O 248/18 at 86.

11. Higher Regional Court Dresden (OLG Dresden) (2018) 4 W 577/18 at 19 et seq.; Higher Regional Court Karlsruhe (OLG Karlsruhe) (2019) 6 W 81/18 at 51 et seq.; Higher Regional Court Karlsruhe (OLG Karlsruhe) (2018) 15 W 86/18 at 21; Higher Regional Court Stuttgart (OLG Stuttgart) (2018) 4 W 63/18 at 71; Regional Court Offenburg (LG Offenburg) (2019) 2 O 329/18 at 80; Regional Court Bremen (LG Bremen) (2019) O 1618/18 at 59; Regional Court Heidelberg (LG Heidelberg) (2018) 1 O 71/18 at 38.

12. Higher Regional Court Karlsruhe (OLG Karlsruhe) (2019) 6 W 81/18 at 52.

13. Higher Regional Court Munich (OLG München) (2018) 18 W 1294/18 (LG München II).

14. Regional Court Bonn (LG Bonn) (1999) 10 O 457/99.

15. Higher Regional Court Dresden (OLG Dresden) (2018) 4 W 577/18 (LG Görlitz).

16. The CJEU interprets the relevant provisions ‘in the light’ of fundamental rights without referring to a third-party effect or ‘must carry’, for example in Alemo-Herron v. Parkwood Leisure Ltd., Case no. C-426/11 at 29-30, Digital reports, ECLI:EU:C:2013:521; Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014), Case no. C-131/12 at 68 and 74, Digital reports, ECLI:EU:C:2014:317; Y.S. v. Minister voor Immigratie, Integratie en Asiel (2014), Case no. C-141/12 at 54; Opinion of Advocate General Poiares Maduro in International Transport Workers’ Federation, Finnish Seamen’s Union v. Viking Line ABP, OÜ Viking Line Eesti (2007), Case no. C-438/05 at 39, Digital reports, ECLI:EU:C:2007:772; Opinion of Advocate General Trstenjak in Dominguez v. Centre informatique du Centre Ouest Atlantique, Préfet de la région Centre (2012), Case no. C-282/10 at 83.
