Ruling by bullying: Threats of regulation as an internet governance device
Abstract
Government by regulation structures how constitutional democracies normally operate. Legislatures and executive agencies enact formal rules that govern conduct, embodying the ideal of government by laws rather than by individuals. Yet regulators also govern through threats of regulation. When public officials seek to alter private behavior, they may warn regulated actors that failure to comply will trigger new or stricter rules. These warnings can achieve regulatory goals without the adoption of formal rules. Because officials often issue such threats in informal, private communications, the practice escapes public scrutiny and challenges the dominant model of democratic rule-making, which assumes open deliberation by accountable institutions. This paper theorizes threats of regulation as a governance device that remains largely invisible to outsiders but offers significant advantages to regulators. Although United States courts attempt to distinguish unlawful coercion from permissible persuasion, they struggle to enforce these boundaries in practice. The paper argues that increasing transparency in routine communications between regulators and corporate actors would reduce the risk of abuse while preserving regulatory effectiveness.
Acknowledgements
This paper has been long in the making. I would like to thank commentators at the IGF's GigaNet symposium held in Kyoto in 2023, where I presented a rough version of this research. My colleagues at CELE pushed me to make the argument sharper. I have benefited from discussions with Agustina Del Campo, Matías González, Rachel Griffin, Robert Gorwa, Joan Barata and Michael Karanicolas, and from the research assistance of David Turbay, who helped gather and analyse large volumes of data from different publicly available data sets and sources of information. I would also like to thank Soledad Vázquez for proof-reading the last version of the manuscript. Finally, I would especially like to thank the reviewers and the editors of Internet Policy Review for helpful comments that forced me to add some nuance and to develop a necessary update considering recent events. The errors that remain are mine alone.
Introduction
Generally, it is reasonable to say that companies dislike regulation. In a generally capitalist free-market context (which this paper assumes), companies can develop their business free of outside demands and constraints (Becker, 1983; Erfle et al., 1989). Regulation imposes on them duties and obligations that most companies would avoid if possible. However, regulation can have effects even when not fully implemented: regulatory threats may nudge companies in certain directions by tapping into their desire to prevent that regulation from becoming a reality (Karanicolas, 2019a, p. 186). Recent studies show how companies strive to stay ahead of the curve of proposed regulations (Chang et al., 2023) and how regulatory threats operate pervasively across fields and industries as a device that nudges companies into “voluntary schemes” of compliance with public officials’ desires and goals (Gorwa, 2019; Hall & Hysing, 2019; Linder, 2017; Lyon & Maxwell, 2004; Maxwell et al., 2000a; Patten & Trompeter, 2003; Short & Toffel, 2010; Suijs & Wielhouwer, 2019). By yielding to some of the demands of would-be regulators, companies can appease the concerns of public officials and thereby diminish their commitment to regulate. Evidence of this dynamic exists in environmental matters (Khanna & Anton, 2002; Maxwell et al., 2000b; Reid & Toffel, 2009), in broadcasting (Cooper, 1978; Horwitz, 1993; Lerner, 1978), in the energy sector (Acutt et al., 2001a; Erfle et al., 1990), in the ready-to-eat cereal market (Cotterill, 1999), in mining (Verkuil, 1980, p. 946), in the Swedish textile industry’s commitments to reduce the use of certain chemicals (Hall & Hysing, 2019), and in hedge funds (Suijs & Wielhouwer, 2019), among others. The dynamic is also present in internet governance (Balkin, 2014; Bambauer, 2015; Citron, 2017; Duffield, 2022; Hagemann et al., 2018; Karanicolas, 2019b, 2019a; Leerssen, 2020; Monahan, 2021; Sander, 2021).
This paper theorises threats of regulation as a specific form of governance that taps into the regulatory power of the state with the goal of keeping it dormant, in order to avoid “the fuss and mess of formal rulemaking” (Hagemann et al., 2018, p. 53; Héritier & Lehmkuhl, 2008; Hunt & Wickham, 1994). The practice has been defended as a useful mechanism in the deployment of regulatory power, especially in contexts of high uncertainty and dynamic industries (Brotman, 1988; Wu, 2011). But it has also been questioned in terms of its efficacy and potential arbitrariness (Brito, 2014). In the United States, legal standards have tried to define what is permissible and what is not in the contacts between public and corporate officials (Bazelon, 1975; Corn-Revere, 1995; Duffield, 2022). But these standards emerged in unusual circumstances, after litigation by an affected third party (Writers Guild of America, West, Inc. v. FCC, 1976; National Rifle Association of America v. Vullo, 2024) or when corporations decided to resist the pressure exerted upon them by public officials (Bantam Books Inc. v. Sullivan, 1963). Both conditions are rare, for reasons I will explore in this paper. In other jurisdictions, regulatory threats do not pose a legal problem at all. The paper moves forward in the following way.
In the first section, I state the problem and offer a definition of regulatory threats. I propose a two-axis compass to identify the forms in which officials may attempt to persuade or coerce individuals and corporations, ranging from simple verbal pressure to legislative or administrative steps toward regulation (hearings, green or white papers, legislative inquiries, requests for information, formal letters, guidance documents, etcetera). When accompanied by specific pre-regulatory acts, it is reasonable to assume that public officials’ commitment to regulation is higher, and thus the potential nudging effects of threats are larger. Regulatory threats are different from enforcement threats, but pose closely related problems.
The second section briefly introduces the normative problems involved in regulatory threats but also their usefulness as a way of communicating the state’s willingness to escalate its regulatory strategy (Ayres & Braithwaite, 1992, p. 38). This is the key to the conundrum: we want public officials to be effective in office, but we simultaneously want to prevent abuses. The section also discusses why large internet platforms are especially susceptible to this form of governance: the rules of intermediary liability have spared them from potential costs associated with user-generated content, they extract value from these rules, and – as chokepoints or points of control of a decentralized network – they can play a meaningful role assisting governments in combating illegal activities over the Internet, such as illegal drug markets or CSAM (Balkin, 2014; Karanicolas, 2019a; Keller, 2019, 2023; Tusikov, 2016a; Zittrain, 2003). Given the incentive structure stemming from this regime, these companies are “far more vulnerable to Government pressure than other news sources” (Murthy v. Missouri, 2024, p. 5). Unlike newspapers, the modern state exerts much more regulatory power over them, and it also relies on them to combat some forms of crime specifically associated with the internet. This makes regulatory threats a governance mechanism of crucial importance on the internet.
The third section posits that the legal standard developed by American courts to prevent abuses through regulatory threats or jawboning is difficult for courts to administer and dependent on highly unlikely litigation. Threatening patterns are hard to distinguish from normal processes of policy-making, and abuses can only emerge in context and if full access to communications between corporate and public officials is gained. For that reason, I conclude with a narrow regulatory proposal that seeks to shed light on communications between public officials and corporate officers. Without these measures of enhanced transparency, the mechanism will remain opaque and prone to abuse.
A. Threats of regulation as a mechanism
The rule-of-law model of democratic decision-making stands for a very basic proposition: that we make rules through our representatives in a legislative process to reach basic agreements that we lay down as legal rules destined to govern our conduct (Post, 2010, p. 1343; Raz, 2009). This instrumentalist conception of the law sees it as an outcome of a process in which deliberation of proposed rules within a legislative assembly leads towards consensus or majority rule, and thus new rules and regulations are created. While the model has been more or less challenged by the rise of the administrative state (Posner & Vermeule, 2010; Rakoff, 2000, p. 161), it basically holds true for a large part of modern rule-making in a democratic society.
Threats of regulation should be seen within this model and as part of its early stages, encompassing the steps that usually precede actual regulation, such as official inquiries, efforts at gathering information, drafting a bill, and so on. These pre-regulatory steps are sometimes public and sometimes private. A legislator, before introducing a bill, usually gathers information, holds meetings with experts and relevant stakeholders on the issue that matters to her, asks for information and feedback, and so on. In those meetings, public officials express desires, present concerns, and ask questions. These statements are often meant to convey an argument, to persuade, and to ask for conduct they would like to see adopted. Sometimes, the very utterance of those desires or requests is enough to influence the conduct of private actors potentially subjected to regulation, who may comply to appease the concerns of public officials. If they do, the public officials might feel that regulation is no longer necessary (Halfteck, 2008, p. 632). All of this happens – as well – in a context of lobbying and negotiations, where companies try to get the regulation they want or, at least, one that approximates their interests. Regulatory threats, as discussed here, are therefore part of the normal dialogue that precedes regulation in modern democratic societies.
Researchers have consistently documented the effectiveness of threats and informal mechanisms to pressure individuals and firms in a wide set of settings: pension fund administrators pressed to invest in housing (Schotland, 1970), price-setting actors amidst inflation (Ackley, 1978, p. 508; Bartels, 1983; Fisher, 1970; Haberler, 1972), publicly-owned companies under the command-and-control model of the Securities and Exchange Commission (SEC) (Hipple & Harkelroad, 1975, p. 697), Federal Reserve officers under pressure from politicians (Kane, 1990, p. 291), electric companies deciding on the price of utilities (Joskow & MacAvoy, 1975, p. 296), banks deciding on interest rates amidst a steep political crisis (Glazer & McMillan, 1992, pp. 1097–1098), directors at companies that were undesirably interlocked (Jorgensen & Clark, 1980), banks under pressure by regulators to divest from fossil fuels (Sinclair, 2020), network industries (Haucap et al., 2007), and so on. While most research has centred on the United States, these mechanisms have been found at work in gas price control policies in Spain (Perdiguero, 2004), in the pricing of utilities in the United Kingdom (Acutt et al., 2001b), in the Swedish textile industry (Hall & Hysing, 2019), and in Indian and European internet governance politics (Citron, 2017; Karanicolas, 2021a).
While some see this mechanism as an “entirely novel way” of inducing regulatory changes (Halfteck, 2008, p. 636), it can be viewed as a necessary part of democratic rule-making. In this account, officials are granted authority to enact rules and engage in formal or informal spaces where rules and their enforcement are imagined, proposed, demanded, or discussed. Sometimes what is said in these spaces elicits conduct change without the need for formal rule-making. In fulfilling their duties, public officials may freely share their “views and criticize particular beliefs” and “do so forcefully to persuade others to follow” their lead (National Rifle Association of America v. Vullo, 2024, pp. 8–9). Public officials have always “achieved their purposes largely through informal means—by giving speeches, threatening enforcement, or advising on their view of the law” (Rakoff, 2000, p. 162). In employing these means, they seek not only to persuade but also to pressure and encourage changes in conduct.
Veiled threats of using their powers in more assertive ways are part of these dynamics. This is where United States courts have attempted to draw a line between what is permissible and what is not. For that reason, most American legal scholars define jawboning in ways that connote coercion.1 Those who have studied these dynamics outside of the narrow ground of the First Amendment and outside of the United States tend to define them in less normatively charged ways and often speak of regulatory threats as a matter of course (Acutt et al., 2001b; Cotterill, 1999, p. 197; Erfle et al., 1989, p. 150; Hall & Hysing, 2019, p. 1006; Haucap et al., 2007, p. 172; Khanna & Anton, 2002; Maxwell et al., 2000b; Noah, 1997; Reid & Toffel, 2009). It is only in the American context that they are seen as normatively problematic. Outside the United States, regulatory threats have not been regarded as raising fundamental or constitutional rights questions. Neither in Europe nor in Latin America, for instance, has the legality of regulatory threats been considered a problem by courts, and – to the best of my knowledge – only a handful of authors have addressed the issue from a European perspective (van de Kerkhof, 2023; Leerssen, 2024).
The line drawn by American courts in their case law is – however – somewhat fictitious. Words by public officials always carry the regulatory power of the state as a necessary background and are clothed with their authority. Some exert more power than others: a lone senator trying to persuade a big online retailer may not have a lot of direct authority over its operations (Kennedy v. Warren, 2023), while a single board member capable of making a regulatory proposal within the antitrust authority may be very persuasive in her demands. Either way, corporations are susceptible to these dynamics.
Indeed, research shows that firms strive to “stay ahead of the curve and prepare for future regulatory developments, long before the proposed regulations are finalized and codified” (Chang et al., 2023, p. 1; Halfteck, 2008, p. 662). As Keller noted, platforms engage in “anticipatory obedience” that “spares governments the need to enact actual laws” (Keller, 2019, p. 2). Ayres and Braithwaite suggest that modern regulators face a key question: “when to punish, when to persuade” (Ayres & Braithwaite, 1992, p. 21). It is not a matter of either/or but of choice, because words by public officials can by themselves affect the world they intend to govern, whether they seek to persuade or coerce. As Harbath and Perault recall from their experience as Facebook executives, corporate officials feel the weight of the office held by those making demands (Harbath & Perault, 2023). This applies both to corporations facing potential regulation – such as internet companies – and to those subjected to well-established regulatory authority, such as broadcasters. When Chairman Wiley of the FCC pressured broadcasters to adopt a Family Hour policy for minors, he argued that he was simply sharing a concern and making a suggestion. “Here are some thoughts I have,” he testified, conveying more a suggestion than a threat (Writers Guild of America, West, Inc. v. FCC, 1976, p. 1092). But those at the receiving end of his words swiftly complied with his desires. Just as I was finishing the last revision of this paper, Jimmy Kimmel’s show on ABC was pulled off-air after the FCC chairman publicly pressed the network to do so (Koblin, 2025).
When public officials speak, their words carry some weight. They always operate under a “shadow of hierarchy” (Héritier & Lehmkuhl, 2008). This state power is also what makes guidance documents and non-legislative rules so effective (Mendelson, 2006, pp. 400–401).
If this account holds, then it makes sense to think of regulatory threats capaciously, which leads to a definition that is broader than alternative accounts (Halfteck, 2008; Wu, 2011, p. 1844). A threat of regulation is defined here as any kind of public or private utterance or action by public officials who hold regulatory power over others in which they express, suggest, or imply – clearly or in a veiled manner – their desire for the subject’s conduct to shift in a particular direction. The regulatory response to a failure to comply need not be explicit; it can be implied in the very dynamics of the policy-making at play. Against Wu, I do not exclude a priori from this definition “mere policy guidelines, studies, reports, and similar materials” (Wu, 2011, p. 1844), which can very well be part of a threatening pattern. Unlike Halfteck, I believe that regulatory threats are a more accurate concept than legislative threats (Halfteck, 2008). And against Bambauer, I do not believe that the definition of a threat should be normatively charged to suggest that threats are bad and should necessarily be curbed as a mechanism of governance (Bambauer, 2015). The broader definition I propose is intended primarily for descriptive purposes. It reflects how threats operate in practice, based on the most detailed accounts produced through judicial inquiries in the United States, but it is also consistent with how other disciplines have theorized and researched regulatory threats in other sectors (Acutt et al., 2001b; Cotterill, 1999; Erfle et al., 1989, 1990; Hall & Hysing, 2019; Khanna & Anton, 2002; Maxwell et al., 2000b; Reid & Toffel, 2009).

I propose a two-axis compass as an approximate guide to classify these actions (Figure 1). On the one hand, the private/public distinction describes the extent to which the action can be seen from the outside: the more private it is, the less likely it will become public. On the other, the informal/formal divide also captures a relevant difference. Informal contacts between public officials and subjects (individuals or corporations) under their hypothetical regulatory power are the bread and butter of the administrative state. But sometimes those contacts occur in formal settings, such as working groups, periodic and formal meetings that are part of an ongoing public policy process, and so on. Again, the more formal the setting, the more likely it is to become visible to an outside observer. These differences are important for theoretical and practical reasons. From a theoretical point of view, the visibility of these events is relevant for how policy-making is supposed to happen in a democratic community (van de Kerkhof, 2023). The open legislature (with open debates, open and public meetings, and other forms of openness) offers the regulatory ideal, where the formal process of rule-making can be monitored by outsiders and – eventually – challenged. From a practical perspective, the difference is relevant because, as one moves further towards the upper-right quadrant of the compass, events become more “observable” through research. By contrast, events in the lower-left quadrant are less transparent and less observable through normal research techniques.
B. Desirability and normative problems
Some see benefits in the use of regulatory threats. For instance, researchers who studied the Environmental Protection Agency (EPA) and the Securities and Exchange Commission (SEC) in the United States have found that when official pressure is applied, some forms of “voluntary” regulation are more likely to be adopted (Antweiler, 2003; Maxwell et al., 2000b, 2000a; Patten & Trompeter, 2003; Suijs & Wielhouwer, 2019; Susskind & Van Dam, 1986). Similarly, companies may act even when pre-regulatory steps have not been taken, if competing actors are perceived to be organizing to ask public officials for regulation deemed undesirable (Lyon & Maxwell, 2004). In the environmental field, the existence of a legal environment committed to enforcing regulations has been found to push companies into voluntary agreements that are formally outside the scope of binding regulation (Short & Toffel, 2010). In these cases, pressure on an industry, or on a handful of relevant companies within it, may yield better results than regulation itself (Suijs & Wielhouwer, 2019, p. 5).
Others have argued that informal processes encourage a kind of “negotiated” regulation that yields better outcomes (Brotman, 1988; Hurwitz, 2019). This is the core of Wu’s argument to defend agency threats as a form of governance in conditions of “high uncertainty” (Wu, 2011, p. 1842). For Wu, regulation through threats is preferable to the alternative of poorly designed regulation, which occurs when agencies make decisions based on insufficient information, or to no intervention at all (Wu, 2011, pp. 1842–1843). Similarly, Halfteck considered that the use of threats “reduces transaction costs and facilitates regulatory bargaining” and “may result in superior regulatory measures, capable of dealing with the underlying policy concerns in a functionally effective and welfare-enhancing manner” (Halfteck, 2008, p. 704). Rakoff maintained that “open-ended, general grants of authority are needed in a fast-changing world where not all evils can be foreseen” (Rakoff, 2000, p. 171). Threats, once deprived of their negative normative connotation, can be placed within the broader category of regulatory mechanisms and procedures that deviate from the formal rule-making model, in which ideas of dialogue, negotiation, and compromise emerge (Husovec, 2023; Marsden et al., 2020). Communication between regulators and private individuals or corporations appears to be an essential part of Ayres and Braithwaite’s pyramid of regulatory strategies (Ayres & Braithwaite, 1992, ch. 2).
However, two problems arise with the use of informal processes to achieve regulatory ends. First, while such processes can indeed be effective from the government’s point of view, many of them occur behind closed doors and are hidden from the public (Mendelson, 2006). Thus, the values of openness, transparency, participation, and stakeholder engagement associated with the formal processes of rule-making are sacrificed (Ackley, 1978, p. 508; Anthony, 1992, p. 1312; Noah, 1997, p. 941). To an extent, this problem is the obverse of that posed by lobbying, through which powerful private actors often influence decisions by democratic or technocratic bodies. A regulatory proposal opens channels of communication between public and corporate officials in which its potential success looms in the background, but also where processes of deliberation and persuasion take place and where companies, as well as other private actors, influence the final outcome. The opacity of regulatory threats and that of lobbying efforts is the same opacity, and they pose similar problems.2
The second problem derives from the first. Because government officials can exercise this form of pressure under the radar of outside opposition and oversight, they can prevent the intervention of veto players who may object to their demands, and they can request actions that could not be achieved through the formal process because those actions fall outside the scope of their jurisdiction or would be against the law. This problem has been the main focus of attention of First Amendment jawboning cases in the United States (Duffield, 2022), where courts have generally held that public officials may persuade others to do what they wish, but cannot coerce them through threats (National Rifle Association of America v. Vullo, 2024, Bantam Books Inc. v. Sullivan, 1963). It is a line that is difficult to draw and that depends on extensive judicial discovery. In Europe, on the other hand, threats of enforcement of the DSA – what Leerssen calls lawboning – have created some pushback from civil society organizations, which have complained about the potential negative effects of this governance mechanism on the public sphere (Leerssen, 2024; Access Now et al., 2024).
C. Threats and internet governance
Regulation by bullying is extremely likely and problematic on the internet because of the structure of incentives that controls the behaviour of intermediaries, the market-concentration dynamic that has increased their power, and the anxiety among Western elites built around specific internet-based problems such as disinformation and hate speech, especially since 2016. Unlike newspapers, internet platforms are not entirely outside the scope of regulatory authorities. Unlike broadcasters, they are not subjected to a specific regulatory regime. But they are vulnerable to threats of change in the regulatory landscape.
1. Causes
Indeed, the regime of no or limited intermediary liability that was established at an early regulatory stage of the internet meant that corporations would be relatively free from liability for content produced by others and free to moderate content as they saw fit (Communications Decency Act, 1996). The Communications Decency Act of 1996 was drafted with the expectation that companies would undertake their own efforts at curbing especially problematic speech (Kosseff, 2019, p. 2). Section 230 granted immunity, and the Digital Millennium Copyright Act of 1998 established similar protections to deal with copyright infringement (Digital Millennium Copyright Act, 1998; Bambauer, 2015, pp. 61–65). The 2000 European directive on Electronic Commerce did the same (Directive 2000/31/CE, 2000). Because all these rules can be revisited and – in the last few years – have been increasingly questioned, threats of changing these regimes became a fundamental part of a self-regulatory mechanism that carried with it certain expectations (Black, 1996; Braithwaite, 1982; Ogus, 1995, pp. 97–98). The model was flexible and promised to better absorb “the transnational conflicts inherent in the global architecture of the Internet” (Price & Verhulst, 2000, p. 151). But it presented problems that Price and Verhulst foresaw in 2000: private censorship “can be more coercive and sweeping than its public form. And the dangers of constitutional violation are particularly striking where the self-regulatory entity is acting in response to government or as a means of preempting its intervention” (Price & Verhulst, 2000, p. 151).
The incentives laid down by these rules are a double-edged sword. On the one hand, they free corporations from the obligation to moderate illegal content and from liability for failing to do so. On the other hand, they create a powerful incentive to be receptive to government requests and to “knuckle under government jawboning over content” (Bambauer, 2015, p. 81). This dynamic turns internet companies into careful listeners to what public officials have to say about the way they exercise the privilege of moderating content as they want, so as to signal that they are worthy of the indemnity the law provides (Bambauer, 2023; Kern, 2022; Kerry, 2021). While intermediary liability laws are not the only lever at the disposal of public officials, liability reform is certainly part of the equation that makes regulatory threats possible and effective in the field of internet governance (Kern, 2022; Bambauer, 2015, p. 87; Masnick, 2024a). The regulatory arsenal in the hands of the state is, however, much greater in scope. It includes antitrust regulation and enforcement, data protection laws, transparency mandates, and new co-regulatory regimes based on risk assessments, such as the DSA or the Online Safety Act of the UK (Del Campo, 2025).
These incentives have become stronger because of the ongoing process of market-share concentration in several industry sectors, such as search engines (Srinivasan, 2021) and social networks (Benzell & Chang, 2023; Funta, 2020). Within the context of a broadly laissez-faire attitude by antitrust authorities that lasted for decades, especially in the United States (Argentesi et al., 2021), these services captured significant shares of their relevant markets and became especially important as “Internet points of control” (Zittrain, 2003) or chokepoints (Tusikov, 2016a), over which governments can exert new school modes of control over speech (Balkin, 2014; Douek, 2022, pp. 542–543). The method used by regulators “consists in leaning on these points of control as regulatory levers. The so-called ‘co-regulatory mechanism’ must be understood … as a legal device designed to put pressure on the points of control to achieve some regulatory result” (Frydman et al., 2012, p. 133).
This mechanism operates across jurisdictions. Sander has found that “the adoption by Facebook, Microsoft, Twitter and YouTube of a shared industry database of hashes for terrorist content appears to have been timed to diminish the prospect of future regulation that was feared might follow the European Commission’s critical review of their compliance with the Code of Conduct on Countering Illegal Hate Speech Online” (Sander, 2020, p. 952). Citron considers that most European initiatives before the DSA were driven by regulatory threats (Citron, 2017, p. 1043). As Hurwitz puts it, “the theory is simple: because no CEO likes to testify before Congress, spending time forced to answer questions intended to embarrass them and their company (to use one example), CEOs will conduct the company’s business to avoid such experiences” (Hurwitz, 2019, p. 32). Senator Dianne Feinstein was straightforward in the hearings called by the United States Congress in the fall of 2017 over disinformation and its perceived effects on the 2016 electoral process: “You’ve created these platforms and now they are being misused, and you have to be the ones to do something about it, or we will” (Duffield, 2022, p. 9). One of the most cited examples of the contrast between unsuccessful formal action and successful informal pressure was the SOPA/PIPA bills in the United States. While these were defeated in Congress, some of their provisions were later adopted voluntarily by major internet companies in what Tusikov called “secret handshake deals” (Tusikov, 2016b) that were “driven underground” (Bambauer, 2015, p. 53). Karanicolas has described how the Canadian government threatened Internet companies with regulation while expecting – from them – some form of “voluntary” action (Elkin-Koren, 2022, p. 185; Karanicolas, 2019b, p. 218). Prime Minister Trudeau stated that lack of compliance with these desires could lead to “meaningful financial consequences” (Karanicolas, 2019b, p. 218).
Around that time, Canada hosted a meeting of the International Grand Committee on Big Data, Privacy, and Democracy, which Karanicolas reads as part of the broader pressuring effort (Karanicolas, 2019b, p. 218). He also found that when public officials raise questions over content moderation and mix them with antitrust concerns, the tactic seems to be more effective (Karanicolas, 2021b, p. 8). In Brazil, the Electoral Court asked for and obtained a good working relationship with internet companies in the context of the election disinformation crisis of 2018, a kind of cooperation that companies were used to providing in more menacing contexts (Santos, 2021). More recently, Brazilian judges famously clashed with X.com after some of their requests were not fulfilled (Nicas, 2024). And in the context of DSA enforcement, Commissioner Thierry Breton was extremely vocal about his desires and those of the European Commission, to the point that his requests were criticised by civil society and somewhat rebuked by his peers (Access Now, Advocacy Initiative for Development (AID), et al., 2023; Access Now et al., 2024; Hancock, 2024; Tar, 2024), a dynamic that – eventually – led to him being pushed out of the Commission (Rankin, 2024). The anxiety over disinformation and hate speech that sprang up around 2016 also explains internet companies’ vulnerability, because pressure on them to act increased in parallel with growing concerns among Western governments. The laws adopted by France and Germany were precursors of the Digital Services Act of the European Union, which – through a different approach – sought to rein in very large online platforms and search engines (Netzwerkdurchsetzungsgesetz [Network Enforcement Act], 2017, Digital Services Act, 2022, Loi No. 2020-766 Visant à Lutter Contre Les Contenus Haineux Sur Internet [Law No. 2020-766 Aimed at Combating Hateful Content on the Internet], 2020).
2. Invisibility
The main problem with these mechanisms is their opacity: they are almost invisible to outside observers (Duffield, 2023). Although communications between public and corporate officials sometimes emerge in public records, they most often occur in private settings beyond public scrutiny, as records obtained through discovery in United States courts show (Missouri v. Biden, 2023a; Missouri v. Biden, 2023b). These cases opened an unusual window into the kinds of interactions through which regulatory threats happen in the field of internet governance. The kind of extensive judicial discovery behind these decisions is not entirely unusual (Writers Guild of America, West, Inc. v. FCC, 1976; National Rifle Association of America v. Vullo, 2024), but cases are few. Corporations are not generally inclined to protect the speech of their users (Keller, 2023), so litigation happens only when a third party feels affected by dynamics they cannot really see. However, the events in the upper part of the two-axis quadrant (Figure 1) are only part of the process, and not the most important ones.
To reach this conclusion, we cross-referenced the 2021 events described in the District Court decision in Missouri v. Biden (n=92) with the 2021 events (n=966) gathered in a database we built in order to capture some of the public-facing expressions of threatening dynamics 3. We coded all events to visualize them in a timeline that could reveal the extent to which some of them were visible to outside observers at the time they happened 4. The exercise showed that most communications between public and corporate officials happened through emails (n=51), calls (n=2), and meetings (n=14). But on some occasions issues were taken “to the public” in ways that affected the ongoing conversations. These included different forms of public announcement, such as talking to the media, speeches at public events, policy announcements by corporations, or press conferences (n=23). Some formal actions – such as depositions (n=1) or a Request for Information (“RFI”) by the Office of the Surgeon General (n=1) – made it into the judicial decision as relevant.
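The coding exercise described above can be sketched in a few lines of code. The following is a hypothetical illustration only: the field names, channel categories, and sample records are assumptions for the sketch, not the actual dataset or coding scheme used in the study.

```python
from collections import Counter
from datetime import date

# Hypothetical event records, in the spirit of the coding exercise:
# each event has a date, a communication channel, and a source record.
events = [
    {"date": date(2021, 7, 14), "channel": "email", "source": "court record"},
    {"date": date(2021, 7, 15), "channel": "press conference", "source": "court record"},
    {"date": date(2021, 7, 16), "channel": "public statement", "source": "public record"},
    {"date": date(2021, 7, 16), "channel": "meeting", "source": "court record"},
    {"date": date(2021, 7, 19), "channel": "email", "source": "court record"},
]

# Channels treated as visible to outside observers at the time they happened.
PUBLIC_CHANNELS = {"press conference", "public statement", "speech", "policy announcement"}

def code_events(events):
    """Tally events by channel and split them into public- and private-facing."""
    by_channel = Counter(e["channel"] for e in events)
    visible = [e for e in events if e["channel"] in PUBLIC_CHANNELS]
    hidden = [e for e in events if e["channel"] not in PUBLIC_CHANNELS]
    return by_channel, visible, hidden

by_channel, visible, hidden = code_events(events)
print(by_channel)                # channel counts, e.g. emails vs. meetings
print(len(visible), len(hidden)) # how much of the dynamic was publicly observable
```

Ordering the `visible` and `hidden` lists by date would then yield the kind of timeline used to assess which parts of the pressuring dynamic an outside observer could have seen as it unfolded.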
The record produced during the Missouri v. Biden saga revealed that conversations between corporate and public officials were ongoing and requests were routine. The events reveal increasing frustration among public officials, who exert quite considerable pressure on companies and who, on occasion, take their concerns to the public. One of the most prominent public events in this dynamic was when President Biden said that platforms were “killing people” by failing to respond adequately to disinformation (Kanno-Youngs & Kang, 2021). This comment was preceded by a press conference by Surgeon General Murthy in which he explicitly acknowledged they were pressuring corporations to “do more”. These public events were followed by private communications in which corporate officials asked to “get back into the White House’s good graces” (Missouri v. Biden, 2023b, p. 25). A day later, Facebook’s Nick Clegg emailed Murthy showing himself “keen to find a way to de-escalate and work together” (Missouri v. Biden, 2023b, p. 35). The following day, President Biden clarified that he was not talking about corporations but about the twelve individuals behind Covid-19 disinformation (Biden, 2021). But two days before Biden’s initial comment, Rob Flaherty – then Deputy Assistant to the President and Director of Digital Strategy – sent an angry email to a Facebook official demanding answers in the strongest possible terms 5. As former corporate officials Harbath and Perault put it, taking private conversations to the public is a rather normal way of administering threats that public officials regularly use (Harbath & Perault, 2023).
The record of the Missouri v. Biden saga is hardly conclusive from the standpoint of the American legal standard on jawboning. A majority of the Supreme Court found it lacking, and – in particular – the factual pattern laid out by the District Court has been criticized as flawed and misrepresentative of the actual communications (Masnick, 2025). What I wish to recover from this litigation is not the viability of the rule proposed by United States courts to distinguish between coercion and persuasion, nor the way American judges assess facts, but rather the communicative patterns through which threats may occur. In that sense, the discovery in Missouri v. Biden, as well as the discovery fifty years earlier in Writers Guild, suggests that most communications between public and corporate officials happen behind closed doors, outside the public’s view (on the left side of the threat compass presented in Figure 1).
These dynamics, however, regularly spill over into the public sphere, in the form of press releases by potential regulators, public speeches, or more formal steps towards regulation. The facts uncovered by American courts in these cases suggest that when public officials put corporations under the spotlight, it is reasonable to believe that private communications are happening beyond public purview. Consider, for instance, the Press Corner of the European Commission 6. It is an immense database of press briefings, conferences, and announcements, full of statements that could be read as part of threatening dynamics. Focusing again on 2021 – when the Digital Services Act was being developed – reveals statements by Commissioner Věra Jourová on platforms’ shortcomings in dealing with disinformation in the context of elections (European Commission, 2021c), demands for more transparency by Commissioner Breton in the context of the Covid-19 pandemic (European Commission, 2021b), proposals by Commissioner Ylva Johansson for regulations that engage specific corporate policies, such as Facebook’s policy of implementing end-to-end encryption on its Messenger service (European Commission, 2021a), and so on.
Letters by Commissioner Breton to corporations in the context of DSA enforcement have brought the mechanism under scrutiny in the European Union 7. The pattern of public interactions moved from threats of enforcement to actual pre-enforcement decisions, such as the opening of regulatory proceedings or investigations 8, formal requests for information 9, and actual exchanges with corporate officials 10 in which events of DSA infringement were explicitly announced 11. TikTok’s decision to remove TikTok Lite from the European market came months after a formal investigation was announced and was celebrated as a direct consequence of that investigation 12. Similarly, to appease rising concerns regarding its links to the Chinese government, TikTok’s controlling company ByteDance Ltd. decided to change the default storage location of United States users’ data to Oracle servers located in the United States (Calamug, 2022). This did not prevent, however, swift regulatory action against the company through the passage of the Protecting Americans from Foreign Adversary Controlled Applications Act in April 2024, whose enforcement is currently on hold.
In the United States, Mark Zuckerberg’s letter to the Committee on the Judiciary of the House of Representatives acknowledged pressures by public officials from all over the world (Zuckerberg, 2024). When discussing the Biden administration’s communications during the Covid-19 pandemic, Zuckerberg considered that the “government pressure was wrong” and regretted “that we were not more outspoken about it” (Zuckerberg, 2024). He also said that the company was “ready to push back if something like this happens again” (Zuckerberg, 2024). In January 2025, Zuckerberg announced major changes to his company’s content moderation approach, including eliminating fact-checkers in the United States and simplifying rules (Hendrix, 2025). He argued that this meant going “back to our roots” and “restoring free expression on our platforms” (Hendrix, 2025), a narrative that aligns with demands by top government officials regarding alleged censorship of conservative viewpoints on social media and with a broad attack by US officials on the European Digital Services Act (Carr, 2025). ABC’s folding under pressure from the FCC – “we can do this the easy way or the hard way” – is just the latest iteration of a move that is pervasive, usually effective, and deserving of closer scrutiny (Koblin, 2025).
D. Conclusion
Threatening internet companies with regulations they dislike has been a stable, well-documented governance mechanism used by public officials in the United States and elsewhere. While the issue has been discussed in the context of American jawboning judicial cases, the mechanism should be conceptualised more broadly for the reasons discussed above. It is necessary to understand how the mechanism operates in practice and how it relates to – and, in fact, exists because of – formal rule-making processes. The lines drawn by American judges are ill-suited to capture the complexity of how this dynamic unfolds in practice, and the public policy cycle should be scrutinized under the assumption that these informal methods of governance are effective. Regulatory and enforcement threats should – then – be scrutinised when they allow public officials to do informally what they are not allowed to do through formal regulation.
This is all the more so after the European Union’s Digital Services Act came into force (Digital Services Act, 2022). The act, with all its complexity, partially shifts away from the self-regulatory model that ruled the internet’s first thirty years. While it still protects intermediaries for content produced by third parties, it imposes upon very large online platforms and search engines general duties and specific obligations under a broad regulatory regime that gives the European Commission and national authorities significant power over intermediaries. The incentives of the latter with regard to users’ speech rights, however, remain unchanged. As the first official regulatory actions taken by the European Commission show, the back and forth between European regulators and (mostly) United States corporations will be intense in the years to come (Elon Musk [@elonmusk], 2024; Breton, T. [@ThierryBreton], 2024h). The jawboning rule proposed in the United States is difficult to administer and extremely demanding in terms of facts and discovery. The United States Supreme Court has already shown some degree of scepticism about drawing inferences from a complex and muddy record (Murthy v. Missouri, 2024) but has been more willing to draw the line when facts were simpler (National Rifle Association of America v. Vullo, 2024).
The rule, however, is artificial. The public policy cycle is a continuum that should not be segmented. In the agenda-setting stage, speech by public officials is crucially important. The formal process of regulation always begins informally. The fact that all regulations can change is always present in the background of interactions between corporate and public officials. The compass proposed in Figure 1 accurately describes what should be examined to adequately understand how pressuring dynamics operate in practice, even though it is empirically difficult to see them from the outside. Private and informal communications happen in parallel to public utterances of requests, demands, and complaints. But they are invisible: as discussed before, we only see the public-facing side of these communications. This limitation points to a possible path forward: narrow regulatory action mandating transparency in communications between corporate and public officials, including obligations of record-keeping and regular disclosure, especially on matters where third-party rights or interests may be unfairly affected by informal regulation that taps into the structure of incentives governing the relationship between public officials and internet intermediaries. This proposal gathered some support in the symposium held by the Knight First Amendment Institute at Columbia University in October 2023 (Duffield, 2023; Grossman & Shapiro, 2023). The arguments presented in this paper should be read as providing further reasons for this course of action, in the United States and elsewhere.
While in many countries the private but official communications of public officials are subject to record-keeping obligations, and disclosures may happen in a number of ways – over time, through freedom of information requests, and so on – in many jurisdictions around the globe these communications are not covered by freedom of information regimes, or those regimes are difficult to enforce or easy to evade (Astudillo Muñoz, 2020; Gamarra Galindo, 2019; Moreno Carrasco, 2015). For that reason, new rules should include the private communications of public officials within the scope of the proactive transparency obligations that states should fulfil (Darbishire, 2010).
Looking ahead, further research should produce new case studies of pressuring dynamics on the ground from a comparative standpoint. What motivates public officials? Are they pursuing structural, company-wide changes (e.g., new rules or modifications to terms of service) or issue-specific ones (e.g., increased moderation of certain content)? How do countries with diminished de facto jurisdiction over powerful internet intermediaries use threats in their relationships with companies, and how do the latter respond? To what extent are partnerships or informal working relationships affected by the existence – diminished or not – of the state’s threatening power? Another important question is why the only jurisdiction to have come up with a jawboning rule is the United States, even though the practice exists everywhere (van de Kerkhof, 2023). An explanation may lie in the fact that, in the United States, legislation occurs through a democratically elected legislature, whereas in Europe the policy process is driven by experts at the European Commission. Hence, it is possible that the fear of abuse underlying the American jawboning jurisprudence is directly linked to a concern typical of constitutionalism: the fear of political abuse and democratic excess (Holmes, 1993). This fear, in Europe, may be more subdued.
These questions call for further research and critical analysis, in order to adequately understand how the mechanism manages to govern the Internet in such effective yet elusive ways.
References
Access Now, Advocacy Initiative for Development (AID), Africa Media and Information Technology Initiative (AfriMITI), African Freedom of Expression Exchange (AFEX), AI Forensics, Bits of Freedom, Centre for Democracy and Technology (Europe), Article 19, & EFF. (2023, July 26). Civil society statement: Commissioner Breton needs to clarify comments about the DSA allowing for platform blocking [Letter]. https://www.accessnow.org/press-release/dsa-internet-blocking-statement/
Access Now, Article 19, Algorithm Watch, ApTI, 7amaleh, Bits of Freedom, Digitale Gesellschaft (DE), & Electronic Frontiers Foundation (EFF). (2023, October 17). Civil society open letter to Commissioner Breton. Precise interpretation of the DSA matters especially when people’s lives are at risk in Gaza and Israel [Letter]. https://www.article19.org/wp-content/uploads/2023/10/Civil-society-open-letter-to-Commissioner-Breton.pdf
Access Now, Article 19, & Electronic Frontiers Foundation (EFF). (2024, August 24). Civil society statement: Commissioner Breton needs to stop politicising the Digital Services Act [Letter]. https://www.article19.org/wp-content/uploads/2024/08/Letter-to-Commissioner-Breton-August-2024.pdf
Ackley, G. (1978). Implications for policy: A symposium. Brookings Papers on Economic Activity, 1978(2), 507. https://doi.org/10.2307/2534272
Acutt, M., Elliott, C., & Robinson, T. (2001). Credible regulatory threats. Energy Policy, 29(11), 911–916. https://doi.org/10.1016/S0301-4215(01)00026-X
Anthony, R. A. (1992). Interpretive rules, policy statements, guidances, manuals, and the like: Should federal agencies use them to bind the public? Duke Law Journal, 41(6), 1311. https://doi.org/10.2307/1372817
Antweiler, W. (2003). How effective is green regulatory threat? American Economic Review, 93(2), 436–441. https://doi.org/10.1257/000282803321947489
Argentesi, E., Buccirossi, P., Calvano, E., Duso, T., Marrazzo, A., & Nava, S. (2021). Merger policy in digital markets: An ex post assessment. Journal of Competition Law & Economics, 17(1), 95–140. https://doi.org/10.1093/joclec/nhaa020
Astudillo Muñoz, J. L. (2020). El derecho a la vida privada de los funcionarios públicos frente al derecho de acceso a la información pública. Un estudio a la luz de la jurisprudencia del Consejo para la Transparencia en Chile [The right to privacy of public officials versus the right of access to public information. A study in light of the jurisprudence of the Council for Transparency in Chile]. Derecho global. Estudios sobre derecho y justicia, 5(15), 89–112. https://doi.org/10.32870/dgedj.v5i15.277
Ayres, I., & Braithwaite, J. (1992). Responsive regulation: Transcending the deregulation debate. Oxford University Press.
Balkin, J. M. (2014). Old-School/New-School Speech Regulation. Harvard Law Review, 127(8), 2296–2342.
Bambauer, D. E. (2015). Against jawboning. Minn. L. Rev., 100, 51.
Bambauer, D. E. (2023, October 30). Be careful what you ask for. Knight First Amendment Institute at Columbia University. http://knightcolumbia.org/blog/be-careful-what-you-ask-for
Bantam Books, Inc. v. Sullivan, 372 U.S. 58 (United States Supreme Court 1963).
Bartels, A. H. (1983). The Office of Price Administration and the legacy of the New Deal, 1939-1946. The Public Historian, 5(3), 5–29. https://doi.org/10.2307/3377026
Bazelon, D. L. (1975). FCC regulation of the telecommunications press. Duke Law Journal, 1975, 213.
Becker, G. S. (1983). A theory of competition among pressure groups for political influence. The Quarterly Journal Of Economics, 98(3), 371–400. https://doi.org/10.2307/1886017
Benzell, S. G., & Chang, F. B. (2023). Evaluating antitrust remedies for platform monopolies: The case of Facebook. Vanderbilt Law Review, 76, 773.
Biden, J. (2021, July 19). Remarks on the national economy and an exchange with reporters. The American Presidency Project. https://www.presidency.ucsb.edu/documents/remarks-the-national-economy-and-exchange-with-reporters-10
Black, J. (1996). Constitutionalising self-regulation. The Modern Law Review, 59(1), 24–55. https://doi.org/10.1111/j.1468-2230.1996.tb02064.x
Braithwaite, J. (1982). Enforced self-regulation: A new strategy for corporate crime control. Michigan Law Review, 80(7), 1466–1507. https://doi.org/10.2307/1288556
Breton, T. [@ThierryBreton]. (2022, October 28). 👋 @elonmusk In Europe, the bird will fly by our 🇪🇺 rules. #DSA [Tweet]. Twitter. https://x.com/ThierryBreton/status/1585902196864045056
Breton, T. [@ThierryBreton]. (2023a, May 26). Twitter leaves EU voluntary Code of Practice against disinformation. But obligations remain. You can run but you can’t hide. [Tweet]. Twitter. https://x.com/ThierryBreton/status/1662194595755704321
Breton, T. [@ThierryBreton]. (2023b, October 10). Following the terrorist attacks by Hamas against 🇮🇱, we have indications of X/Twitter being used to disseminate illegal content & disinformation in the EU [Tweet]. Twitter. https://x.com/ThierryBreton/status/1711808891757944866
Breton, T. [@ThierryBreton]. (2023c, October 11). The #DSA is here to protect free speech against arbitrary decisions, and at the same time protect our citizens & democracies [Tweet]. Twitter. https://x.com/ThierryBreton/status/1712126600873931150
Breton, T. [@ThierryBreton]. (2023d, December 18). Today we open formal infringement proceedings against @X : ⚠️ Suspected breach of obligations to counter #IllegalContent and #Disinformation ⚠️ [Tweet]. Twitter. https://x.com/ThierryBreton/status/1736701607553692020
Breton, T. [@ThierryBreton]. (2024a, February 19). Today we open an investigation into #TikTok over suspected breach of transparency & obligations to protect minors: 📱Addictive design & screen time limits [Tweet]. Twitter. https://x.com/ThierryBreton/status/1759533374295667087
Breton, T. [@ThierryBreton]. (2024b, April 17). Is social media “lite” as addictive and toxic as cigarettes “light”? We have just sent a request for information [Tweet]. Twitter. https://x.com/ThierryBreton/status/1780585191108423941
Breton, T. [@ThierryBreton]. (2024c, April 30). Today we open cases against #Meta for suspected breach of #DSA obligations to protect integrity of elections [Tweet]. Twitter. https://x.com/ThierryBreton/status/1785246555710087280
Breton, T. [@ThierryBreton]. (2024d, May 16). 🚨 Today we open formal #DSA investigation against #Meta. We are not convinced that Meta has done enough to comply with the DSA obligations [Tweet]. Twitter. https://x.com/ThierryBreton/status/1791044200504385562
Breton, T. [@ThierryBreton]. (2024e, July 12). Back in the day, #BlueChecks used to mean trustworthy sources of information✔️🐦 Now with X, our preliminary view is that [Tweet]. Twitter. https://x.com/ThierryBreton/status/1811699711591489637
Breton, T. [@ThierryBreton]. (2024f, July 12). Be our guest @elonmusk ⚖️🇪🇺 There has never been—And will never be—Any “secret deal” [Tweet]. Twitter. https://x.com/ThierryBreton/status/1811811489889517697
Breton, T. [@ThierryBreton]. (2024g, August 12). The available brain time of young Europeans is not a currency for social media—And it never will be [Tweet]. Twitter. https://x.com/ThierryBreton/status/1820399371583713470
Breton, T. [@ThierryBreton]. (2024h, August 12). With great audience comes greater responsibility #DSA As there is a risk of amplification of potentially harmful content in 🇪🇺 [Tweet]. Twitter. https://x.com/ThierryBreton/status/1823033048109367549
Brito, J. (2014). Agency threats and the rule of law: An offer you can’t refuse. Harv. JL & Pub. Pol’y, 37, 553.
Brotman, S. (1988). Communications policy-making at the FCC: Past practices, future direction. Cardozo Arts & Ent LJ, 7, 55.
Chang, S., Kalmenovitz, J., & Lopez-Lira, A. (2023). Follow the pipeline: Anticipatory effects of proposed regulations (SSRN Scholarly Paper No. 4360231). https://doi.org/10.2139/ssrn.4360231
Citron, D. K. (2017). Extremist speech, compelled conformity, and censorship creep. Notre Dame L. Rev., 93, 1035.
Communications Decency Act § 230, 47 U.S.C. § 230 (1996).
Cooper, C. (1978). Writers Guild of America, West, Inc. v. FCC: A First Amendment Blow to FCC Jawboning. Ariz. L. Rev., 20, 315.
Corn-Revere, R. (1995). Television violence and the limits of voluntarism. The Yale Journal on Regulation, 12, 187.
Cotterill, R. W. (1999). Jawboning cereal: The campaign to lower cereal prices. Agribusiness: An International Journal, 15(2), 197–205.
Darbishire, H. (2010). Proactive transparency: The future of the right to information? (No. 56598; Governance Working Paper Series). World Bank. https://doi.org/10.1596/25031
Digital Millennium Copyright Act, Pub. L. No. 105–304 (1998). https://www.congress.gov/bill/105th-congress/house-bill/2281
Digital Services Act, Pub. L. No. 2022/2065, OJ L 277, 27.10.2022 1 (2022). https://eur-lex.europa.eu/eli/reg/2022/2065/oj
Directive 2000/31/CE, DOCE L 178/11 (2000). https://eur-lex.europa.eu/legal-content/ES/TXT/HTML/?uri=CELEX%3A32000L0031
Douek, E. (2022a). Content moderation as systems thinking. Harvard Law Review, 136, 526.
Douek, E. (2022b). The siren call of content moderation formalism. In L. Bollinger & G. Stone (Eds), Social media, freedom of speech, and the future of our democracy (pp. 139–156). Oxford University Press.
Duffield, W. (2022). Jawboning over social media’s handling of Hunter Biden.
Duffield, W. (2023). Judge blocks jawboning? https://policycommons.net/artifacts/4450438/judge-blocks-jawboning/5247707/
Elkin-Koren, N. (2022). Government–platform synergy and its perils. In E. Celeste, A. Heldt, & C. I. Keller (Eds), Constitutionalising social media. Hart Publishing.
Erfle, S., McMillan, H., & Grofman, B. (1989). Testing the Regulatory Threat Hypothesis: Media coverage of the energy crisis and petroleum pricing in the late 1970s. American Politics Quarterly, 17(2), 132–152. https://doi.org/10.1177/1532673X8901700202
Erfle, S., McMillan, H., & Grofman, B. (1990). Regulation via threats: Politics, media coverage, and oil pricing decisions. Public Opinion Quarterly, 54(1), 48–63. https://doi.org/10.1086/269183
European Commission. (2021a, January 26). EU Internet Forum Ministerial: Towards a coordinated response to curbing terrorist and child sexual abuse content on the internet [Text]. European Commission Press Corner.
European Commission. (2021b, January 28). Daily News 28/01/2021 [Text]. European Commission Press Corner. https://ec.europa.eu/commission/presscorner/detail/en/mex_21_283
European Commission. (2021c, November 25). Speech by Vice-President Jourová [Text]. European Commission Press Corner. https://ec.europa.eu/commission/presscorner/detail/en/speech_21_6309
European Commission. (2024a, April 22). Commission opens proceedings against TikTok under the DSA. European Commission Press Corner. https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2227
European Commission. (2024b, August 5). TikTok commits to permanently withdraw TikTok Lite Rewards. European Commission Press Corner. https://ec.europa.eu/commission/presscorner/detail/en/IP_24_4161
Fisher, R. W. (1970). Labor and the economy in 1969. Monthly Labor Review, (Jan 1970), 30–43.
Frydman, B., Hennebel, L., & Lewkowicz, G. (2012). Co-regulation and the rule of law. In E. Brousseau, M. Marzouki, & C. Méadel (Eds), Governance, Regulation and Powers on the Internet (Illustrated edition). Cambridge University Press.
Funta, R. (2020). Social networks and potential competition issues. Krytyka Prawa, 12(1), 193–205. https://doi.org/10.7206/kp.2080-1084.369
Gamarra Galindo, M. A. (2019). Correos electrónicos de las cuentas oficiales de los funcionarios públicos como ámbito de protección del derecho de acceso a la información pública en el ordenamiento jurídico peruano [Emails from the official accounts of public officials as an area of protection of the right of access to public information in the Peruvian legal system] [Thesis, Pontificia Universidad Católica del Perú]. https://tesis.pucp.edu.pe/repositorio//handle/20.500.12404/13232
Glazer, A., & McMillan, H. (1992). Pricing by the firm under regulatory threat. The Quarterly Journal of Economics, 107(3), 1089–1099. https://doi.org/10.2307/2118376
Gorwa, R. (2019). What is platform governance? Information, Communication & Society, 22(6), 854–871. https://doi.org/10.1080/1369118X.2019.1573914
Grossman, A., & Shapiro, K. (2023). Shining a light on censorship: How transparency can curtail government social media censorship and more (Briefing Paper No. 168). Cato Institute. https://www.cato.org/briefing-paper/shining-light-censorship-how-transparency-can-curtail-government-social-media
Haberler, G. (1972). Incomes policy and inflation: Some further reflections. The American Economic Review, 62(1/2), 234–241.
Hagemann, R., Huddleston Skees, J., & Thierer, A. (2018). Soft law for hard problems: The governance of emerging technologies in an uncertain future. Colorado Technology Law Journal, 17(1), 37–130.
Halfteck, G. (2008). Legislative threats. Stanford Law Review, 61(3), 629–710.
Hall, P., & Hysing, E. (2019). Advancing voluntary chemical governance? The case of the Swedish textile industry dialogue. Journal of Environmental Planning and Management, 62(6), 1001–1018. https://doi.org/10.1080/09640568.2018.1457515
Hancock, A. (2024, August 13). Brussels slaps down Thierry Breton over ‘harmful content’ letter to Elon Musk. Financial Times. https://www.ft.com/content/09cf4713-7199-4e47-a373-ed5de61c2afa
Harbath, K., & Perault, M. (2023, October 4). Jawboned [Blog]. Knight First Amendment Institute at Columbia University. http://knightcolumbia.org/blog/jawboned
Haucap, J., Heimeshoff, U., & Uhde, A. (2007). Credible threats as an instrument of regulation for network industries. In P. J. J. Welfens & M. Weske (Eds), Digital Economic Dynamics: Innovations, Networks and Regulations (pp. 171–202). Springer. https://doi.org/10.1007/978-3-540-36030-8_9
Héritier, A., & Lehmkuhl, D. (2008). Introduction: The shadow of hierarchy and new modes of governance. Journal of Public Policy, 28(1), 1–17.
Hipple, R. J., & Harkelroad, D. R. (1975). Anomalies of SEC enforcement: Two areas of concern. Emory Law Journal, 24, 696.
Horwitz, M. J. (1993). The Supreme Court, 1992 term – Foreword: The constitution of change: Legal fundamentality without fundamentalism. Harv. L. Rev., 107, 30.
Hunt, A., & Wickham, G. (1994). Foucault and law: Towards a sociology of law as governance. Pluto Press.
Hurwitz, J. (2019). Regulation as partnership. Journal of Law and Innovation, 3, 1.
Husovec, M. (2023). How to facilitate data access under the Digital Services Act. Available at SSRN 4452940.
Jorgensen, C. A., & Clark, J. J. (n.d.). Interlocking Directorates and Section 8 of the Clayton Act. Albany Law Review, 44, 139.
Joskow, P. L., & MacAvoy, P. W. (1975). Regulation and the financial condition of the electric power companies in the 1970’s. The American Economic Review, 65(2), 295–301.
Kane, E. J. (1990). Bureaucratic self-interest as an obstacle to monetary reform. In T. Mayer (Ed.), The political economy of American monetary policy. Cambridge University Press. https://doi.org/10.1017/CBO9780511571947
Kanno-Youngs, Z., & Kang, C. (2021, July 16). ‘They’re killing people’: Biden denounces social media for virus disinformation. The New York Times. https://www.nytimes.com/2021/07/16/us/politics/biden-facebook-social-media-covid.html
Karanicolas, M. (n.d.). Squaring the circle between freedom of expression and platform law. Pitt. J. Tech. L. & Pol’y, 20, 177.
Karanicolas, M. (2019). Subverting democracy to save democracy: Canada’s extra-constitutional approaches to battling ‘fake news’. Available at SSRN 3423092.
Karanicolas, M. (2021a). Authoritarianism as a service: India’s moves to weaponize private sector content moderation with the 2021 information technology rules. Indian JL & Tech., 17, 25.
Karanicolas, M. (2021b). Too long; didn’t read: Finding meaning in platforms’ terms of service agreements. U. Tol. L. Rev., 52, 1.
Keller, D. (2019). Who do you sue? State and platform hybrid power over online speech (Aegis Series No. 1902; National Security, Technology, and Law, p. 40). Hoover Institution.
Keller, D. (2023). Platform transparency and the first amendment (SSRN Scholarly Paper No. 4377578). https://doi.org/10.2139/ssrn.4377578
Kern, R. (2022, September 8). White House renews call to ‘remove’ Section 230 liability shield. POLITICO. https://www.politico.com/news/2022/09/08/white-house-renews-call-to-remove-section-230-liability-shield-00055771
Kerry, C. (2021, May 14). Section 230 reform deserves careful and focused consideration. Brookings. https://www.brookings.edu/articles/section-230-reform-deserves-careful-and-focused-consideration/
Khanna, M., & Anton, W. R. Q. (2002). What is driving corporate environmentalism: Opportunity or threat? Corporate Environmental Strategy, 9(4), 409–417. https://doi.org/10.1016/S1066-7938(02)00118-5
Kosseff, J. (2019). The twenty-six words that created the internet. Cornell University Press.
Leerssen, P. (2020). The Soap Box as a Black Box: Regulating transparency in social media recommender systems. European Journal of Law and Technology, 11(2).
Lerner, N. M. (1978). It’s all in the family: Family viewing and the First Amendment. New York University Review of Law and Social Change, 7, 83.
Linder, A. (2017). Explaining shipping company participation in voluntary vessel emission reduction programs. Transportation Research Part D: Transport and Environment, 61, 234–245. https://doi.org/10.1016/j.trd.2017.07.004
Loi No. 2020-766 Visant à Lutter Contre Les Contenus Haineux Sur Internet [Law No. 2020-766 Aimed at Combating Hateful Content on the Internet], Pub. L. Nos 2020–766, 0156 JORF (2020). https://www.legifrance.gouv.fr/jorf/id/JORFTEXT000042031970
Lyon, T. P., & Maxwell, J. W. (2004). Preempting uncertain regulatory threats. https://webuser.bus.umich.edu/tplyon/PDF/Working%20Papers/LyonMaxwellJRESept04.pdf
Marsden, C., Meyer, T., & Brown, I. (2020). Platform values and democratic elections: How can the law regulate digital disinformation? Computer Law & Security Review, 36(2020), 1–18. https://doi.org/10.1016/j.clsr.2019.105373
Maxwell, J. W., Lyon, T. P., & Hackett, S. C. (2000). Self-regulation and social welfare: The political economy of corporate environmentalism. The Journal of Law and Economics, 43(2), 583–618. https://doi.org/10.1086/467466
Mendelson, N. A. (2007). Regulatory beneficiaries and informal agency policymaking. Cornell Law Review, 92, 397.
Missouri v. Biden, 3:22-CV-01213 (United States District Court, Western District of Louisiana, Monroe Division 4 July 2023).
Missouri v. Biden, No. 23-30445 (United States Court of Appeals for the Fifth Circuit 8 September 2023).
Monahan, J. (2021). ‘Falsehood flies, and the truth comes limping after’: Combatting online disinformation in the shadow of CUSMA. Dalhousie J. Legal Stud., 30, 63.
Moreno Carrasco, D. (2015). Acceso a la información pública y correos electrónicos de los funcionarios públicos en Chile [Access to public information and emails of public officials in Chile]. Revista chilena de derecho y tecnología, 4(1), 4.
Murthy v. Missouri, 603 U.S. 43 (Supreme Court of the United States 2024).
Musk, E. (2024, July 12). The European Commission offered 𝕏 an illegal secret deal: If we quietly censored speech without telling anyone, they would not fine us. The other platforms accepted that deal. 𝕏 did not [Tweet]. Twitter. https://x.com/elonmusk/status/1811783320839008381
National Rifle Association of America v. Vullo (Supreme Court of the United States 2024).
Netzwerkdurchsetzungsgesetz [Network Enforcement Act], I S. 3352 BGBl (2017). https://www.gesetze-im-internet.de/netzdg/index.html
Noah, L. (1997). Administrative arm-twisting in the shadow of congressional delegations of authority. Wis. L. Rev., 873.
Ogus, A. (1995). Rethinking self-regulation. Oxford Journal of Legal Studies, 15(1), 97–108.
Patten, D. M., & Trompeter, G. (2003). Corporate responses to political costs: An examination of the relation between environmental disclosure and earnings management. Journal of Accounting and Public Policy, 22(1), 83–94. https://doi.org/10.1016/S0278-4254(02)00087-X
Perdiguero, J. (2004). Precios de la gasolina bajo amenaza regulatoria [Petrol prices under regulatory threat]. https://www.researchgate.net/publication/266593602_Precios_de_la_gasolina_bajo_amenaza_regulatoria
Posner, E. A., & Vermeule, A. (2010). The executive unbound: After the Madisonian Republic. Oxford University Press.
Post, R. (2010). Theorizing disagreement: Reconceiving the relationship between law and politics. California Law Review, 98(4), 1319–1350.
Price, M., & Verhulst, S. (2000). The concept of self-regulation and the internet. In J. Waltermann & M. Machill (Eds.), Protecting our children on the internet: Towards a new culture of responsibility (pp. 133–198). Bertelsmann Foundation Publishers.
Rakoff, T. D. (2000). The choice between formal and informal modes of administrative regulation. Administrative Law Review, 52(1), 159–174.
Raz, J. (1986). Dworkin: A new link in the chain. California Law Review, 74, 18. https://doi.org/10.2307/3480404
Reid, E. M., & Toffel, M. W. (2009). Responding to public and private politics: Corporate disclosure of climate change strategies. Strategic Management Journal, 30(11), 1157–1178. https://doi.org/10.1002/smj.796
Sander, B. (2020). Freedom of expression in the age of online platforms: The promise and pitfalls of a human rights-based approach to content moderation. Fordham International Law Journal, 43(4), 939–1006.
Sander, B. (2021). Democratic disruption in the age of social media: Between marketized and structural conceptions of human rights law. European Journal of International Law, 32(1), 159–193.
Santos, G. F. (2021). Social media, disinformation, and regulation of the electoral process: A study based on 2018 Brazilian election experience. Revista de Investigações Constitucionais, 7, 429–449. https://doi.org/10.5380/rinc.v7i2.71057
Schotland, A. (1970). Private pension funds: A guide for modern investments. The Georgetown Law Journal, 59, 355.
Short, J. L., & Toffel, M. W. (2010). Making self-regulation more than merely symbolic: The critical role of the legal environment. Administrative Science Quarterly, 55(3), 361–396. https://doi.org/10.2189/asqu.2010.55.3.361
Sinclair, D. (2020). Speak loudly and carry a small stick: Prudential regulation and the climate, energy, and finance nexus. In Criminology and Climate (pp. 47–75). Routledge.
Srinivasan, D. (2021). Why Google dominates advertising markets. Stanford Technology Law Review, 24, 55.
Suijs, J., & Wielhouwer, J. L. (2019). Disclosure policy choices under regulatory threat. The RAND Journal of Economics, 50(1), 3–28. https://doi.org/10.1111/1756-2171.12260
Susskind, L., & Van Dam, L. (1986). Squaring off at the table, not in courts. Tech. Rev., July 1986.
Tar, J. (2024, August 22). Civil society criticises Commissioner Breton’s approach to EU digital rulebook. Euractiv. https://www.euractiv.com/section/platforms/news/civil-society-criticises-commissioner-bretons-approach-to-eu-digital-rulebook/
Tusikov, N. (2016). Chokepoints: Global private regulation on the internet (1st edition). University of California Press.
van de Kerkhof, J. (2023). Jawboning content moderation from a European perspective (SSRN Scholarly Paper No. 4695233). https://papers.ssrn.com/abstract=4695233
Weiser, P. J. (2004). Introduction: A regulatory regime for the internet age. J. on Telecomm. & High Tech. L., 3, 1.
Writers Guild of America, West, Inc. v. FCC, 423 F. Supp. 1064 (C.D. Cal. 1976). https://law.justia.com/cases/federal/district-courts/FSupp/423/1064/2393546/
Wu, T. (2011). Agency threats. Duke Law Journal, 60, 1841.
Zittrain, J. (2003). Internet points of control. BCL Rev., 44, 653.
Zuckerberg, M. (2024, August 26). Meta’s letter to the Committee on the Judiciary [Letter]. https://x.com/juanof9/status/1828245345635311990
Footnotes
1. On the internet, jawboning has been defined as “informal government efforts to persuade, cajole, or strong-arm private platforms to change their content-moderation practice” (See https://knightcolumbia.org/blog/channel/jawboning). It has also been defined as “statements by policymakers that threaten possible action, as opposed to announcing actual action” (Weiser, 2004), as “the threat of future regulation” (Keller, 2019, p. 5), as “informal means of persuasion and coercion, including the threat of regulation, to persuade platforms to adopt certain policies” (Leerssen, 2020).
2. I owe this observation to Linda Weigl.
3. The effort involved gathering data from public records, classifying them according to issues where informal methods of internet governance were likely, and creating a unified timeline of events that included public officials’ actions and corporate reactions, with the purpose of identifying correlations between the former and the latter. The public records gathered and classified were: United States bills on internet regulation introduced in both the House and the Senate (n=300); the public speeches collected in The American Presidency Project at the University of California, Santa Barbara, available at https://www.presidency.ucsb.edu/ (n=3154); Federal Trade Commission actions and speeches (n=322 and 70, respectively); internet-related data found in the EUR-Lex database, available at https://eur-lex.europa.eu/homepage.html (n=396); the European Commission’s press corner database, available at https://ec.europa.eu/commission/presscorner/ (n=6915); the data of CELE’s Observatorio Legislativo, covering bills from nine Latin American countries, available at https://observatoriolegislativocele.com/ (n=555); the data of the Letra Chica project, which recorded changes to Meta, X, and YouTube terms and conditions (no longer publicly available, n=690); the Platforms and Publishers database of the Tow Center for Digital Journalism, available at https://tow.cjr.org/platform-timeline/ (n=1245); and the data found in the policy blogs of Google, Meta, and X.com (formerly Twitter, n=1839). We gathered the data, classified it according to keywords and regular expressions, and merged it into a single data frame that could be visualized as a navigable timeline for research purposes. The analysis presented in this section is based on this database and visualization. The database can be found at: https://ramiroau.github.io/threats-timeline/.
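The workflow described in this footnote can be sketched in pandas. This is a minimal, hypothetical illustration only: the topic labels, regex patterns, and toy records below are invented for the example and do not reproduce the project’s actual coding scheme or data.

```python
import re

import pandas as pd

# Hypothetical topic patterns standing in for the keyword/regex classification
# described in the footnote; the real project's categories are not public here.
TOPIC_PATTERNS = {
    "section_230": re.compile(r"section\s*230|liability shield", re.I),
    "disinformation": re.compile(r"disinformation|fake news", re.I),
    "content_moderation": re.compile(r"content moderation|terms of service", re.I),
}

def classify(text):
    """Return the topic labels whose pattern matches the record's text."""
    return [topic for topic, pat in TOPIC_PATTERNS.items() if pat.search(text)]

def build_timeline(sources):
    """Merge {source_name: DataFrame(date, actor, text)} into one dated frame.

    Each record is tagged with its source and matching topics, then all
    sources are concatenated and sorted chronologically.
    """
    frames = []
    for name, df in sources.items():
        df = df.copy()
        df["source"] = name
        df["topics"] = df["text"].apply(classify)
        frames.append(df)
    timeline = pd.concat(frames, ignore_index=True)
    timeline["date"] = pd.to_datetime(timeline["date"])
    return timeline.sort_values("date").reset_index(drop=True)

# Toy records standing in for official speeches and platform policy posts.
speeches = pd.DataFrame({
    "date": ["2022-09-08"],
    "actor": ["White House"],
    "text": ["Renewed call to remove the Section 230 liability shield"],
})
policy_blogs = pd.DataFrame({
    "date": ["2022-10-01"],
    "actor": ["Platform"],
    "text": ["Update to our content moderation policies"],
})

timeline = build_timeline({"speeches": speeches, "policy_blogs": policy_blogs})
print(timeline[["date", "source", "topics"]])
```

A frame built this way can then be rendered as an interactive timeline, which is how correlations between officials’ statements and subsequent corporate reactions can be inspected.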
4. The coding captured all events that are dated in the District Court decision. Repeated events were coded only once.
5. (Missouri v. Biden, 2023b), 25 (“Are you guys fucking serious? I want an answer on what happened here and I want it today”)
6. Available at https://ec.europa.eu/commission/presscorner/home/en
7. See Breton, T. [@ThierryBreton] (2022), Breton, T. [@ThierryBreton] (2023a). Some letters, such as those Breton sent to X/Twitter and Meta in October 2023, created pushback from civil society. See Breton, T. [@ThierryBreton] (2023b), Breton, T. [@ThierryBreton] (2023c), Access Now, Article 19, et al. (2023).
8. Breton, T. [@ThierryBreton] (2023d), Breton, T. [@ThierryBreton] (2024a), Breton, T. [@ThierryBreton] (2024c), Breton, T. [@ThierryBreton] (2024d).
9. Breton, T. [@ThierryBreton] (2024b)
10. Breton, T. [@ThierryBreton] (2024f), Breton, T. [@ThierryBreton] (2024h).
11. Breton, T. [@ThierryBreton] (2024e)
12. European Commission Press Corner (2024a), European Commission Press Corner (2024b), Breton, T. [@ThierryBreton] (2024g).