Beyond GAFAM: How size-or-silo regulation fails to account for organisational diversity in the platform economy

Nicolas Friederici, Alexander von Humboldt Institute for Internet and Society, Berlin, Germany, nicolas.friederici@hiig.de
Inge Graef, Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University, Netherlands, I.Graef@tilburguniversity.edu

PUBLISHED ON: 11 Nov 2021

If Facebook whistleblower Frances Haugen’s recent visit to Brussels has made one thing clear, it is that there is now wide consensus that digital platforms need to be more strictly regulated. She was warmly received by a number of prominent EU politicians including Thierry Breton, the European Commissioner for the Internal Market. Many posed with her and shared photos on social media. Such a public display indicates that regulators have become keenly aware of the platform economy and its unwanted outcomes concerning democratic participation, but also the future of work, control over digital innovation, and national sovereignty.

While we welcome this increased attention, we are concerned that current attempts at platform regulation will fall short if they continue to focus on platform size, or if they remain limited to particular normative silos. The result may be that smaller and niche platforms continue to be neglected and fall further behind.

We observe that most regulatory and public debates are unbalanced in that they focus on a rather particular slice of the platform economy. To most of us, the term “platform” has become synonymous with GAFAM (Google, Apple, Facebook, Amazon, and Microsoft), plus a few large consumer-facing platforms like Airbnb, Uber, Spotify, or Zalando. Another set of platforms that has become part of the public consciousness is delivery and other gig work platforms like Delivery Hero or Helpling, again, because consumers are familiar with them and because they have led to overt social disputes. Intuitively drawing on these examples, public and regulatory discourse has settled on the narrative that the problem with platforms is that they are powerful gatekeepers that exploit users who depend on the platform for market participation (gig workers, merchants, app developers, etc.).

Implicitly or explicitly, the narrative of platforms-as-gatekeepers is based on some notion of market power, either towards other platforms or towards platform users. To determine whether a given platform has market power, regulation typically uses size, measured in terms of user base, market share or revenue. Notable regulatory efforts have set explicit size thresholds beyond which tougher restrictions apply. The EU’s proposed Digital Markets Act (DMA) explicitly targets gatekeepers that would be, for instance, prohibited from combining personal data across their services without user consent (Article 5(a)) and from treating their own services more favourably in rankings than services offered by third parties (Article 6(1)(d)). The proposed Digital Services Act (DSA) introduces obligations for ‘very large online platforms’ to manage systemic risks stemming from the functioning and use of their services, such as the dissemination of illegal content and negative effects for the exercise of fundamental rights (Article 26). Similarly, the German NetzDG prescribes that social media platforms with more than two million users in Germany are obligated to take down hateful content and to publicly report about complaints they receive.

The limitations of current size-or-silo regulation

We do not want to argue that these efforts are generally misguided. Platform size is a relatively clear and objective yardstick to set apart truly giant market actors, like GAFAM and their handful of peers. Because these mega platforms have an incomparably larger impact on users and markets, they should be subjected to stricter obligations than smaller ones (Graef & Van Berlo, 2021).

However, we posit that current platform regulation is limited in two critical respects. First, a separation of platform companies into “large” and “small” is reductionist and drowns out awareness of the nuances of platform business and governance models (in the context of the DSA, see Broughton Micova, 2021). At the minimum, regulators need to discern between advertising-based platforms, transaction and matchmaking-based ones, and operating systems or innovation platforms (Caffarra & Morton, 2021). Large European e-commerce platforms have recently argued along similar lines, noting that they should not be categorised as gatekeepers based on the number of online visitors but instead based on completed transactions. Even beyond broad business model categorisations, specific elements of platform governance, rather than size per se, may be the origin of problematic outcomes. As the Facebook files demonstrated quite clearly, this particular platform’s problematic outcomes are certainly augmented by its reach, but they originally stem from particular algorithmic management decisions that are core to its business model: Facebook’s content selection maximises engagement regardless of harmful consequences to users and public discourse, as argued by Lauer (2021) and Vaidhyanathan (2018).

Second, regulations are siloed while GAFAM are meta organisations (Kretschmer et al., 2020). Different regulations typically try to address different normative issues (such as data protection, competition, mergers and acquisitions, content moderation, sovereignty, labour protections, and so on) separately and independently of any particular market actor. Yet, for meta platforms, those normative issues are in fact interconnected and can reinforce each other. Siloed regulations and enforcement provide platforms with discretion to trade off key public interests normally balanced by legislators or regulators, such as the level of privacy, competition, and innovation available in the market (Geradin, Katsifis, & Karanikioti, 2021). In the absence of more integrated enforcement across legal regimes, this discretion can also allow platforms to use the parallel application of different regimes to their own advantage, for instance by relying on the need to protect the privacy of users as a justification to keep their platforms closed and limit competition from rivals (see the UK CMA’s investigation into Google’s Privacy Sandbox). In addition, the rationale that underlies most meta platforms’ business model is to maximise consumer convenience at all costs, thereby scaling fast and becoming the market leader that achieves user lock-in. But the perhaps more important and defining feature of GAFAM is the successful creation of product ecosystems and, with them, ecosystem-wide data-driven value creation. Meta platforms’ combined dominance over individual product markets and the ecosystem as a whole is what makes regulating them both necessary and difficult (Bourreau & De Streel, 2019). As such, there seems to be a mismatch between the fragmented way platforms are currently regulated and the way that dominant platforms behave in reality, continuing to expand and integrate new services into their ecosystems to the effect that different normative concerns become intertwined.

Ironically, both regulation by size and by normative silo can ultimately put smaller platforms at a disadvantage and thus have the opposite of the intended effect. While large platforms are able to exploit the gaps between different regulatory domains, smaller players may decide not to experiment by introducing new products or services because of uncertainty about the exact boundaries of the law. Furthermore, when ambitious standards are designed with GAFAM in mind, less well-resourced actors may struggle more to implement them than the original targets of regulation, as has been shown for the GDPR (Johnson, Shriver, & Goldberg, 2020). As a consequence, current regulatory regimes may perpetuate the already existing divide between large, powerful platforms and small, niche platforms that has resulted from the characteristics of these markets, like network effects, economies of scale and scope, and user lock-in.

The many modes of platform governance

We believe that such unintended consequences are avoidable and that European societies can do better. Normative approaches1 to the platform economy could be much more sophisticated and appropriate if they were grounded in a deeper understanding of its organisational diversity. Beyond the purview of mass consumer audiences and mainstream regulatory debate, a plethora of platform governance models have emerged which deviate significantly in the ways that value is created and power is accumulated and distributed.

Platform cooperatives (or coops) represent a kind of opposite extreme to GAFAM’s model. They seek to distribute value more fairly among users and even external stakeholders, typically by letting them participate in the platform’s decision-making. Many (if not most) platform coops identify as part of the global platform cooperativism movement, broadly seeking to create alternatives to ‘Big Tech’ and large corporations, to decentralise power and achieve fairer outcomes for all stakeholders. Due to platform coops’ conceptual and idealistic appeal, the movement has gathered a lot of attention within expert and academic circles.

While GAFAM and platform coops represent platform governance extremes (emphasis on size and ecosystem control vs. decentralisation and redistribution), there are certainly a lot of other models in-between. Often, such models are responses to changing consumer preferences, particular dynamics in some platform markets, or to stakeholder coordination problems. Examples can be found across various sectors. Niche social networks rely on active and community-based content moderation as a means to involve users and generate trust among them (Dinar, 2021). In the industrial internet of things, supplier consortia co-govern specialised innovation platforms, allowing suppliers to maintain platform-based online shops under their own brands (Friederici et al., 2020). Traditional firms that platformise their business tend to develop complicated and specialised governance structures that consider suppliers and internal forces, even where they directly compete with GAFAM (Fürstenau et al., 2019). Complex and institutionally dense sectors like healthcare require multi-stakeholder platform governance that balances diverse interests while adhering to sector-specific regulations. Even just among marketplace platforms, more convoluted governance models, which attribute important roles to intermediaries and off-platform interactions, often prevail in specialised business-to-business markets (Meier et al., forthcoming).

Our concern is that, even where such platforms are not the explicit target of regulation, they can be indirectly affected. Moreover, individually and collectively, these platforms may deserve the opposite of restrictions: it may make sense for national and European digital industrial policy to support the diversity and scalability of various forms of platform governance, especially where it offers more decentralised and open approaches. Notably, many such platforms are not all about size and dominance; they often try to solve niche-specific coordination problems (like Resonate, Stocksy, or Quartz OA) or offer more ethical alternatives for idealistic reasons (like FairBnB or Ecosia) (Karanovic, Berends, & Engel, 2020). These platforms may be satisfied to remain at medium size or they may want to stay small. While platforms that deviate from the scaling-at-all-cost and ecosystem control model are not, and probably will not become, as large as GAFAM, they oftentimes create various forms of value (markets, interconnection, flexibility, autonomy, etc.) without the same problematic outcomes, and may therefore warrant active promotion.

Public institutions for holistic and active platform regulation, oversight, and promotion

Realistically, regulation by size and by regulatory domain will remain part of the policy and legislative toolbox, and some level of fragmentation may be inevitable. However, gaps and tensions should be reduced as much as possible. Ultimately, neither a one-sided understanding of platforms as gatekeepers in individual product markets nor a single-minded focus on size as a trigger for public scrutiny represents a useful rationale for platform regulation. This is particularly relevant now that new legislative proposals, such as the DMA and DSA, are being adopted, which again introduce their own definitions, obligations and enforcement mechanisms. At the minimum, regulation must minimise inadvertent limiting effects on platforms that are not actually a cause for concern, and avoid setting incentives for platforms to stay small.

Yet, regulators should be more ambitious. They have to take seriously notions of structural importance, like the concept of systemic risk in the DSA or the increasingly widespread social-scientific view of platforms-as-infrastructure (e.g., Bohn, Friederici, & Gümüsay, 2020). Notions of gatekeeping and proxies like size are a starting point, not an end point, in determining which particular platforms have infrastructural relevance and need to be regulated as such. Non-infrastructural platforms—even those that are very large—may be more appropriately regulated through traditional domain-specific regimes, such as on privacy or antitrust.

Beyond targeting the most powerful platforms, it is important to keep in mind that regulation can also help create opportunities for smaller, niche platforms to scale up. Regulation is often seen as creating costly obligations for the market players involved. However, it can also be used strategically to protect or expand one’s existing business. For instance, the General Data Protection Regulation provides individuals with a right to transfer their personal data to a new provider (Article 20). Small platforms could rely on this right to attract new customers who may otherwise remain locked into the services of a large incumbent platform where they have built up an entire profile (Graef, Husovec, & Purtova, 2018). While the creation and use of such rights will not be a silver bullet, this does illustrate how a wider regulatory toolbox and better awareness or more proactive use of existing legal possibilities can contribute to more organisational diversity in the platform economy.

Ultimately, to achieve both an appropriate application of regulatory regimes and a more effective promotion of organisational diversity, in our view, there is no way around holistic assessments of individual platform companies’ business and governance models. For such an effort, we need to think beyond and outside of existing institutional structures. While the efforts of civil society actors to monitor, rate, and accredit platforms are laudable (see for instance the Ranking Digital Rights index or the Fairwork project), public institutions will eventually have to formalise and amplify them. Independent regulatory bodies that directly oversee companies (i.e., beyond the enforcement of abstract law) already exist in a number of sectors such as telecommunications, utilities, media, food production, pharmaceuticals, and defence. The time is now for regulators, preferably at EU level, to develop an institutional structure with a similar capacity and mandate for the platform economy. Whether such a body is formed by a publicly funded third party (such as the European co-operation for Accreditation) or directly as a public agency (such as the EU’s decentralised agencies), the key will be that it remains independent, has sufficient technical capacity, and has some enforcement mechanism. Only once our public institutions can cope with the organisational diversity of the platform economy will we be able to move beyond the size-or-silo status quo and steer it towards fairer and more sustainable outcomes.

Acknowledgments

This piece was inspired by the discussion following a Lunch Talk Seminar organised by the Humboldt Institute for Internet & Society (HIIG)’s Innovation, Entrepreneurship & Society Group, for which the authors were joined by Jovana Karanovic and Christian Fieseler. Their ideas are gratefully acknowledged.

References

Bohn, S., Friederici, N., & Gümüsay, A. A. (2020). Too big to fail us? Platforms as systemically relevant. Internet Policy Review. https://policyreview.info/articles/news/too-big-fail-us-platforms-systemically-relevant/1489

Bourreau, M., & De Streel, A. (2019). Digital conglomerates and EU competition policy. ETNO. http://www.crid.be/pdf/public/8377.pdf

Broughton Micova, S. (2021). What is the harm in size? Very large online platforms in the Digital Services Act [Issue Paper]. Centre on Regulation in Europe. https://cerre.eu/wp-content/uploads/2021/10/211019_CERRE_IP_What-is-the-harm-in-size_FINAL.pdf

Caffarra, C., & Morton, F. S. (2021, January 5). The European Commission Digital Markets Act: A translation. VoxEU CEPR. https://voxeu.org/article/european-commission-digital-markets-act-translation

Dinar, C. (2021). The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act [E-Paper]. Heinrich Böll Stiftung. https://eu.boell.org/sites/default/files/2021-06/HBS-e-paper-state-platform-moderation-for-LGBTQI-200621_FINAL.pdf?dimension1=zora2021

Friederici, N., Krell, T., Meier, P., Braesemann, F., & Stephany, F. (2020). Plattforminnovation im Mittelstand. Zenodo. https://doi.org/10.5281/ZENODO.4291999

Fürstenau, D., Rothe, H., Baiyere, A., Schulte-Althoff, M., Masak, D., Schewina, K., & Anisimova, D. (2019). Growth, Complexity, and Generativity of Digital Platforms: The Case of Otto.de. ICIS 2019 Proceedings, 2781. https://aisel.aisnet.org/icis2019/is_heart_of_innovation_ecosystems/innovation_ecosystems/12

Geradin, D., Katsifis, D., & Karanikioti, T. (2021). Google as a de facto privacy regulator: Analysing the Privacy Sandbox from an antitrust perspective. European Competition Journal, 1–65. https://doi.org/10.1080/17441056.2021.1930450

Graef, I., Husovec, M., & Purtova, N. (2018). Data Portability and Data Control: Lessons for an Emerging Concept in EU Law. German Law Journal, 19(6), 1359–1398. https://doi.org/10.1017/S2071832200023075

Graef, I., & Van Berlo, S. (2021). Towards Smarter Regulation in the Areas of Competition, Data Protection and Consumer Law: Why Greater Power Should Come with Greater Responsibility. European Journal of Risk Regulation, 12(3), 674–698. https://doi.org/10.1017/err.2020.92

Johnson, G., Shriver, S., & Goldberg, S. (2020). Privacy & Market Concentration: Intended & Unintended Consequences of the GDPR. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3477686

Karanovic, J., Berends, H., & Engel, Y. (2020). Managing the Identity-Size Paradox in Platforms: The Case of Platform Cooperatives. Academy of Management Proceedings, 2020(1), 21166. https://doi.org/10.5465/AMBPP.2020.21166abstract

Kretschmer, T., Leiponen, A., Schilling, M., & Vasudeva, G. (2020). Platform ecosystems as meta‐organizations: Implications for platform strategies. Strategic Management Journal. Advance online publication. https://doi.org/10.1002/smj.3250

Lauer, D. (2021). Facebook’s ethical failures are not accidental; they are part of the business model. AI and Ethics, 1(4), 395–403. https://doi.org/10.1007/s43681-021-00068-x

Meier, P., Krell, T., & Friederici, N. (n.d.). Why quality beats quantity for B2B network effects (Forthcoming). Working Paper.

Vaidhyanathan, S. (2018). Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. Oxford University Press.

Footnotes

1. We deliberately speak of normative approaches in a broad sense, including regulation through laws, oversight by government agencies and third parties, accountability mechanisms like boards and mediation, self-regulation, and codes of conduct.