The case for prosocial tech design governance

Lisa Schirch, University of Notre Dame, United States of America

PUBLISHED ON: 31 Mar 2025

This paper is part of Content moderation on digital platforms: beyond states and firms, a special issue of Internet Policy Review guest-edited by Romain Badouard and Anne Bellon.

Disclaimer

The author co-chairs the Council on Technology and Social Cohesion.

Most tech regulatory frameworks mandate quick removal of illegal or harmful content but do not address the underlying design choices that facilitate the spread of such content in the first place. The scale of harmful content online keeps growing, while tech companies' capacity and political will to support online trust and safety are lagging. Industrial-scale digital propaganda campaigns are increasing even as massive tech layoffs shrink content moderation teams, and a political backlash over content moderation is fuelling claims of censorship. Tech regulators should concentrate more on governing design rather than solely on managing content.

Tech insiders have pointed out that, despite massive investments and the deployment of thousands of content moderators, the current model is a manic game of whack-a-mole, a continuous battle against a rising tide of harmful content. This reactive model fails to address the root causes: the design elements and algorithms that incentivise divisive content. Current tech governance strategies amount to trying to purify a contaminated river one cup at a time, an image vividly portrayed by journalist Maria Ressa in her Nobel Peace Prize acceptance speech. Prosocial tech design governance offers an alternative by focusing "upstream" on the design choices that steer human behaviour away from harm and toward healthy communities.

Tech design is not neutral

In my research with staff at major tech companies, most attributed the problem of harmful content to users, describing tech platforms as "neutral mirrors" that merely reflect the worst aspects of human behaviour without influencing it. Malicious individuals do exploit these platforms, but the technology itself is not a neutral backdrop. Design choices significantly shape public discourse and societal cohesion: all tech design persuades users to behave in certain ways, and it can encourage harm or discourage it. Design decisions determine what users can do, what they cannot do, and what they are subtly encouraged to do through elements like endless scroll or Like buttons.

Companies employ sophisticated psychological techniques to engage users in ways that often prioritise advertising and engagement over safety. Some of these mechanisms exploit human vulnerabilities, encouraging behaviours that maximise engagement but degrade the quality of public discourse. In some cases, the most extreme voices gain the most algorithmic attention, deepening societal divisions. Some tech platforms resemble digital gladiator arenas, where an audience watches a few people argue; others resemble digital towers of Babel, where people simply cannot hear or understand each other. This is likely not intentional, but design that fosters understanding is largely missing.

Toward prosocial tech design

Prosocial tech design primarily focuses on shaping digital spaces to reduce harm and encourage positive social interactions. Several civil society organisations have already started to pave the way with initiatives to foster better digital spaces. The Prosocial Design Network, for instance, offers a library of evidence-based design interventions. New_Public supports communities in creating more inclusive digital public spaces. In February 2023, a coalition of groups organised a conference in San Francisco titled “Designing Technology for Social Cohesion,” which was attended by more than 200 experts from the tech community and professionals in bridge building and peacebuilding.

Prosocial tech design governance offers an approach to the problem of harmful content online that complements, but is distinct from, content governance. Design governance focuses on the design principles that determine how a platform operates and what it "affords" users: algorithms that neither discriminate against users nor amplify divisive content, and user interfaces that encourage thoughtful interaction rather than impulsive reactions.

Effective regulatory approaches could include mandating basic tech building codes. For example, the Neely Design Code, developed at the University of Southern California's Neely Center for Ethical Leadership and Decision-Making by experts and practitioners, identifies evidence-based minimum standards for companies that host online social interactions. One provision would ensure that platforms let users easily opt out of revenue-maximising design features that encourage greater usage (e.g., optimising for time spent, infinite scroll, auto-play). Another would have platforms offer privacy by default rather than requiring users to change settings manually to protect their privacy.
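To make these two provisions concrete, here is a minimal sketch of how a platform might encode them in its account settings. All names (AccountSettings, opt_out_of_engagement_features, and so on) are hypothetical illustrations invented for this example, not an API drawn from the Neely Design Code itself.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    # Privacy by default: the most protective values are the defaults,
    # so users never have to change anything to be protected.
    profile_public: bool = False
    location_sharing: bool = False
    personalised_ads: bool = False

    # Engagement-maximising features may default on, but each is a
    # single user-facing switch -- an accessible opt-out.
    infinite_scroll: bool = True
    autoplay: bool = True

    def opt_out_of_engagement_features(self) -> None:
        """One-tap opt-out of all revenue-maximising design features."""
        self.infinite_scroll = False
        self.autoplay = False


settings = AccountSettings()               # protective privacy defaults
settings.opt_out_of_engagement_features()  # single accessible opt-out
assert not settings.autoplay and not settings.infinite_scroll
```

The design point is that compliance lives in the defaults and in the reachability of the opt-out, not in buried sub-menus.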

More advanced prosocial tech designs would offer nudges and coaching prompts to foster prosocial norms on platforms. Other ideas include user verification tags and a wider variety of reaction buttons, which could create new incentives for users to respect, or learn from, people with whom they disagree.
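As a toy illustration of how such reaction buttons could change incentives, the sketch below weights bridging signals more heavily than a bare "like" when scoring a post. The reaction set and the weights are invented here purely for illustration; they are not drawn from any existing platform.

```python
from enum import Enum

class Reaction(Enum):
    LIKE = "like"
    RESPECT = "respect"   # "I disagree, but this is a fair point"
    LEARNED = "learned"   # "this informed or changed my view"

# Hypothetical ranking weights favouring bridging signals over raw likes.
RANKING_WEIGHTS = {
    Reaction.LIKE: 1.0,
    Reaction.RESPECT: 2.0,
    Reaction.LEARNED: 2.5,
}

def engagement_score(reactions: list[Reaction]) -> float:
    """Score a post so that respectful cross-divide reactions count more."""
    return sum(RANKING_WEIGHTS[r] for r in reactions)
```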

Borrowing from the sustainability movement, a "LEED standard" for tech design could include features that facilitate a functional public square, enabling public decision-making and fostering trust among community groups and between the public and institutions. Civic tech is already demonstrating that platforms can benefit democracy and social cohesion. Deliberative technologies like Pol.is and Remesh enhance public discourse by focusing on common ground rather than differences, using algorithms that highlight consensus and public problem-solving rather than conflict.
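The sketch below is a simplified illustration of the consensus-first ranking idea behind such deliberative tools, not their actual algorithms: real systems like Pol.is cluster participants by voting patterns and use richer statistics, whereas here the opinion groups are assumed given, and a statement's score is simply its lowest approval rate across groups, so only statements every group supports rise to the top.

```python
def consensus_score(votes_by_group: dict[str, list[int]]) -> float:
    """votes_by_group maps an opinion group to agree(1)/disagree(0) votes;
    the score is the weakest approval rate across all groups."""
    approval_rates = [
        sum(votes) / len(votes) for votes in votes_by_group.values() if votes
    ]
    return min(approval_rates) if approval_rates else 0.0


statements = {
    "Fund local parks": {"group_a": [1, 1, 1, 0], "group_b": [1, 1, 0, 1]},
    "Abolish the council": {"group_a": [1, 1, 1, 1], "group_b": [0, 0, 0, 1]},
}

# Rank by cross-group consensus rather than raw engagement: the divisive
# statement scores low even though one group loves it.
for text, votes in sorted(statements.items(),
                          key=lambda kv: consensus_score(kv[1]),
                          reverse=True):
    print(f"{consensus_score(votes):.2f}  {text}")
```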

Regulators could also require platforms to demonstrate how their design choices align with public interest standards, potentially through a new framework of "design audits". Regulations could likewise require some level of algorithmic transparency to ensure that recommendation algorithms do not favour sensationalist content. Additionally, "algorithmic impact assessments", similar to environmental impact assessments, could proactively ensure that new technologies do not harm social cohesion before deployment.
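One check an algorithmic impact assessment might run before deployment is an offline comparison of how much divisive content a candidate ranker surfaces versus the current one on the same held-out items. This is a minimal sketch under stated assumptions: the divisiveness labels, the 5% threshold, and all function names are illustrative policy choices, not an established audit standard.

```python
from typing import Callable

Item = dict  # each item: {"id": str, "divisive": bool, "score": float, ...}

def divisive_share(ranker: Callable[[list[Item]], list[Item]],
                   items: list[Item], top_k: int = 10) -> float:
    """Fraction of the top-k ranked items labelled divisive."""
    top = ranker(items)[:top_k]
    return sum(item["divisive"] for item in top) / max(len(top), 1)

def passes_impact_assessment(candidate: Callable, baseline: Callable,
                             items: list[Item],
                             max_increase: float = 0.05) -> bool:
    """Block deployment if the candidate amplifies divisive content
    noticeably more than the baseline (the threshold is a policy choice)."""
    return (divisive_share(candidate, items)
            <= divisive_share(baseline, items) + max_increase)


# Tiny usage example with synthetic items and a score-based ranker.
items = [{"id": str(i), "divisive": i % 3 == 0, "score": float(i)}
         for i in range(30)]

def rank_by_score(xs: list[Item]) -> list[Item]:
    return sorted(xs, key=lambda x: x["score"], reverse=True)

print(passes_impact_assessment(rank_by_score, rank_by_score, items))  # True
```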

I co-chair the Council on Technology and Social Cohesion, a global network of tech and peacebuilding experts. With many other members of our network, we call for a comprehensive approach to prosocial tech design governance, incorporating government incentives, community-led processes, and international collaboration. Our goal is not just to mitigate the harms of current technology designs but to reimagine the framework within which technology is created and governed. By focusing on designs that prioritise human values and societal well-being, we can transform the landscape of digital interaction to support a more democratic, peaceful, and cohesive society.