Introduction to the special issue: The craft of interdisciplinary research and methods in public interest cybersecurity, privacy, and digital rights governance
Abstract
Rather than treating collaboration and method as a background detail, this special issue foregrounds the innovative, often messy, and reflexive practices through which interdisciplinary research on public interest cybersecurity, privacy, and digital rights is carried out. By treating “craft” as an object of analysis, the collection illuminates the dynamics of partnership and translation that shape the social and political impact of this work.
Our main argument is that by opening the craft of interdisciplinary method to more explicit scrutiny, this collection provides a novel space to examine how knowledge in these domains is made, contested, and reshaped. While it does not claim to resolve the monumental, planetary wickedness of contemporary governance challenges, it does seek to equip scholars, practitioners, and policymakers with deeper insights into the strategies that favour more effective responses.
Papers in this special issue
-
Introduction to the special issue: The craft of interdisciplinary research and methods in public interest cybersecurity, privacy, and digital rights governance
Adam Molnar, Sociology and Legal Studies, University of Waterloo, Canada
Diarmaid Harkin, School of Humanities and Social Sciences, Deakin University, Australia
Urs Hengartner, David R. Cheriton School of Computer Science, University of Waterloo, Canada
-
Avoiding the kitchen sink: A guide to mixed methods approaches within digital rights governance
Gabrielle Lim, University of Toronto
Noura Aljizawi, University of Toronto
Shaila Baran, University of Toronto
Nicola Lawford, MIT
-
Beyond silos: Bridging the gap between law and software engineering – challenges, successes, and lesson drawing
Martina Siclari, University of Luxembourg
Salomé Lannier, University of Luxembourg
Olivier Voordeckers, University of Luxembourg
Stanisław Tosza, University of Luxembourg
Sallam Abualhaija, University of Luxembourg
Marcello Ceci, University of Luxembourg
Nicolas Sannier, University of Luxembourg
Domenico Bianculli, University of Luxembourg
-
Governing phygital spaces: Human rights by design meets speculative design
Tehilla Shwartz Altshuler, Israel Democracy Institute
Rachel Aridor Hershkovitz, Israel Democracy Institute
Romi Mikulinsky, Aalto University
Boris Müller, Fachhochschule Potsdam
-
Calibrating collaboration: Interdisciplinarity in security research
-
Colliding ideas: Artistic explorations of data surveillance and data protection
Lucy Royal-Dawson, University of Ulster
Katherine Nolan, Technological University Dublin
Eugene McNamee, Ulster University
Laura O’Connor, Ulster University
Emma Campbell, Ulster University
Anna Pathé-Smith, The Open University
Kyle Boyd, Ulster University
Daniel Philpott, Ulster University
-
Developing the Citizen Summit Method: Understanding citizens’ views on digital surveillance technologies
Sally Dibb, Manchester Metropolitan University
Kirstie Ball, University of St Andrews Business School
Sara Degli-Esposti, Institute of Philosophy of the Spanish Research Council (CSIC)
-
Leveraging interdisciplinary methods for evidence collection in enforcement: Dark patterns as a case study
Johanna Gunawan, Maastricht University
Colin M. Gray, Indiana University
Cristiana Santos, University of Utrecht
Nataliia Bielova, Université Côte d’Azur
-
Troubling translation: Sociotechnical research in AI policy and governance
Serena Oduro, Data & Society Research Institute
Alice E. Marwick, Data & Society Research Institute
Charley Johnson, Data & Society Research Institute
Erie Meyer, Georgetown Institute for Technology & Law
1. Introduction: Beyond interdisciplinary findings, toward interdisciplinary craft
Every governance challenge today is, to varying degrees, a challenge of technology governance. From security to healthcare to democratic participation, the infrastructures that sustain public life are increasingly digital, networked, and data-centric. Datafication, surveillance, and artificial intelligence in the form of algorithmic decision-making systems (ADS) both large and small permeate work, health, and security, form a vital bedrock of democratic participation, and mediate the boundaries of human rights themselves. This pressing reality has positioned questions of cybersecurity, privacy, and digital rights not as merely peripheral technical or legal addenda, but as central to the practical exercise of governance itself.
These challenges are also profoundly wicked problems. First attributed to Rittel and Webber (1973), wicked problems describe social and policy challenges that resist clear, attainable solutions. Such problems have no definitive formulation, as the way the problem is defined shapes its potential solution (pp. 161-162). Framing privacy as a technical issue, for instance, leads to vastly different interventions than framing it as a basic human right rooted in social values (Roessler and Mokrosinska, 2015). They are further defined by the often-contradictory goals and struggles of diverse interest holders and the fact that they are ongoing problems with no clear end solution (p. 162). Challenges like data privacy risks are not problems that can be permanently solved, but wicked problems that evolve with technology and demand constant intervention. This concept is crucial for technology governance because it highlights the inherent limits of technological solutionism – the flawed belief that purely technical fixes can remedy complex, digitally-mediated harms (Morozov, 2013).
As wicked problems, these governance dilemmas involve the weaving together of technical architectures, legal norms and jurisdictions, economic incentives, public/private authority, and cultural practices. This complex and often conflicting interplay creates profound challenges for any single discipline and demands a comprehensive, interdisciplinary scholarship that can account for these deeply interconnected dynamics. Such scholarly accounts are, in turn, vital for constructing effective policy. Indeed, regulatory interventions that target one domain, whether through a new technical protocol, a legislative reform, or a market incentive, can inadvertently reverberate across others, shifting risks, generating new harms (Bennett Moses, 2013), or producing fresh tensions about ‘who gets what, when, and how’ (Lasswell, 2018 [1936]).
Precisely because of this complex wickedness of ‘socio-technical problems’, interdisciplinary research has flourished across law, computer science, engineering, and the social sciences to map and expose harms, clarify the limits of legal frameworks, and better inform prospects for regulatory design and social justice-oriented interventions.
Paradoxically, however, while research on these issues is burgeoning across law, computer science, the social sciences, and advocacy communities, our collective ability to more systematically understand and address them in critically reflexive ways is still nascent. Advancing a holistic understanding geared for policy impact relies not just on the more conventionally familiar challenge of translating scholarly knowledge into policy environments (see Weiss, 1979), but on bridging the diverse approaches inherent to our disciplines as a vital prerequisite for doing so effectively. In the absence of such integration, the story often goes something like this: social scientists offer refined understandings of a range of social impacts but may lack ‘code-level’ awareness; law scholars focus on doctrinal methods but can miss technical realities and social harms; and computer scientists, while fluent in the technological environment, often adhere to engineering rationalities that overlook cultural biases, the role of private authority, and broader regulatory contexts.
This challenge of disciplinary isolation is not merely an abstract problem, played out through these familiar anecdotes; it is one we have encountered firsthand on multiple projects. As guest editors of this issue, our own interdisciplinary collaborations – spanning studies of employee monitoring applications (also known as bossware) (Thompson and Molnar, 2023; others forthcoming), consumer spyware (also known as stalkerware), and the use of technology in domestic and family violence (Molnar and Harkin, 2019; Harkin and Molnar, 2021; Parsons et al., 2019) – have sat at this precise, difficult intersection. Our projects required deep methodological (and data) triangulation, combining computer security methods (including code-level analysis and interface-level analyses using HCI-based assessments) with evaluations of harm across a range of social science disciplines, and assessments of the practical adequacy of existing relevant laws and regulations across a range of legal areas and jurisdictions. Crucially, this work was not done in an academic vacuum. It was built on deep partnerships with civil society, including domestic and family violence services, labour unions, and civil liberties and privacy associations. This complete ‘lab-to-policy’ process – from technical analyses, to user-focused studies, to legal/regulatory analysis, to briefing regulators – taught us firsthand that there is no established playbook for bridging these divides. Rather, it is a negotiated, practical, and often difficult ‘craft’ in its own right. It is this direct experience with the methodological, relational, and epistemological challenges of integration that motivated us to assemble this special issue.
These epistemological differences are also expressed as contrasting normative orientations. The distinction made by critical social theorist Nancy Fraser (1995) between ‘affirmative’ and ‘transformative’ remedies to social injustice helps us clarify how different fields orient towards interpreting and responding to socio-technical problems. According to Fraser, “affirmative remedies for injustice” refers to responses that aim to correct “inequitable outcomes of social arrangements without disturbing the underlying framework that generates them” (1995, p. 82). By contrast, “transformative remedies” refers to responses that aim to correct inequitable outcomes by “restructuring the underlying framework that generates them” (1995, p. 82). The key focus of this distinction is “end-state outcomes versus the processes that produce them” (1995, p. 82). While Fraser was writing in the context of cultural misrecognition and economic inequity, this analytical framework is central to understanding the political limits of diverse fields converging on technology governance. In essence, it provides a crucial vocabulary for interrogating whether our proposed interventions are affirmative fixes or transformative responses to these wicked problems, while acknowledging that both are necessary – often with affirmative remedies serving as vital stop-gap measures on the path to deeper, structural change.
This distinction provides a useful lens for examining how various fields engage with technology governance, whether their interventions offer immediate, corrective fixes or seek to transform the systems that produce harm. Computer science interventions, for example, frequently advance affirmative change. They mitigate risks by patching vulnerabilities or driving the adoption of technical standards, without altering the underlying structures (such as perverse market incentives) that generate them. Legal scholarship often defaults to an affirmative stance as well, relying on frameworks like contract law and “informed consent” to legitimise business models that are fundamentally extractive (Sadowski, 2019). However, the law can also straddle both domains, offering pathways for transformative reforms such as banning surveillance advertising. Yet, its transformative potential can also be constrained; by ‘centering’ the structural logic of the nation-state, it may affirm colonial logics and undermine indigenous norms of data sovereignty, such as the principles of Ownership, Control, Access, and Possession (OCAP®) (FNIGC, 2024; Fullenwieder and Molnar 2018). The social sciences, with their critical orientation, may align most closely with transformative change by seeking to uncover and respond to the root causes of surveillance and other digital harms, such as framing technology-facilitated violence not exclusively as a technical problem but as a phenomenon rooted in harmful gender norms (e.g., Bailey et al., 2019), or critiquing how so-called ‘privacy-preserving AI techniques’, offered as a lawfully compliant solution to employee surveillance, work to obscure transformative solutions rooted in collective rights, structural accountability, and democratic participation (Longiaru et al., 2025).
Left in isolation, these powerful lenses can produce contrasting or even counter-productive outcomes. This makes the craft of integration – the collaboration, translation, and negotiation central to developing knowledge and effective governance – a critical site of inquiry. Yet it is precisely this craft that has received far less attention, rarely becoming the focus of analysis in its own right.
This reveals a critical first gap: a lack of reflexive attention to how interdisciplinary collaborations are assembled, sustained, and sometimes troubled. A second, related gap concerns intersectoral dynamics: the engagements between researchers and civil society actors, journalists, policymakers, and even industry (albeit with great risk, see Goldenfein and Mann, 2023). While these collaborations are essential for impact, they are heavily mediated by diverging incentives and constraints, yet they too remain critically under-examined – outside of our café or pub visits, of course.
This special issue takes up these gaps directly. Rather than treating collaboration and method as a background detail, it foregrounds the innovative, often messy, and reflexive practices through which interdisciplinary research on public interest cybersecurity, privacy, and digital rights is carried out. By treating “craft” as an object of analysis, this collection illuminates the dynamics of partnership and translation that shape the social and political impact of this work.
Our main argument is that by opening the craft of interdisciplinary method to more explicit scrutiny, this collection provides a novel space to examine how knowledge in these domains is made, contested, and reshaped. While it does not claim to resolve the monumental, planetary wickedness of contemporary governance challenges, it does seek to equip scholars, practitioners, and policymakers with deeper insights into the strategies that favour more effective responses.
To that end, the articles in this collection are clustered into three main areas. The first cluster provides a conceptual groundwork to explore the interdisciplinary dimensions of methodological craft. The second turns our focus inward on the practical dimensions of calibrating interdisciplinary teams. And the final cluster explores the ‘external’ craft of bridging scholarly networks with the public and policy actors to leverage impact.
2. A guide to the contributions
2a. The foundations of methodological craft
A core challenge facing interdisciplinary research in public interest cybersecurity, privacy, and digital rights is the strategic craft of research and method design itself. Emerging socio-technical challenges demand novel approaches that existing methods from single disciplines struggle to provide. However, without a deliberate strategy on how to move forward, researchers risk a “kitchen sink” approach in which integrating more methods in ad-hoc ways does not directly translate into better methodology (or even better explanatory accounts). The first cluster of articles in this issue tackles this foundational problem: how to foreground the innovative and strategic thinking required to build sound and creative interdisciplinary research design that effectively leverages the comparative strengths of each involved discipline.
To that end, the first article from Aljizawi, Baran, Lawford, and Lim (2025) prompts us to “avoid the kitchen sink” in our design approaches. They argue that while interdisciplinary research on a range of matters relating to ‘the digital’ can encourage creative and innovative thinking from a methodological point of view, there is always a series of trade-offs involved when mixing and combining methods and research strategies. As mentioned, more methods do not necessarily lead to a better methodology, and indeed, less can be more. Aljizawi and colleagues demonstrate that ‘kitchen sink’ approaches in interdisciplinary projects can undermine the core task of reliable knowledge generation, diminish explanatory value, or muddy the takeaway message for policymakers. The authors show how multi-quant, multi-qual, and mixed-methods studies have their own intrinsic shortcomings that should be acknowledged, with greater thought given to how methods are tied together within research objectives. As the authors state, “In cybersecurity research, mixed methods help researchers not only to investigate ‘what’ happened and ‘how frequently,’ but also to ask ‘why,’ ‘how,’ ‘to whom,’ or ‘under which conditions’” (Aljizawi et al., 2025, p. 18). The authors provide an illuminating guide to help us think strategically about how diverse methods are selected and integrated, and importantly, what the consequences of those choices are for the research outputs and their implications for policy value.
Offering a concrete case study that responds directly to these challenges, Siclari, Lannier, Voordeckers, Tosza, Abualhaija, Ceci, Sannier, and Bianculli (2025) detail a collaboration between law and software engineering researchers while developing a tool for assessing GDPR compliance of data collection and processing in Fintech mobile applications. They present a structured methodology and best practices for bridging these specific fields, providing a practical framework for overcoming common hurdles. Their discussion highlights early-stage challenges like establishing common conceptual and operational ground and defining a realistic research scope. They detail their hybrid process for translating abstract legal provisions, noting that a “crucial aspect of this process was the establishment of clear drafting instructions and a structured format for documenting requirements” (p. 16) and that “[e]nsuring traceability to the legal source for each requirement was particularly valuable” (p. 16). The result is a replicable framework that successfully bridges the gap between abstract legal analysis and concrete technical observations.
Finally, in a contribution that directly embodies this special issue’s focus on methodological craft, Altshuler, Aridor-Hershkovitz, Mikulinsky, and Müller (2025) explore the governance of emerging “phygital spaces” – hybrid environments created by smart glasses and AI. Arguing that existing frameworks like Privacy-by-Design, Human-Rights-by-Design, Ethics-by-Design, and ‘Responsible AI’ are insufficient to address the relational and collective harms of these technologies, the authors discuss how they built a new interdisciplinary approach from the ground up, embracing disciplinary differences as “productive frictions”. Through a phased methodology that combines legal analysis with speculative design, storytelling, and gamified role-playing workshops, they reveal regulatory blind spots. The result is what they call the ethics of interactions framework, a novel conceptual tool that emerged directly through scenario-based role play combined with regulatory and ethical mapping. Their work serves as a notable demonstration of how forward-looking methods are not just tools for analysis, but can work as generative instruments for creating new regulatory vocabularies suited for the wicked problems of our AI-permeated future. While their focus on ensuring equitable terms of engagement within these phygital spaces is a crucial contribution, it raises a parallel question about the choice of engagement itself. What of the right to not interact, and a community’s choice to reject a phygital future entirely? Exploring this right to refusal invites a different interdisciplinary craft – one that prioritises community self-determination and data sovereignty over designing fairer systems of interaction.
Collectively, these articles establish that integrated methodological design is not a passive precursor to research but is itself a primary site of craft with notable implications. They demonstrate how strategic, creative, and even speculative approaches to methods are essential for generating novel insights and fine-tuning regulatory vocabularies.
2b. The internal craft of calibrating interdisciplinary teams
Even with a rigorous and appropriately scoped methodological design, the craft of interdisciplinary research is about much more than the well-scoped integration of methods. The ‘doing’ of interdisciplinary research into public interest cybersecurity, privacy, and digital rights also comes down to the practical working dynamics of an interdisciplinary team. The challenge of bridging misaligned disciplinary languages, epistemological assumptions, and conflicting institutional incentives can derail even the most promising projects. The papers in this cluster provide an enriching opportunity to reflect on topics that are almost uniformly excluded from the finished product of interdisciplinary work (the journal article or policy report) – namely, the practical conduct and challenges of making teams work.
Several challenges impinge on taking the ‘next step’ in systematising interdisciplinary tech research. In their analysis of interdisciplinary security research teams, drawing on the literature and original interviews with experts, McGregor and Thompson (2025) bring concreteness to the practical difficulties of these partnerships. They distill these challenges into four key themes: the need to establish a shared vocabulary; the challenge of reconciling different epistemological assumptions; the importance of balancing the distinct “currencies” or valued outputs of different fields; and the management of divergent resource expectations. Their contribution aptly reminds us that the craft of interdisciplinary research is as much an enterprise in constructive team building and collaborative project management as it is in methodology. And when nothing else can be planned for, an experimental “leap of faith” into the interdisciplinary-unknown is still the best recommendation.
Building on this idea of experimentation, Royal-Dawson, Nolan, McNamee, O’Connor, Pathé-Smith, Boyd, and Philpott (2025) offer a compelling case study of what such a leap can achieve. Their “LawTech Collider” project brought together digital technology law scholars and creative artists in a collaboration that culminated in a public art exhibition. Conceived as a “collision of ideas,” the project deliberately embraced uncertainty to generate unexpected outcomes for both collaborators and audiences. The artists’ ‘craft’ wasn’t just about the law; it became a method to investigate, mediate, and translate it. They transformed dense Terms & Conditions into accessible visual designs, created physical metaphors – like a restrictive medical brace – to represent the constraints of biometric data, and built a living installation where a hydroponic system was actively controlled by the artist’s own fertility tracking data. The article highlights the value of this high-risk, experimental approach, offering practical guidance for interdisciplinary team-building, goal-setting, and ongoing evaluation. In this way, they show how an artistic craft can function as a powerful method of public legal education, one aimed at not just raising awareness but inspiring tangible changes in digital literacy and civic action.
Together, these contributions reveal the often-hidden ‘internal’ craft of collaboration. They demonstrate that the success of interdisciplinary research hinges not just on soundly integrated methodology, but on the practical, relational, and even experimental work of building and sustaining the human dynamics of the team itself.
2c. Bridging research and public impact
Finally, for public interest research to achieve its goal, it must travel beyond the university. Yet this “last mile” – translating complex findings for policymakers, regulators, and the public – is perhaps the most challenging and least understood aspect of the research lifecycle, full of its own distinct practical and political hurdles. This cluster addresses the craft of intersectoral partnership and the strategies required to create meaningful policy impact – cutting across topics such as leveraging civil society for research and policy impact (Dibb et al., 2025), leveraging design, HCI, and computer science to assist law enforcement and regulatory compliance efforts (Gunawan et al., 2025), and furnishing effective knowledge translation between academics and policymakers in the field of AI regulation (Oduro et al., 2025).
Showcasing such strategic innovation, Dibb, Ball, and Degli-Esposti (2025) detail how a tool borrowed from policymaking – the citizen summit – was transformed into a robust “citizen summit method” (CSM) to help guide research into digital surveillance technologies. For these authors, interdisciplinarity – here not between engineering and the social sciences, but within and across the social sciences themselves – proved central to the CSM’s redesign, which integrated theoretical constructs into a model running through the method’s different elements. The result is a powerful example of how to creatively adapt existing practices to generate in-depth and nuanced evidentiary baselines on complex topics like citizens’ views on digital surveillance, privacy, and security. This is not merely an academic contribution; it is a demonstration of how to generate the kind of evidence essential for advancing publicly-grounded technology policy.
Moving from citizen participation to criminal and regulatory administrative engagements, Gunawan, Gray, Santos, and Bielova (2025) use their case study of the pressing issue of “dark patterns” to directly examine the gap between academic research and regulatory enforcement. Spanning design, HCI, computer science, and law, their team analyses how scientific research methods and evidence types may influence the growing body of regulatory actions against manipulative design. They reflect on how scholarly evidence can be better leveraged to support regulators, offering recommendations for strengthening the crucial – and often underdeveloped – collaborative relationship between these two communities.
And finally, as concerns around AI produce a proliferating range of forums and advisory groups, Oduro, Marwick, Johnson, and Meyer (2025) provide timely reflections from the inside of such efforts. Detailing their professional experiences with the Public Technology Leadership Collaborative and the National Institute of Standards and Technology’s (NIST) US AI Safety Institute Consortium, the authors provide vital commentary on how interdisciplinary research and expertise ‘translates’ within conversations with policymakers. They argue that the success of such efforts is not merely based on the quality of our research findings. Instead, it depends on the quality of relationship-building, trust, and taking advantage of strategic opportunities for enmeshing researchers with policymakers. Their paper is an important reminder that data and research do not simply speak for themselves but require an explicit strategy of ‘translation’ for decision-makers who are driving cutting-edge policymaking.
This final cluster explicitly addresses the vital movement from the interdisciplinary to the intersectoral, focusing on the 'external' craft of achieving public impact. What these articles highlight so well is that the craft of inquiry does not end at publication but extends into the deliberate, strategic work of translating findings and building relationships with citizens, regulators, and policymakers to create meaningful change.
Conclusion and future directions
Drawing these diverse contributions together, a set of common themes emerges. First, the collection highlights the necessity of methodological reflexivity, an idea grounded in the understanding that research methods are not passive tools but active, world-making forces. As sociologists Law and Urry (2004) have powerfully argued, methods help to enact social realities rather than simply describing them. Acknowledging this performative capacity is central to the craft of inquiry. It forces us to recognise that the choices we make – in avoiding the “kitchen sink” (Aljizawi et al., 2025) or in designing speculative futures (Altshuler et al., 2025) – are not just technical decisions. They are interventions that actively shape the boundaries of a wicked problem and, in turn, delimit the imaginative space for its potential solutions.
Second, the contributions reveal interdisciplinarity as a site of productive friction and political negotiation. The “relational work” this requires is far more than team management; it is the difficult epistemic labour of bridging deep-seated epistemological and normative divides. Here, the work of Star and Griesemer (1989) is particularly insightful. They show that collaboration across different “social worlds” is often made possible by boundary objects – shared concepts, standards, or artifacts that allow for cooperation without consensus.
The articles in this issue can be seen as a study in the crafting of such objects, from legal GDPR requirements translated for software engineers (Siclari et al., 2025) to scientific evidence leveraged for regulatory enforcement (Gunawan et al., 2025). However, this process is fundamentally political. Boundary objects are not neutral meeting grounds; they are sites of contestation where different disciplinary values and priorities are negotiated. The way an object like “privacy” as a set of knowledges and policy apparatuses is defined and operationalised in a collaborative project will inevitably encode and privilege certain worldviews – be they legal, technical, or social – while marginalising others (Heemsbergen and Molnar, 2020). This process has profound implications, framing the political subject at the heart of the issue: does ‘human-centred’ refer to a consumer to be protected, a human rights holder to be empowered, or an activist to be shielded? This craft of negotiation, therefore, is where the political stakes of interdisciplinary work truly lie, actively shaping whose knowledge counts and which futures are made possible.
These insights bring the overarching normative stake of our collaborative craft into sharp relief. If our methods help to enact social realities (Law & Urry, 2004) and our collaborations are sites of epistemological, normative, and – whether we explicitly acknowledge it or not – political negotiation (Star & Griesemer, 1989), then the central question becomes: to what end are we directing this power? This is, again, where Fraser’s (1995) conceptual heuristic is a helpful diagnostic tool. By making the distinction between affirmative and transformative goals more explicit in our work, we can foster a more honest and potent form of inquiry, one that is clear-eyed about whether it aims for mitigation, reform, or systemic change. An affirmative focus on debiasing an algorithm, for instance, can obscure the transformative questions about the colossal energy consumption of the data centres that train it, the extractive supply chains that build it, and the carbon-intensive future it helps to lock in.
The craft of interdisciplinarity is an ongoing negotiation between affirmative remedies that mitigate harms within existing systems and transformative ones that seek to restructure them. By making this distinction explicit, we can move from a general acknowledgment that inquiry is never neutral to a more precise and politically potent interrogation of our work, clarifying whether our interventions are ultimately aimed at reforms that leave perverse incentives undisturbed or at redrawing the underlying structures that give rise to those incentives in the first instance.
Looking forward, this raises a crucial set of questions for the ‘tech research’ community writ large. While we have focused on the craft of interdisciplinary research, this collection has centred primarily on academic, government, and civil society partnerships. We acknowledge that this same interdisciplinary ‘craft’ is being actively honed in other vital sectors, such as in investigative data journalism, where technical, legal, and media skills are powerfully combined to hold power to account (see The Markup, n.d.; 404 Media, n.d.). More work is needed on the institutional conditions that enable all of this challenging work. How can universities and funding agencies better support such efforts, particularly in scholarly contexts where longitudinal studies offer vital knowledge? If we were to build coordinated research infrastructure to support systematic interdisciplinary research, what would it look like? Such infrastructure might move beyond ad-hoc project grants to establish dedicated translational hubs that institutionalise relational labour across disciplines and create novel institutional pathways that reward the slow, high-risk craft of deep integration. Sustaining this work also requires cultivating an information environment that is itself a shared sociotechnical accomplishment – one that makes the infrastructural conditions necessary for collaboration more durable over time.
And as we move forward with these questions, we must also consider the broader global context in which our interdisciplinary work unfolds. The current era – marked by rising authoritarianism, widening socio-economic inequality, and accelerating environmental crisis – shapes the institutional priorities, funding landscapes, and normative horizons of research itself. These global conditions are not external to the craft of inquiry; they clarify its urgency and its stakes. Even when research aspires to be purely technical or descriptive, it participates in the making and maintenance of particular social worlds – structuring which problems are seen as urgent, which publics are addressed, and which futures are imagined. Recognising this is not about prescribing a political stance, but about acknowledging how interdisciplinary scholarship, whether we like it or not, is irrevocably implicated in the normative and material orders it helps to sustain. In this landscape, the reflexive, interdisciplinary practices explored in this issue are not merely academic exercises but vital resources for orienting our collective inquiry toward the defence of public interest values and sustainable futures.
Confronting the wicked problems of our digital world requires more than novel findings; it demands a deeper reflexivity about the craft of our inquiry. The articles in this special issue have sought to foreground this craft, moving collaboration and method from the background to the centre of analysis. By bringing these diverse contributions together, this special issue demonstrates that the stakes of this interdisciplinary engagement are not merely methodological, but profoundly normative. From our own perspective, a central task in this endeavour is not simply to combine perspectives, but to interrogate whether the resulting interventions reinforce existing arrangements of power or contribute to their transformation. In this sense, the craft of interdisciplinarity is itself a normative practice, requiring us to ask not only how we conduct our research, but to what end.
References
404 Media. (n.d.). 404 Media. https://www.404media.co/
Aljizawi, N., Baran, S., Lawford, N., & Lim, G. (2025). Avoiding the kitchen sink: A guide to mixed methods approaches within digital rights governance. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2044
Altshuler, T., Aridor Hershkowitz, R., Mikulinsky, R., & Muller, B. (2025). Governing phygital spaces: Human rights by design meets speculative design. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2048
Bailey, J., Steeves, V., Burkell, J., Shade, L. R., Ruparelia, R., & Regan, P. (2019). Getting at equality: Research methods informed by the lessons of intersectionality. International Journal of Qualitative Methods, 18, 1609406919846753. https://doi.org/10.1177/1609406919846753
Bennett Moses, L. (2013). How to think about law, regulation and technology: Problems with ‘technology’ as a regulatory target. Law, Innovation and Technology, 5(1), 1–20. https://doi.org/10.5235/17579961.5.1.1
Dibb, S., Ball, K., & Degli Esposti, S. (2025). Developing the citizen summit method: Understanding citizens’ views on digital surveillance technologies. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2045
First Nations Indigenous Governance Centre. (2024). A First Nations guide to the Privacy Act. https://fnigc.ca/wp-content/uploads/2024/08/FNIGC_FN_Guide_Privacy_Act_EN-1.pdf
Fraser, N. (1995). From redistribution to recognition? Dilemmas of justice in a “post-socialist” age. New Left Review, 212, 68–93.
Fullenwieder, L., & Molnar, A. (2018). Settler governance and privacy: Canada’s Indian residential school settlement agreement and the mediation of state-based violence. International Journal of Communication, 12, 1332–1349. http://ijoc.org/index.php/ijoc/article/view/7042
Goldenfein, J., & Mann, M. (2023). Tech money in civil society: Whose interests do digital rights organisations represent? Cultural Studies, 37(1), 88–122. https://doi.org/10.1080/09502386.2022.2042582
Gunawan, J., Gray, C. M., Santos, C., & Bielova, N. (2025). Leveraging interdisciplinary methods for evidence collection in enforcement: Dark patterns as a case study. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2047
Harkin, D., & Molnar, A. (2021). Operating-system design and its implications for victims of family violence: The comparative threat of smart phone spyware for Android versus iPhone users. Violence Against Women, 27(6–7), 851–875. https://doi.org/10.1177/1077801220923731
Heemsbergen, L., & Molnar, A. (2020). VPNs as boundary objects of the internet: (Mis)trust in the translation(s). Internet Policy Review, 9(4). https://doi.org/10.14763/2020.4.1513
Lasswell, H. D. (2018). Politics: Who gets what, when, how. Pickle Partners Publishing.
Law, J., & Urry, J. (2004). Enacting the social. Economy and Society, 33(3), 390–410. https://doi.org/10.1080/0308514042000225716
Longiaru, M., Negrón, W., Chen, B. J., Nguyen, A., Patel, S. N., & Calacci, D. (2025). The “privacy” trap: How “privacy-preserving AI techniques” mask the new worker surveillance and datafication [Policy Brief]. Data & Society. https://datasociety.net/library/the-privacy-trap
McGregor, S. E., & Thompson, D. E. (2025). Calibrating collaboration: Interdisciplinarity in security research. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2046
Molnar, A., & Harkin, D. (2019). The consumer spyware industry: An Australian-based analysis of the threats of consumer spyware. Australian Communications Consumer Action Network.
Morozov, E. (2013). To save everything, click here: The folly of technological solutionism. PublicAffairs.
Oduro, S., Marwick, A. E., Johnson, C., & Meyer, E. (2025). Troubling translation: Sociotechnical research in AI policy and governance. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2043
Parsons, C., Molnar, A., Dalek, J., Knockel, J., Kenyon, M., Haselton, B., Khoo, C., & Deibert, R. (2019). The predator in your pocket: A multidisciplinary assessment of the stalkerware application industry (Citizen Lab Research Report No. 119). The Citizen Lab. https://citizenlab.ca/docs/stalkerware-holistic.pdf
Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155–169.
Roessler, B., & Mokrosinska, D. (Eds.). (2015). Social dimensions of privacy: Interdisciplinary perspectives. Cambridge University Press.
Royal-Dawson, L., Nolan, K., McNamee, E., O’Connor, L., Campbell, E., Pathé-Smith, A., Boyd, K., & Philpott, D. (2025). Colliding ideas: Artistic explorations of data surveillance and data protection. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2049
Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society, 6(1), 2053951718820549. https://doi.org/10.1177/2053951718820549
Siclari, M., Lannier, S., Voordeckers, O., Tosza, S., Abualhaija, S., Ceci, M., Sannier, N., & Bianculli, D. (2025). Beyond silos: Bridging the gap between law and software engineering – Challenges, successes, and lesson drawing. Internet Policy Review, 14(4). https://doi.org/10.14763/2025.4.2042
Star, S. L., & Griesemer, J. R. (1989). Institutional ecology, ‘translations’ and boundary objects: Amateurs and professionals in Berkeley’s museum of vertebrate zoology, 1907–39. Social Studies of Science, 19(3), 387–420. https://doi.org/10.1177/030631289019003001
The Markup. (n.d.). The Markup. https://themarkup.org/
Thompson, D. E., & Molnar, A. (2023). Workplace surveillance in Canada: A survey on the adoption and use of employee monitoring applications. Canadian Review of Sociology/Revue Canadienne de Sociologie, 60(4), 801–819.
Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426. https://doi.org/10.2307/3109916