Calibrating collaboration: Interdisciplinarity in security research
Abstract
Interdisciplinary collaboration and mixed-methods research are gaining recognition as necessary approaches for advancing security and privacy research and are increasingly encouraged by publishing venues and funding bodies. Yet the execution of interdisciplinary partnerships and research presents a number of fundamental challenges that have implications for its success, impact, and sustainability. Drawing on literature around interdisciplinary science research as well as original research interviews with experts outside of computing who have successfully partnered with security and privacy researchers, this paper examines and expands upon four themes unique to interdisciplinary research teams: 1) the need to identify, elaborate, and document a shared vocabulary; 2) the need to unpack and reconcile the epistemological assumptions and methodological requirements within the participating fields; 3) the importance of naming and balancing the distinct ‘currencies’ – that is, the outputs that are allocated value – of different disciplines and communities; and 4) differences in resource expectations, availability, and allocation. Through a review of the literature and interviews with practitioners, this research aims to bring visibility and concreteness to the practical challenges of working in cross-disciplinary research teams and to identify the approaches and supports that can help increase the sustainability and impact of these efforts.
This paper is part of The craft of interdisciplinary research and methods in public interest cybersecurity, privacy, and digital rights governance, a special issue of Internet Policy Review, guest-edited by Adam Molnar, Diarmaid Harkin, and Urs Hengartner.
Introduction
Interdisciplinary research is increasingly recognised as an important mechanism for advancing cybersecurity and privacy research and encouraged by publishing venues and funding bodies (Leigh & Brown, 2021; Marcolin & Saunders, 2015; Payne et al., 2021). While interdisciplinary research can lead to innovative and creative solutions to complex, intersectoral challenges (Jacob, 2015; Nissani, 1997), building successful interdisciplinary partnerships and research projects presents a number of fundamental challenges that are often unaddressed by traditional academic and professional training (McKee et al., 2021; Gewin, 2014), and such collaborations remain far from the norm (Outhit, 2025).
In this paper, we seek to outline the unique challenges to interdisciplinary cybersecurity and privacy research, while offering recommendations to help practitioners and institutions bring the full range of disciplines doing this type of research into deeper conversation with one another. By exploring how interdisciplinary cybersecurity and privacy research challenges have manifested for – and been addressed by – researchers who specifically come from outside the domains of engineering and computer science, this work aims to increase the robustness of future interdisciplinary security and privacy research and collaboration. Drawing on our expert interviews, we identify four common themes related to interdisciplinary cybersecurity and privacy research, which we also contextualise within the literature on interdisciplinary research more broadly. Specifically:
- the need to identify, elaborate, and document a shared vocabulary within interdisciplinary cybersecurity and privacy research teams,
- the need to unpack and reconcile the epistemological assumptions and methodological requirements within the participating fields,
- the importance of both articulating and balancing the distinct “currencies”1 – that is, outputs that are already allocated discipline-specific value – in order to make interdisciplinary work sustainable, and
- differences in resource expectations, availability, and allocation within interdisciplinary cybersecurity and privacy research teams that can both create opportunities as well as generate challenges.
Finally, we conclude with a discussion of recommendations for building and sustaining effective interdisciplinary partnerships, as well as the limitations of our exploratory study and areas for future research.
Background and motivation
Until the advent of the commercial internet, computer security and privacy were a concern largely reserved for governments and private companies. Today, however, the ubiquity of networked computing technologies has made privacy and cybersecurity into real-time issues that individuals must manage almost continuously. While partly a reflection of how pervasive these technologies have become, this fact also reflects the degree to which the “defaults” of networked systems continue to conflict with analog privacy and security expectations, practices, and social norms. Public reaction to once-typical digital data collection practices – such as the harvesting of Facebook users’ personal data by Cambridge Analytica – has also drawn attention to the substantial gaps between public perception and institutional guidelines for ethical research (Schneble et al., 2018). More recently, the fallout from the 2019 release of IBM’s “Diversity in Faces” dataset, in which Flickr photographs were used to train a facial recognition model, illustrated the ongoing challenges of applying legal concepts from one technological era to another (Solon, 2019), as data protection laws are often outpaced by technological development (Brownsword, 2021).
Privacy and security research in computing
While the radical change in the scope and application of computing technologies has led to new areas of focus and practice in computer security and privacy research, the research approaches deployed in these disciplines remain largely inductive and applied. Importantly, we note that there is no single, broadly accepted computing philosophy that addresses how computing technologies should function in the world.
Certainly, achievements like Latanya Sweeney’s (2000) groundbreaking work on digital privacy risks, Whitten & Tygar’s (1999) foundational work on usable security, and Buolamwini & Gebru’s (2018) revelations regarding the fundamental (un)fairness and insecurity of machine learning systems have expanded how the field of computing delineates legitimate security and privacy research. While issues such as usability, privacy, and fairness have largely been “brought into the fold” of computing research through dedicated conferences and symposia (e.g. SOUPS, FAccT) and through incorporation into leading publication venues (e.g., Choksi et al., 2024; Sohrawardi et al., 2024; Kohno et al., 2023; McLeod et al., 2024), security and privacy research in computing remains largely reactive and measurement-focused. Exceptions exist, such as the use of differential privacy in the 2020 US census (Abowd, 2018), but these efforts are rarely uncontroversial (e.g. Kenny et al., 2021). More broadly, the emergent nature of technological systems makes it nearly impossible for researchers in any single discipline or sector to fully anticipate how the integration of computing may create new security concerns or violate the privacy norms of an existing system.
Privacy and security research in the social sciences
In general, the social sciences – such as sociology, criminology, science and technology studies (STS), and surveillance studies – approach privacy and security as socio-technical phenomena that are shaped by technology, institutions, cultures, and everyday practices. These researchers examine how these concepts are constructed and negotiated, and how they shape issues of governance, control, and social inequality. In the past five decades, conceptualisations of privacy as an individual right (Westin, 1970) have shifted towards understandings of privacy as a social and political value (Bennett, 2011; Steeves, 2009).
In the 1980s, surveillance studies drew on Foucault’s (1977) theorising of the panopticon and disciplinary power to develop an understanding of privacy as rooted in social sorting and power dynamics. Gary T. Marx (1988, 1998) was one of the first scholars to theorise surveillance through digital technologies as a form of social control, arguing that these practices constituted a “new surveillance.” By the 1990s, David Lyon (1994, 2001a, 2001b, 2007) began to frame surveillance as a normalised aspect of the “information society”, in which the collection of consumer data, workplace surveillance, and government security infrastructures carried significant implications for privacy and social justice. At the same time, Science and Technology Studies (STS) scholars were examining how information infrastructures are embedded within systems of power and social control. They showed how these systems rely on infrastructural logics of categorisation, visibility, and governance, which in turn reproduce social inequalities and shape access to resources through technical practices (Bowker & Star, 1999).
Norris and Armstrong’s (1999) work on the proliferation of CCTV in the UK highlighted the ways in which surveillance technologies reorder social life and reinforce existing power structures and inequalities. At the intersection of surveillance and critical security studies, scholars like Bigo (2006) have theorised about the ways in which surveillance can be used to restrict the free movement of those deemed to be “risky,” underscoring how negotiations of privacy and security are shaped by broader logics of surveillance, governance, and social control.
More recently, Shoshana Zuboff’s (2019) critical examination of the digital economy articulates how “surveillance capitalism” extracts and commodifies human experience as data, creating a new economic order in which privacy becomes a question of autonomy rather than individual control. Newer technologies like artificial intelligence (AI) encode and reproduce inequalities through bias embedded in their data and algorithmic design (Benjamin, 2019). Research on workplace surveillance examines how employee monitoring applications that extract and synthesise worker data into productivity and risk metrics reshape labor relations and erode worker privacy and autonomy (Thompson & Molnar, 2023; Outhit, 2025).
Social science research has been pivotal in advancing our understandings of privacy and security as socio-technical phenomena that are embedded in institutions, governance, and power relations. Social scientists direct our attention to the societal implications of technological advancements including inequality and democratic participation. Yet, just as the approaches of the social sciences are necessary for understanding how technological systems are constructed and negotiated as a part of social life, an appreciation for the theoretical and practical limitations of computing technologies is essential to understanding what types of tools and solutions are achievable. The work occurring in the social sciences therefore complements the essential work that is also occurring in the computer sciences, and points towards the value of true interdisciplinary collaboration to fully examine the technical, social, ethical, and political dimensions of privacy and security.
The need for interdisciplinary cybersecurity and privacy research
Despite advances within specific disciplines, there remains a substantial need for interdisciplinary cybersecurity and privacy research that truly integrates the concepts, theories, methodologies, and analytical approaches of computer science, engineering, the social sciences, law, and other fields. At minimum, such work can help ensure that new technical “solutions” do not inadvertently create new risks and harms. In our view, however, the potential is much greater: successful interdisciplinary collaboration in cybersecurity and privacy research can highlight the multifaceted nature of security and privacy issues, inspire innovative computational tools, contribute to informed decision making and responsible design, ensure societal relevance, expand appreciation for the true capacities and limits of computing technologies, and produce more effective, rights-supporting policy and education. Given the rapid and ongoing integration of artificial intelligence (AI) technologies into essential systems and services, the need for this type of innovation is only becoming more urgent. As such, understanding how to support its best, most successful forms is, in our view, more important than ever.
Methods
In order to generate both conceptual and practical insights into successful interdisciplinary cybersecurity and privacy research, we conducted semi-structured interviews with a purposively sampled group of participants whose primary affiliation or background was not in a computational field, but who had recently co-authored research papers at high-profile, peer-reviewed computer security and privacy venues (USENIX Security, Symposium on Usable Privacy and Security (SOUPS), Conference on Computer and Communications Security (CCS), Conference on Human Factors in Computing Systems (CHI), Conference on Computer-Supported Cooperative Work and Social Computing (CSCW)). Drawing on aspects of “key informant” research approaches (Pahwa et al., 2023), we interviewed a total of five researcher-practitioners: a legal scholar-litigator, a social scientist at a teaching college, a journalism faculty member at a research university, a civil society-based human rights defense expert, and a UX/UI expert at a security-focused non-profit. Collectively, our participants’ co-authored interdisciplinary cybersecurity and privacy research has been cited 44 times since 2020 across top computer security and privacy venues (USENIX Security, CHI, and IEEE S&P).
To complement and situate the themes constructed through our interview analysis, we conducted a narrative review of the literature to locate articles addressing the process of interdisciplinary collaboration in the fields of cybersecurity and privacy. Literature searches concluded in winter of 2025. A set of keyword searches using terms such as “interdisciplinary”, “multidisciplinary”, “transdisciplinary”, “privacy”, “security”, “collaboration”, and “partnership” were performed on OMNI, the University of Waterloo’s online library catalogue, and Google Scholar. Our search approach revealed “lessons learned” across interdisciplinary teams including those from biology, psychology, economics, communications, environmental studies, engineering, information systems, and sports management. Notably, articles from interdisciplinary teams across the social sciences were not substantially represented in our search results, further underscoring the need for truly interdisciplinary collaborations in this space.
Interview procedures and analysis
All of the study’s recruitment and interview procedures were reviewed and approved by the first author’s Institutional Review Board (IRB), the university organisation that oversees the ethical practice of human subjects research2. Interviews were conducted via Zoom, and only audio was recorded. Initial audio transcriptions were generated using Trint, a third-party online automated transcription service. Quotations were corrected, where needed, during the coding process.
Interview data was coded by the first author using reflexive thematic analysis (Braun & Clarke, 2006; Braun et al., 2023). After conducting the interviews, the first author familiarised herself with the data by listening to the audio of each interview again and making notes, followed by an initial round of coding and theme generation using relevant data extracts. These anonymised extracts and initial themes were then collaboratively reviewed and discussed by both authors, to refine the themes and to bring them into conversation with the terminology and approaches identified through our literature review. The four final themes are outlined and discussed below.
Positionality statement
The first author of this study has extensive experience conducting interdisciplinary computer security and privacy research, and reflections on her own collaborations and practices were foundational to the design of this work. The second author brings significant additional expertise conducting interdisciplinary social science research in the area of workplace surveillance and privacy.
Theme 1: Identifying and building shared vocabularies
It is widely known that different disciplines employ their own specialised vocabularies to discuss the research process, meaning that interdisciplinary teams are especially likely to differ in the ways they “communicate, express themselves, and learn” (Leigh & Brown, 2021, p.425). Finding a common language across fields is often a challenge due to stark disciplinary differences in terminology and language, nature and concepts of research, and even assumptions about how the research process is to unfold (Leigh & Brown, 2021). These challenges may take the form of overloaded terms (i.e., the same word has different, well-defined meanings in different fields), multiple meanings (i.e., the same term has multiple understandings and interpretations within and across fields), and disparate connotations (i.e., a term or concept has a precise meaning in one field and is a general use/non-specialty term in another).
Existing interdisciplinary research has underscored the need for the development of a “shared knowledge base that provides a common conceptual language” to bridge domain gaps across interdisciplinary team members (Cooper et al., 2023, p.1). The development of a shared vocabulary is thought to foster a common understanding, reduce confusion, and help integrate data, knowledge, tools, and theories across disciplines (Liang et al., 2021). A recent interdisciplinary workshop on Generative AI and Law, for example, illustrated vastly different understandings of what constitutes “differential privacy” in computer science (i.e., a mathematically provable guarantee about the inability to identify an individual’s data) and law (i.e., a holistic approach that encompasses a range of privacy interests) (Cooper et al., 2023). Building a shared vocabulary is therefore crucial for the success of interdisciplinary teams; however, constructing this shared language can be challenging due to the time and emotional work it requires (Gibson et al., 2019), and is often only developed after several conversations and numerous ‘false starts’ (Cooper et al., 2023). While ideally this process would occur at the outset of every research project during a phase in which each member learns about the other disciplines, the high-pressure nature of academia – in which staying up to date even in one’s own area of expertise can prove challenging – makes allocating this time and effort especially difficult (Baveye, 2013).
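To make concrete just how precise the computer science usage is, consider the standard formal statement of $\varepsilon$-differential privacy (a sketch we include for readers outside computing, following the formulation commonly credited to Dwork and colleagues, rather than any definition given in the workshop report): a randomised mechanism $M$ satisfies $\varepsilon$-differential privacy if, for every pair of datasets $D$ and $D'$ differing in a single individual’s record, and for every set of possible outputs $S$,

$$\Pr[M(D) \in S] \leq e^{\varepsilon} \cdot \Pr[M(D') \in S].$$

The legal usage, by contrast, does not reduce to any single inequality – a gap in kind, not merely in terminology, that a shared vocabulary must surface before it can be bridged.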
For our interview participants, discipline-specific vocabularies presented unique challenges to collaboration with computer security and privacy experts. In some cases, our participants’ observations related quite literally to vocabulary, such as word choice or interpretation. At other times, participants described how vocabulary choices embodied assumptions about priorities and requirements that had to be made explicit for their interdisciplinary partnerships to succeed.
Challenges. Some vocabulary challenges mentioned by research participants were just that: differences in word meanings that were either not understandable to, or not aligned with the expectations of, their collaborators. A legal scholar commented on their experiences collaborating with a security expert:
“One of the big challenges is just knowing how to talk to people outside your discipline when…you don't have a translator. We don't always know the right questions to ask; we don't know the right language to use. We don't always know what's possible on the other side.” (P2)
As a result, the particular collaboration was both invaluable and incredibly challenging: “It was an arduous process…He spoke the language of technology, and it was a slog to try to help him express his conclusions in a way that a court would understand” (P2). A civil society expert, meanwhile, recalled an experience early in her career when a colleague highlighted the specific terms (not) used by a community she was working with: “[He] just sat me down one day and was like, Hey, let me explain a few things, right?...Like, Don’t use ‘cyber’; people get very weirded out by that” (P4). Though these might appear to be minor differences, they can still lead to lost, or nearly-missed, opportunities.
Knowledge gaps often extended beyond solely vocabulary differences, with a user experience (UX) expert acknowledging misunderstandings related to “unspoken requirements”: “There are sometimes unspoken requirements… We all think that there are things that are obvious, but actually I often understand that it's really not obvious for my colleagues” (P5).
The civil society expert we interviewed noted that her hiring process with a security- and privacy-focused organisation was nearly derailed by vocabulary differences: “[A hiring manager] even told me directly... ‘When we were interviewing people, most of the hiring panel thought that you weren't “qualified” because on paper I wasn’t using the language that those folks were looking for’” (P4).
During her own successful tenure with that organisation, she did a similar service for an African colleague who was applying for a fellowship through a US organisation:
“He did not know how to use the word ‘I’. So everything was ‘We did this.’ And I’m like, ‘No, this is not how it works, you’re dealing with a different culture.’…It was a big learning thing because…certain cultures are more collective. And that clashes a lot because the security field is mostly very [individualistic].” (P4)
Because vocabulary choices can reflect both disciplinary and sociopolitical culture, failure to recognise these differences also has the potential to lead to miscommunication and conflict.
Strategies. Although research on how to overcome these challenges is somewhat limited, two approaches emerged from our interviews and review of the literature. First, all members of interdisciplinary research teams must be allocated the time and space to reflect on and communicate about disciplinary differences (Leigh & Brown, 2021; Liang et al., 2021). This may occur through regularly scheduled meetings, but should continue throughout the entirety of the research process (Leigh & Brown, 2021). Second, a “cross-disciplinary translator” or “liaison” who is familiar with the content and methodologies of all the disciplines involved can also facilitate collaboration (Manata et al., 2024), as noted by one of our interview participants. These efforts can also support early clarification of the epistemologies and “currencies” of each contributing discipline, such as plans for data use or manuscript contributions (Liang et al., 2021).
Many of our interview participants described strategies to identify and resolve interpretive gaps on interdisciplinary teams. Though asking questions was often key, even this had to be approached carefully. For example, the UX expert, who worked with security-focused developers, said: “I would say my strategy was to ask if we could be more efficient…I would say, ‘Can we make this shorter?’” (P5) Although this approach appears straightforward, the participant was careful to frame questions in terms salient to her engineering collaborators (“efficient”) and in ways that acknowledged the importance of security in the overall project: “I make it very clear when I ask a question that I am not trying to interfere with their scope [as security specialists]... I would ask questions in a way that they can always say no to” (P5).
Other participants described more structured approaches to building shared understandings. For example, one funder that had assembled a group of interdisciplinary teams provided training in “design thinking”:
“[We] were all trained together in design thinking, and I think that was so helpful…[so that] you’re not coming from it with, ‘Well, in computer science, I would do this,’ right? It’s ‘Here's the problem.’” (P1)
Discussion. In line with the existing literature (e.g. Baveye, 2013), our participants found opportunities to build shared vocabularies and approaches independent of either computer science or their own discipline especially valuable. This suggests that having all members of interdisciplinary teams learn novel skills together, on equal footing, can be a powerful safeguard against Baveye’s (2013) concern that interdisciplinary research will devolve into multi- or cross-disciplinary projects, where true integration is reduced.
Likewise, our participants’ observations highlight the ways that language choices reflect and construct both social and disciplinary norms and identities. For example, while “cyber” is commonly used in policy and military circles when discussing digital technology risks and defenses (e.g. “cybersecurity”), academic venues tend to use either terms like “computer security” or more granular language (e.g. “network security”, “coordinated harassment”). Human rights defense groups and other NGOs, meanwhile, may use terms like “digital security” or “information security.” In addition to tacitly signaling membership in or awareness of these communities’ norms and practices, these vocabulary choices arguably have epistemological significance, as we discuss in the next section: “cyber” positions itself in contrast to “analog,” whereas “information security” centers what is being secured rather than the mechanism or technology of attack or defense. As such, while some instances of building shared vocabulary may be more about disambiguation (e.g. agreeing on a definition for the term “anonymous”), many vocabulary differences will indicate deeper differences in perspective and priority, as further elaborated below.
Theme 2: Reconciling epistemological assumptions and methodological requirements
Interdisciplinary research teams hold the potential to produce more creative designs and richer findings than teams from a single discipline (Tobi & Kampen, 2018). At the same time, different disciplines organise themselves according to different philosophical and procedural methods. Law and policy, for example, are principally concerned with how social and democratic values intersect with institutional and social processes, while computing reflects a combination of formal (mathematical) and practical constraints. Even within disciplines, meanwhile, conflicting understandings of what constitutes “science” can pose a substantial challenge, as experts tend to conflate their own discipline with science while deeming others less scientific (Kovacic & Marcos-Valls, 2023). Even when researchers acknowledge the legitimacy of other disciplines, practical concerns such as differences in language (Kovacic & Marcos-Valls, 2023; Turner et al., 2015), epistemology, and ontology are still common (Turner et al., 2015).
In their work on intersectoral partnerships, both Cederbladh et al. (2024) and Marcolin and Saunders (2015) suggest discussing epistemological and ontological approaches – including problem formation, topics, methods, theoretical frameworks, and levels of analyses – early in the research design process. Partnerships between academia and industry/civil society can reveal differing approaches to problem formation, methodologies, and valued results (Cederbladh et al., 2024; Gersdorf et al., 2019), as shown in Schensul et al. (2006), where some team members placed more value on outcomes (i.e., helping residents) and others emphasised the methodological process (i.e., validity and consistency of instruments). These types of epistemological and ontological tensions can also contribute to the longer time frames of many interdisciplinary research projects (Vantard et al., 2023).
For interdisciplinary teams, attempting to integrate the constructivist approaches of the social sciences (i.e., reality is subjective/constructed by individuals and can be best understood through qualitative research methods) and the positivist approaches of the physical sciences (i.e., reality is objective and can be observed/predicted through quantitative research methods) can be especially challenging (Turner et al., 2015). As noted above, some of the vocabulary differences noted by our interview participants signaled more than the need for a simple “translation,” and instead reflected epistemological perspectives that influence attitudes not only about what can be known, but also how new knowledge can legitimately be produced. These epistemological differences between disciplines can be especially challenging to reconcile, since this often means acknowledging – and compromising with – a different set of normative values.
Challenges. In general, our interview participants described two distinct sets of challenges with respect to epistemological and normative gaps between their own field and that of computer security and privacy. For participants collaborating with security and privacy experts on applied technologies, challenges often centered around negotiating what factors should be considered in “scope” as part of system security. As one participant put it:
“The big risk is that if I design a highly secure platform: Sure it’s highly secure. But if it’s a pain to use it, then users won’t use it…And then there is a bigger risk in terms of security.” (P5)
In her “bargaining” with security experts, this participant’s goal was to contextualise the security of the system being designed within a broader landscape of communication options – options that their intended users might turn to if their own system proved insufficiently usable. For another participant, context was not just about other “security” systems, but broader systems of use and experience: “If tools or strategies are not …[aligned with] the context, they’re not effective. And that …could be the technical context. It could be the cultural context, it could be UX…[Considering context] really is a question of efficiency” (P4).
For participants who were collaborating with security and privacy partners on more conceptual work, challenges arose around not just “what” was knowable, but how. As a legal specialist put it:
“What precisely is it that [computer security and privacy experts] can demonstrate or establish through their research?...The confidence level that you might need for evidence in law might be different than the confidence level that might be obtainable in some other field.” (P2)
For a journalism researcher, epistemological challenges sometimes presented themselves not as knowledge gaps, but as presumed areas of knowledge overlap: “Everyone thinks they know how journalism should work…And I don’t think they understand the research of journalism at all” (P1). She went on:
“I had a profound moment in [an interdisciplinary course] where the computer science students…thought writing a lead was bias[ed]…Sometimes people feel like, here’s the data. It speaks for itself, right? [But] that doesn’t work.” (P1).
Strategies. For our participants, reconciling epistemological differences always involved some degree of reframing or compromise from both disciplinary perspectives, and so usually centered on communicative approaches rather than lines of argumentation. For the UX expert, there was particular value in visuals: “Using prototypes is very useful for me because it’s not ideological.” (P5)
Just as importantly, for this participant – and others who focused on applied security and privacy solutions – there was a recognised “higher authority” than either discipline: end users themselves. The UX designer explained: “We [researchers] have diverse backgrounds and objectives, but the main goal is to make great platforms [for users], secure ones.” (P5)
Discussion. Tobi and Kampen (2018) offer a framework for addressing the technical design of interdisciplinary projects, including questions related to the study and instruments, sampling plan, and analysis, that can be used to help interdisciplinary teams address epistemological and ontological differences early in the collaboration process. At the same time, our participants’ comments reflect that workable interdisciplinary partnerships often involve crafting approaches that embed and acknowledge the compromises required. For example, our human rights expert described the need for context as an issue of “efficiency,” deliberately using this term to emphasise that creating contextually-relevant tools was aligned with the values of the computing community.
For the journalism researcher, the untempered positivism of the computer science students was especially jarring. While journalism and communication research assume the interpretative nature of meaning-making, the computer science students’ charge of “bias” reflects a computing perspective that often treats data as “facts” with inherent meaning, rather than as interpretive artifacts. While this latter perspective accurately reflects the binary requirements of computing systems, it leaves those requirements, and the interpretive choices behind them, unacknowledged. Such conceptual differences go far beyond vocabulary (though ‘bias’ is often regarded statistically, rather than normatively, in computing), and instead constitute the disciplinary assumptions that must be identified and unpacked for interdisciplinary research to proceed.
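To illustrate the parenthetical above for readers outside computing: in statistics and machine learning, the bias of an estimator $\hat{\theta}$ of a parameter $\theta$ is the purely technical quantity

$$\operatorname{Bias}(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta,$$

carrying no normative judgement whatsoever (we include this standard textbook definition as an illustration; it was not raised by our participants). A student schooled in this usage may thus hear “bias” as a charge of measurement error, rather than as an acknowledgement that all accounts – including data – involve interpretation.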
Theme 3: Exchanging and balancing “currencies” in the interdisciplinary research process
The nature and value of the “currencies” associated with certain research efforts – that is, the professional value attributed to certain types of outputs or impacts – may vary drastically from one discipline to the next. As a result, interdisciplinary teams face unique challenges in balancing their disciplinary timelines, incentive structures, research objectives, and intended outputs (Garousi et al., 2016). For example, some academic research timelines may be incompatible with industry and civil society constraints (Towfighi et al., 2020), which often operate on timelines of weeks or months rather than years (Marcolin & Saunders, 2015). Intersectoral incentive structures can also differ. In academia, for example, publication counts and citations are a key output measure for advancement (Marcolin & Saunders, 2015), whereas industry actors may be more interested in profitability, return on investment (ROI), and advancing products and services (Cederbladh et al., 2024; Marcolin & Saunders, 2015). Civil society organisations, meanwhile, may focus on measurable community impacts (e.g., Open Technology Fund), policy, human rights (e.g., 5Rights), and democratic rights advancement (e.g., CIPPIC). Similarly, while academics may see value in publishing negative results, industry researchers may be hesitant to do so (Cederbladh et al., 2024). Even within a sector like academia, differences in what counts as a “creditable contribution” or in the meaning of authorship order can create challenges that single-discipline teams may be less likely to face (e.g., Liang et al., 2021). The value placed on single versus co-authored publications, peer-reviewed conference proceedings, and interdisciplinary journal outlets can also vary by discipline, such that even top-tier publications or competitive research funding awards may not be given weight in academic advancement decisions if the venues and review processes are outside a candidate’s “home” discipline (Liang et al., 2021).
Though the literature largely agrees that the best strategy for balancing the “currencies” of each discipline within interdisciplinary teams is early and continued communication about timelines, incentive structures, and objectives/outputs, the need for tradeoffs is often unavoidable. Regular meetings and open conversations about team disagreements (Gersdorf et al., 2019) can help address publication opportunities, paper authorship order, and target journals, as can the use of tools that help track team contributions and outputs (Liang et al., 2021). Open communication early in the research process – ideally before the project’s research questions are formulated (Marcolin & Saunders, 2015) – may also help to close the gap between an academic focus on scientific contributions and an industry/community focus on applied contexts and solutions (Garousi et al., 2016). Even where intersectoral or interdisciplinary teams are largely aligned in their broad orientation (e.g. industry and civil society organisations’ interest in applied solutions, or academics’ focus on research publications), differences in sectoral and disciplinary constraints and objectives can still lead to gaps and conflicts. For example, industry partners may be governed by legal nondisclosure agreements and shareholder obligations that make rights-preserving approaches infeasible absent regulations that require them. In academia, venues that are high-value in one discipline may be unknown in another, requiring team members to alternate work on what – for them – are professionally lower-value outputs in order to maintain the feasibility of the broader interdisciplinary collaboration. As such, the need for open communication continues throughout the entirety of the research project, as the natural evolution of research projects as well as unforeseen circumstances can change their scope – and their outputs – over time (Cederbladh et al., 2024).
Even when interdisciplinary teams have succeeded in building a shared vision and approach, they still have to operate within the constraints and legitimating priorities set by their organisations, funders, or broader communities of practice. As such, interdisciplinary teams must often both “trade” currencies in terms of their outputs and make the value of their work legible to stakeholders with sometimes disparate or conflicting priorities. All of our interview participants described ways in which they actively negotiated the “currencies” of their own discipline with team members, superiors, and other stakeholders in order to make clear the value of their interdisciplinary collaborations.
Challenges. For some participants, their disciplinary “currency” was publications in certain venues, or obtaining certain types of funding awards; for others, it was winning a legal case or the content of an impact report. In all cases, showcasing the value of their interdisciplinary work to stakeholders within their own discipline had to be balanced with the needs of their security and privacy collaborators. Even for a social scientist in a highly interdisciplinary – but journal-dominated – department, this meant: “I might have to explain…the fact that most of the high profile [computer security] publishing venues are conference proceedings; that's very unique to computer science and computer security” (P3). For another participant, institutional attitudes and challenges often came from both her collaborators’ department and her own:
“When I was getting started, I felt like computer science…[viewed interdisciplinary work] as a hobby…they didn’t see the value in it.” (P1)
At the same time, funding also posed a challenge:
“The flip side is that I come from [journalism] where they don’t understand grants. And so the fact that I got an NSF grant with a team…it was like I didn’t do anything.” (P1)
In this case, being selected for a US National Science Foundation (NSF) grant – which is awarded through a highly competitive, peer-reviewed process – represented a literal and figurative currency that was high-value for this interdisciplinary team’s computer science members, but low-value on the journalism side.
Strategies. For most of our participants, managing disparate incentive structures depended on all parties being up front about their needs, and then working to ensure that everyone’s obligations were met. For the social scientist, publication venues were chosen “based on who’s the audience and…who would most benefit from this information?” (P3) – much as epistemological differences were resolved by appealing to the needs of a particular community or set of users. These challenges can also be addressed through real-time, micro-level negotiations, as described by the UX expert: “[Collaboratively designing/programming] is also a moment where the roles can be a little bit interchanged. I can ask questions about the code. [Security experts] can challenge my design” (P5).
Discussion. Managing and exchanging the “currencies” of the various disciplines – or sectors – represented on an interdisciplinary research team typically takes place at multiple levels. Within the team, many of these challenges may be manageable through early and comprehensive communication, which can reveal, in particular, differences in the types of outputs valued by team members from distinct disciplines or sectors. Where team members have the influence and authority to determine what “counts,” the open communication described in the literature and more granular approaches to negotiating priorities may be sufficient.
For many teams, however, determining the value of a particular “currency” is beyond their control, forcing interdisciplinary teams to either cycle through producing outputs within each of their disciplines, or to advocate for specific interdisciplinary work to be valued in their domain. The latter approach, while in some ways ideal, may be strategically difficult – especially in academia, where specialisation is an important factor in determining compensation (Leahey, 2007). As such, persuading faculty, funders, or other stakeholders of the value of interdisciplinary work may be particularly challenging for junior faculty and researchers (Nelson, 2011; Gewin, 2014; Park et al., 2023).
Theme 4: Resource expectations, availability, and allocation
Across our interviews, differences in the expectations, availability, and constraints surrounding certain kinds of research resources surfaced repeatedly. While working with computing researchers sometimes brought novel and valuable resources, it could also create new challenges.
Funding asymmetries. External funding from government, corporate, and private sources is often a sustaining source of support for university-based computer security and privacy researchers, which may contrast starkly with the practices and expectations of other departments. Our social scientist participant put it simply: “I don’t have pressure for external funding” (P3). While we acknowledge that the freedom from funding pressures she experienced was likely because she was based at a teaching institution, and is not representative of the social sciences as a whole, it meant that she could pursue intellectual partnerships without regard to funding for herself or her collaborators. For our journalism participant, however, the mismatched funding expectations between her department and a potential computational collaborator’s were enough to derail an otherwise promising partnership: “In my field, $1,000 is amazing. In computer science, that wasn’t enough for his department chair or his dean…And so that [partnership] kind of went sideways” (P1).
In non-academic settings, funding asymmetries can pose a different kind of challenge due to uneven access to resources. As our civil society expert described it:
“[Many] organisations are not understanding that…if your organisation has financial health…that doesn’t mean that the rest of the people that you’re trying to target are [also doing well].” (P4)
In this instance, our civil society participant noted that potential collaborators at well-resourced non-governmental organisations (NGOs) often overlooked the material circumstances of the communities they were meant to be partnering with, at times designing events that presumed access to credit cards or ready cash that partners from the Global South, for example, might not have.
Students as assets. Despite the challenges posed by computing’s higher funding expectations, collaborating with academic computer security and privacy researchers sometimes gave our participants access to human assets that were less common in their own departments: students. Whether they contributed directly to the research through data gathering, analysis, and publication preparation, or went on to become interdisciplinary scholars themselves, working with students was an unanticipated benefit for multiple participants. As our college-based social scientist put it: “I’ve really come to enjoy working with grad students and postdocs…[they] are a really good group of people to be including or considering as [a] part of interdisciplinary teams” (P3). For the legal expert:
“You just get a broader set of people thinking about issues that you care about…We’ve had a lot of students who over the years have come back and said, ‘I went down this path after taking that [interdisciplinary] class.’” (P2)
In short, for participants whose role did not usually include working with students, the opportunity to connect with students provided both new insights and a distinct form of value, helping to validate the effort they put into their interdisciplinary partnerships.
Access to unique expertise. Though many computing research publications are freely available, our participants articulated that direct collaboration was more valuable for their work than relying on published literature. As a legal expert pointed out:
“It’s very difficult for litigators to rely on published literature just because…you have no idea whether [a given paper represents] the state of the art, or whether in the last two years since it was published, it’s been superseded by something else. That’s even assuming we understand what the paper says, which in a lot of these fields we won’t.” (P2)
Although not necessarily confined to computer security and privacy research, the observation that understanding “the state of the art” in another discipline is difficult without a “translator” (Manata et al., 2024) – especially given the fast-paced, networked, and contextualised nature of contemporary scholarship (Baveye, 2013) – remains important for distilling relevant knowledge and generating new ideas.
Modeling effective teamwork. Several of our participants noted that working alone or with only a single collaborator was the norm in their discipline, and they appreciated the way that their computer security and privacy collaborators assembled and managed teams. As one social scientist said:
“[Computer security and privacy experts have] just got, I think, a better approach to pulling people in…[and] working in a team-based setting.” (P3)
The typically team-based nature of computer security and privacy research meant that our participants’ computing partners often brought with them mature, well-defined systems for communicating and collaborating as a team. While the initial learning curve for these tools could be a burden, the overall reduction in ‘start-up’ effort was broadly viewed by our participants as beneficial.
Increased real-world salience. For some participants, the opportunity to engage in interdisciplinary collaborations allowed for crucial intellectual cross-pollination that further enriched their research and its reach. For example, collaborating with computer security and privacy experts helped some connect their work to concerns outside the academy; this was especially true for participants whose disciplines had been most radically impacted by technological change.
Our journalism participant, for example, said: “I like doing impactful research. I think too many studies in [journalism] are boring and not solving any problems, or we’re not asking good questions. And I think a lot of good questions are interdisciplinary and applied” (P1). For our legal scholar, the underlying validity of their own legal work turned on the accuracy of their technological understanding, which required robust partnership with computer security and privacy experts:
“A lot of the legal and also legal policy questions around new technology – you have to understand the technology. You can’t wish [something] into existence…you have to understand [the] complications if you want to be an effective participant in that conversation.” (P2)
At the same time, partnering with technologists also extended the reach of their legal perspectives: “A really fascinating product of [working closely with a technologist was that] he could bring [legal] values to a totally different audience and maybe have an impact in a context where those values otherwise had no place” (P2). Similar to his experience working with students, this participant found that robust interdisciplinary work was transformative for all parties involved.
Recommendations for interdisciplinary computer security and privacy research and development
For most of our participants, interdisciplinary work represented a career choice nearly as defining as their primary discipline. All of our participants had been part of interdisciplinary collaborations with computer security and privacy experts for several years; four of the five had maintained such partnerships for well over a decade. This suggests that, despite its challenges, interdisciplinary work provided our participants with benefits sufficient to make it worthwhile. Despite their varied backgrounds, a number of themes emerged as they reflected on the qualities that made these partnerships both internally and externally successful and sustainable.
Perhaps unsurprisingly, most of their recommendations revolved around practices for identifying and meeting the needs of all collaborators. As one participant put it: “Listen. And also do your homework…when you go to people, [be] humble enough to say, ‘I’m going to listen. Let me know what you do, what you need.’” (P4). Another participant expressed a similar concept in the form of finding potential collaborators who were genuinely “curious”: “A really big characteristic of all my collaborations is that I work with really curious people…They want to solve practical problems that are really applied” (P1).
Our findings also clarified the need for flexibility around everything from communication platforms and methods to publishing tools, even when these were unfamiliar or frustrating. As the social scientist we spoke with explained:
“I don't understand this, but fine, we’re going to have these really hard deadlines. The paper has to be in on this exact date, at this exact time. I had to learn Overleaf. I hated it, right? These kinds of things I have experience navigating and I complain about all the time to [my CS collaborators], which I think they think is funny.” (P3)
Though it is possible that this participant’s computing collaborators found her research methods equally inexplicable, this comment highlights that sometimes collaborating means accepting another way of doing things. The importance of this type of flexibility was also articulated by another participant, who said: “One of the meta tips I would have is being flexible with modes of communication. Like, ‘LaTeX or Google Docs?’…Figuring out how the team works and how different people choose to communicate, because there’s different norms, is good” (P1). Perhaps most important, however, was finding the space to make a human connection, whether before trying to establish an interdisciplinary partnership, or when already in one. In the end, as one participant put it, “Going to grab a coffee with somebody in the other department is incredibly effective” (P1).
Just as our participants had positive insights to share from their collective decades of working in interdisciplinary computer security and privacy teams, they also shared pitfalls for those from computing disciplines to avoid when seeking collaboration outside their field. One participant described the warning signs that a potential partner is not looking for a true partnership: “[When] people come to you because they want access to a particular network or particular people that you’re bringing together, but it’s all about them. There’s no concern about what your needs are; there’s no transparency” (P4). For computer security and privacy researchers, familiarising themselves with best practices in methodologies like community-based research partnerships (e.g. Minkler, 2005) may be a useful first step.
Finally, we note that organisations and funders can support interdisciplinary teams by sponsoring activities like the design-thinking workshop mentioned by one of our participants. Explicitly articulating how interdisciplinary funding and publications will be viewed and weighed as part of teaching, promotion, tenure, and impact evaluations can also help researchers, in particular, more effectively design their collaborations and allocate their efforts.
Limitations
Although we intentionally designed our participant sample to represent the perspectives of individuals from a variety of disciplines and sectors, the qualitative and exploratory nature of this research means that our findings do not represent all experiences and may not generalise to other populations. Our potential pool of interview participants, in particular, was limited by our goal of specifically learning from individuals who did not hold an appointment in engineering, computer science, or information science, and who had also recently authored papers in competitive computer security and privacy conferences. Restricting our participants to those who did not hold computational appointments, in particular, excluded many individuals who hold joint appointments, or whose doctoral work is in a field other than computing (e.g. philosophy) but who currently hold an appointment in a computing department. Even with a larger sample, of course, there is no way to enumerate all possible challenges that interdisciplinary researchers may encounter, nor to reasonably quantify those which might be most prevalent or significant. Still, by intersecting findings from the related literature with original insights from researchers who have collaborated successfully with computer security and privacy experts at the highest levels of academia and civil society, this exploratory work seeks to add detail and nuance to the challenges these partnerships face, and to provide useful insights to those interested in or already doing this type of work.
Conclusions & future work
While our focus has been on how interdisciplinary teams manage differences in disciplinary vocabularies, epistemologies, currencies, and resources, our findings also illustrate the role that funders and institutions can play in making these collaborations possible.
Because research and development work in computer security and privacy is often quickly commercialised or publicly disseminated, interdisciplinary perspectives on the real-world utility and impact of computational systems are an essential bulwark against such systems unintentionally eroding essential democratic rights and values. As such, conducting high-quality computer security and privacy work increasingly requires robust conceptual and practical understandings of how collaborations can be developed and sustained across both disciplinary and sectoral lines. Our goal with this work has been to identify key thematic areas where challenges tend to emerge for interdisciplinary computer security and privacy research teams, in the hopes that future work may address the development and evaluation of both concrete instruments and theoretical framings that can reduce the severity of those challenges for future collaborators. While far from conclusive, our findings suggest that a range of efforts – from structured reflection exercises for identifying possible vocabulary differences to workshops where teams build shared paradigms (such as the “design thinking” workshop described by one participant) – have the potential to make interdisciplinary computer security and privacy research more successful and sustainable. Likewise, ongoing research that helps identify the organisational characteristics and policies under which interdisciplinary research flourishes could provide a useful model both for other organisations and for interested researchers who are seeking a supportive environment.
While this work highlights that much remains to be done, we nonetheless encourage anyone considering this type of collaboration to take on board a final thought from one of our participants: “If I have any advice, it’s just do it…You just have to have a leap of faith and see how it works” (P2).
References
Baveye, P. C. (2013). Addressing key challenges to interdisciplinary research on water-related issues: Biologists’ engagement and funding structure. Biologia, 68(6), 1087–1088. https://doi.org/10.2478/s11756-013-0280-5
Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity.
Bennett, C. J. (2011). In defense of privacy: The concept and the regime. Surveillance & Society, 8(4), 485–496. https://doi.org/10.24908/ss.v8i4.4184
Bigo, D. (2006). Security, exception, ban and surveillance. In D. Lyon (Ed.), Theorizing surveillance: The panopticon and beyond. Willan Publishing.
Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. MIT Press.
Brayne, S. (2021). Predict and surveil: Data, discretion, and the future of policing. Oxford University Press.
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research 81, 1–15. https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
Cederbladh, J., Eramo, R., Muttillo, V., & Strandberg, P. E. (2024). Experiences and challenges from developing cyber‐physical systems in industry‐academia collaboration. Software: Practice and Experience, 54(6), 1193–1212. https://doi.org/10.1002/spe.3312
Choksi, M. Z., Aubin Le Quéré, M., Lloyd, T., Tao, R., Grimmelmann, J., & Naaman, M. (2024). Under the (neighbor)hood: Hyperlocal surveillance on Nextdoor. Proceedings of the CHI Conference on Human Factors in Computing Systems 2024, 1–22. https://doi.org/10.1145/3613904.3641967
Cooper, A. F., Lee, K., Grimmelmann, J., Ippolito, D., Callison-Burch, C., Choquette-Choo, C. A., Mireshghallah, N., Brundage, M., Mimno, D., Choksi, M. Z., Balkin, J. M., Carlini, N., De Sa, C., Frankle, J., Ganguli, D., Gipson, B., Guadamuz, A., Harris, S. L., Jacobs, A. Z., … Zeide, E. (2023). Report of the 1st workshop on generative AI and law (No. arXiv:2311.06477). arXiv. https://doi.org/10.48550/arXiv.2311.06477
Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Penguin.
Garousi, V., Petersen, K., & Ozkan, B. (2016). Challenges and best practices in industry-academia collaborations in software engineering: A systematic literature review. Information and Software Technology, 79, 106–127. https://doi.org/10.1016/j.infsof.2016.07.006
Gersdorf, T., He, V. F., Schlesinger, A., Koch, G., Ehrismann, D., Widmer, H., & Von Krogh, G. (2019). Demystifying industry–academia collaboration. Nature Reviews Drug Discovery, 18(10), 743–744. https://doi.org/10.1038/d41573-019-00001-2
Gewin, V. (2014). Interdisciplinary research: Break out. Nature, 511(7509), 371–373. https://doi.org/10.1038/nj7509-371a
Gibson, C., Stutchbury, T., Ikutegbe, V., & Michielin, N. (2019). Challenge-led interdisciplinary research in practice: Program design, early career research, and a dialogic approach to building unlikely collaborations. Research Evaluation, 28(1), 51–62. https://doi.org/10.1093/reseval/rvy039
Jacob, J. W. (2015). Interdisciplinary trends in higher education. Palgrave Communications, 1(1), 15001. https://doi.org/10.1057/palcomms.2015.1
Kenny, C. T., Kuriwaki, S., McCartan, C., Rosenman, E. T. R., Simko, T., & Imai, K. (2021). The use of differential privacy for census data and its impact on redistricting: The case of the 2020 US Census. Science Advances, 7(41), eabk3283. https://doi.org/10.1126/sciadv.abk3283
Kerr, I. R., Steeves, V. M., & Lucock, C. (2009). Lessons from the identity trail: Anonymity, privacy and identity in a networked society. Oxford University Press.
Kohno, T., Acar, Y., & Loh, W. (2023). Ethical frameworks and computer security trolley problems: Foundations for conversations. USENIX Security Symposium 2023, 1–41.
Kovacic, Z., & Marcos-Valls, A. (2023). Institutionalising interdisciplinarity in PhD training: Challenging and redefining expertise in problem-oriented research. Environmental Education Research, 29(3), 473–488. https://doi.org/10.1080/13504622.2023.2174252
Leahey, E. (2007). Not by productivity alone: How visibility and specialization contribute to academic earnings. American Sociological Review, 72(4), 533–561. https://doi.org/10.1177/000312240707200403
Leigh, J., & Brown, N. (2021). Researcher experiences in practice-based interdisciplinary research. Research Evaluation, rvab018. https://doi.org/10.1093/reseval/rvab018
Liang, C., Mooney, S., Kurkalova, L., Roper, D. K., & Hashemi-Beni, L. (2021). Best practices and lessons learned in grant writing for ag/applied economists to engage in interdisciplinary studies. Agricultural & Applied Economics Association, 3(2), 1–18.
Lyon, D. (1994). The electronic eye: The rise of surveillance society. University of Minnesota Press.
Lyon, D. (2001). Surveillance society: Monitoring everyday life. Open University Press.
Lyon, D. (2003). Surveillance after September 11. Polity Press in association with Blackwell.
Lyon, D. (2007). Surveillance studies: An overview. Polity.
Manata, B., Bozeman, J., Boynton, K., & Neal, Z. (2024). Interdisciplinary collaborations in academia: Modeling the roles of perceived contextual norms and motivation to collaborate. Communication Studies, 75(1), 40–58. https://doi.org/10.1080/10510974.2023.2263922
Marcolin, B. L., & Saunders, W. C. (2015). A strategic roadmap for navigating academic-industry collaborations in information systems research: Avoiding rigor mortis. ACM SIGMIS Database: The DATABASE for Advances in Information Systems, 46(3), 23–51. https://doi.org/10.1145/2804075.2804078
Marx, G. T. (1988). Undercover: Police surveillance in America. University of California Press.
Marx, G. T. (1998). Ethics for the new surveillance. The Information Society, 14(3), 171–185. https://doi.org/10.1080/019722498128809
McKee, K. E., Serrano, D., Girvan, M., & Marbach-Ad, G. (2021). An integrated model for interdisciplinary graduate education: Computation and mathematics for biological networks. PLOS ONE, 16(9), e0257872. https://doi.org/10.1371/journal.pone.0257872
McLeod, J., Zhang-Kennedy, L., & Stobert, E. (2024). Comparing teacher and creator perspectives on the design of cybersecurity and privacy educational resources. USENIX Symposium on Usable Privacy and Security (SOUPS) 2024, 1–18.
Minkler, M. (2005). Community-based research partnerships: Challenges and opportunities. Journal of Urban Health: Bulletin of the New York Academy of Medicine, 82(Suppl. 2), ii3–ii12. https://doi.org/10.1093/jurban/jti034
Nelson, B. (2011). Interdisciplinary studies: Seeking the right toolkit. Nature, 476(7358), 115–117. https://doi.org/10.1038/nj7358-115a
Nissani, M. (1997). Ten cheers for interdisciplinarity: The case for interdisciplinary knowledge and research. The Social Science Journal, 34(2), 201–216. https://doi.org/10.1016/S0362-3319(97)90051-3
Norris, C., & Armstrong, G. (1999). The maximum surveillance society: The rise of CCTV. Berg.
Outhit, J. (2025, May 30). Do you work from home? University of Waterloo researchers raise red flags over remote surveillance. The Record. https://www.therecord.com/news/waterloo-region/do-you-work-from-home-university-of-waterloo-researchers-raise-red-flags-over-remote-surveillance/article_671e0d9b-e52b-549a-9f00-ddff7b93f1ea.html
Pahwa, M., Cavanagh, A., & Vanstone, M. (2023). Key informants in applied qualitative health research. Qualitative Health Research, 33(14), 1251–1261. https://doi.org/10.1177/10497323231198796
Park, M., Leahey, E., & Funk, R. J. (2023). Papers and patents are becoming less disruptive over time. Nature, 613(7942), 138–144. https://doi.org/10.1038/s41586-022-05543-x
Payne, B. K., He, W., Wang, C., Wittkower, D. E., & Wu, H. (2021). Cybersecurity, technology, and society: Developing an interdisciplinary, open, general education cybersecurity course. Journal of Information Systems Education, 32(2), 134–140.
Schensul, J. J., Robison, J., Reyes, C., Radda, K., Gaztambide, S., & Disch, W. (2006). Building interdisciplinary/intersectoral research partnerships for community-based mental health research with older minority adults. American Journal of Community Psychology, 38(1–2), 23–25. https://doi.org/10.1007/s10464-006-9059-y
Schneble, C. O., Elger, B. S., & Shaw, D. (2018). The Cambridge Analytica affair and internet‐mediated research. EMBO Reports, 19(8), e46579. https://doi.org/10.15252/embr.201846579
Smith, G. J. D. (2004). Behind the screens: Examining constructions of deviance and informal practices among CCTV control room operators in the UK. Surveillance & Society, 2(2/3). https://doi.org/10.24908/ss.v2i2/3.3384
Sohrawardi, S. J., Wu, Y. K., Hickerson, A., & Wright, M. (2024). Dungeons & Deepfakes: Using scenario-based role-play to study journalists’ behavior towards using AI-based verification tools for video content. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3613904.3641973
Solon, O. (2019, March 12). Facial recognition’s ‘dirty little secret’: Millions of online photos. NBC News. https://www.nbcnews.com/tech/internet/facial-recognition-s-dirty-little-secret-millions-online-photos-scraped-n981921
Steeves, V. (2009). Reclaiming the social value of privacy. In I. Kerr, V. M. Steeves, & C. Lucock (Eds.), Lessons from the identity trail: Anonymity, privacy and identity in a networked society. Oxford University Press.
Sweeney, L. (2000). Simple demographics often identify people uniquely (Data Privacy Working Paper 3). Carnegie Mellon University.
Thompson, D. E., & Molnar, A. (2023). Workplace surveillance in Canada: A survey on the adoption and use of employee monitoring applications. Canadian Review of Sociology/Revue Canadienne de Sociologie, 60(4), 801–819. https://doi.org/10.1111/cars.12448
Tobi, H., & Kampen, J. K. (2018). Research design: The methodology for interdisciplinary research framework. Quality & Quantity, 52(3), 1209–1225. https://doi.org/10.1007/s11135-017-0513-8
Towfighi, A., Orechwa, A. Z., Aragón, T. J., Atkins, M., Brown, A. F., Brown, J., Carrasquillo, O., Carson, S., Fleisher, P., Gustafson, E., Herman, D. K., Inkelas, M., Liu, W., Meeker, D., Mehta, T., Miller, D. C., Paul-Brutus, R., Potter, M. B., Ritner, S. S., … Yee, H. F. (2020). Bridging the gap between research, policy, and practice: Lessons learned from academic–public partnerships in the CTSA network. Journal of Clinical and Translational Science, 4(3), 201–208. https://doi.org/10.1017/cts.2020.23
Turner, V. K., Benessaiah, K., Warren, S., & Iwaniec, D. (2015). Essential tensions in interdisciplinary scholarship: Navigating challenges in affect, epistemologies, and structure in environment–society research centers. Higher Education, 70(4), 649–665. https://doi.org/10.1007/s10734-015-9859-9
Vantard, M., Galland, C., & Knoop, M. (2023). Interdisciplinary research: Motivations and challenges for researcher careers. Quantitative Science Studies, 4(3), 711–727. https://doi.org/10.1162/qss_a_00265
Westin, A. F. (1970). Privacy and freedom (1st edn). Atheneum.
Whitten, A., & Tygar, J. D. (1999). Why Johnny can’t encrypt: A usability evaluation of PGP 5.0. Proceedings of the 8th USENIX Security Symposium, 169–184.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Public Affairs.
Footnotes
1. For example, in academic circles peer-reviewed publications may constitute a higher value “currency” in career advancement terms than participation in a high-impact public outreach campaign; for non-profit researchers, the opposite may be true.
2. As our findings will address, reconciling IRB processes is a key challenge in interdisciplinary work, and one that this interdisciplinary team of authors also encountered in the course of this work. As a result, the first author was responsible for recruitment, interviews, and initial coding, while the second author conducted a review of the literature.