Cryptoparties: empowerment in internet security?

Cryptoparties (CPs) are a global movement of forums where citizens can learn how to improve their digital privacy and security. The present paper is one of the few empirical studies on CPs and is based on participant observation of three CPs. I demonstrate that the organisers of CPs strive for an egalitarian space for teaching and learning. Even though this goal is not always achieved, CPs might still serve as an example of citizen education in a technological society where every citizen needs to deal with complex technological issues. In addition, this paper contributes to the emerging debate on ‘doing internet governance’, broadening our focus to include user-based and decentred practices. I argue for the political relevance of CPs by showing how they enact decentred threat-scenarios for a non-expert public.


Introduction
This paper starts from the assumption that understanding the governance of networked technologies and related societal values such as privacy and security requires us to go beyond a focus on legal and institutional aspects. Indeed, a more comprehensive understanding of the politics of the internet 'requires unpacking the micro-practices of governance as mechanisms of distributed, semi-formal or reflexive coordination, private ordering, and use of internet resources' (Epstein, Katzenbach, & Musiani, 2016, p. 4). The success of privacy and security is not only determined by legislation but is heavily dependent upon individual practices used to counter surveillance and retain privacy (Bauman et al., 2014; Marx, 2015; Bellanova, 2017). However, these practices are often perceived as highly complex, and end users are often hesitant to use seemingly complicated tools. The focus of this article lies on one practice, cryptoparties, which attempts to combat these anxieties and teach privacy and security tools to the layperson. In doing so, this paper offers a valuable empirical study of a largely unknown phenomenon (but see Kannengießer, 2019) and corroborates previous pleas in this journal (Epstein, Katzenbach & Musiani, 2016) that more attention must be paid to micro-practices.
Cryptoparties (CPs) are not actually 'parties' but rather open meetings where individuals can get the help they need to improve their digital privacy and security.
These meetings happen all around the world, mostly in public spaces such as cafes or universities. CPs originated in 2011 in Australia, but today most of them occur in Europe (CryptoParty, 2013, p. 13). Sigrid Kannengießer summarises the rationale of these activities as practices that 'aim to empower ordinary people by on the one hand informing them about critical aspects of datafication processes and on the other hand enabling them to engage with their digital media technologies, encrypt those and online communication processes' (Kannengießer, 2019, p. 12; see Loder, 2014, p. 814). CPs create an entry point for studying everyday practices and how they fit into wider political debates. In the case of CPs, we can observe how mundane practices such as choosing a secure browser or implementing better passwords are key to enacting privacy. Being a 'mechanism of civic engagement', CPs qualify as a practice of internet governance under the definition of Epstein and colleagues (Epstein, Katzenbach, & Musiani, 2016, p. 7).
This article brings the political significance of CPs into the spotlight and explains the role of CPs in the broader development of privacy and security controversies.
Political science has been rather silent on the topic of CPs, with most studies instead focusing more broadly on encryption and internet governance (Herrera, 2002; Monsees, 2020; Schulze, 2017; Myers West, 2018). Attending to the practices of CPs also allows the conceptual focus to shift away from institutions and towards more mundane and decentred practices.
I demonstrate throughout the article how CPs enact a diffuse kind of security politics where neither threat nor countermeasures work through one central institution but through mundane, decentred practices (Huysmans, 2016). In the following section, I draw on more recent contributions in the field of internet governance and international relations in order to argue that a sensitivity towards mundane practices is crucial for understanding the creation of internet security and privacy.
The empirical study is mainly based on participant observation, the methodology of which I lay out in the second section. The main part of the paper presents the results of this empirical study. I argue that the specific format of cryptoparties allows them to teach relevant privacy tools and adapt to both the abilities of end users and a changing socio-technical environment. I demonstrate how CPs themselves are decentred and can adapt over time: CPs enact a decentred threat scenario that focuses less on institutions and more on individuals and their needs.
The paper therefore provides novel empirical insights while at the same time showing how a shift in perspective to mundane security practices can enrich the study of internet governance.

Internet governance: on the political significance of decentred practices
Internet governance (IG) is usually analysed as a form of 'multistakeholderism', defined as 'two or more classes of actors engaged in a common governance enterprise concerning issues they regard as public in nature, and characterized by polyarchic authority relations constituted by procedural rules' (Raymond & DeNardis, 2015, p. 573; see also Hofmann, 2016). Private entities, NGOs and hybrid organisations such as ICANN, in addition to national governments, are all involved in the governance of the global infrastructure that constitutes the internet (DeNardis, 2010; Mueller, 2010). However, in line with more recent contributions that try to broaden the empirical and analytical focus of IG, I demonstrate in this paper the value of looking at less institutionalised practices. 1 For example, a special issue of this journal has shown the need to focus on 'doing internet governance' (Epstein, Katzenbach, & Musiani, 2016). Much IG research remains on the institutional level, which 'largely overlooks the mundane practices that make those institutions tick, thus leaving important blind spots in both conceptual and substantive understanding of the practices and power arrangements of IG' (Epstein, Katzenbach & Musiani, 2016, p. 4). Taking it even further, van Eeten and Mueller argue that the constitution of the field of IG creates systemic blind spots: the specific boundaries between IG and other fields limit the scope and prevent deeper engagement with other fields. There is 'a tendency to think of governance as being produced by, or taking place in, formal organizations with explicitly institutionalized rules and procedures' (van Eeten & Mueller, 2012, p. 727). IG is, then, always linked to institutional settings in which IG is 'explicitly the topic of discussion' (ibid.). Limiting the analytical focus in this way leads to a biased understanding of these formal structures and underestimates the political significance of informal practices (van Eeten & Mueller, 2012, p. 730).
Consequently, taking an analytical view of seemingly insignificant practices gives us a more thorough understanding of how 'privacy' or 'security' are enacted (Christensen & Liebetrau, 2019). As I argue throughout the article, we can then see how decentred practices are, in fact, politically significant.
Cryptoparties are not anchored in one central organisation but are rather a decentralised, global form of technological activism and education. My article provides both a much-needed empirical study of CPs and an illustration of the value of expanding the scope of IG. I investigate how activists and citizens come together and how knowledge about privacy and security, core aspects of IG, circulates and is put into practice. Such a 'bottom-up perspective focuses on the mutual adjustments we make in our daily social life' (Hofmann, Katzenbach, & Gollatz, 2016, p. 1414), thereby illustrating the ordering effects of 'day-to-day practices that organize our social lives' (idem). Such a shift in perspective illuminates how, for example, 'security' results from a multiplicity of practices, actors and technologies, and not solely from governmental institutions and legal regulations (Hofmann, Katzenbach, & Gollatz, 2016). Mikkel Flyverbom expanded on these insights by drawing on the field of science and technology studies. For him, a crucial issue that has been neglected by IG concerns 'the entanglement of technology and social practices and the ordering effects of processes of digitalisation and datafication' (Flyverbom, 2016, p. 2). Flyverbom argues that an understanding of regulation as 'institutionalised, deliberate and goal-oriented interventions by public or private actors' is too narrow. Indeed, the de facto enactment of 'bigger' issues such as privacy and security is not only a result of governance efforts but also largely relies on individual actions. The political significance of privacy and security lies not only in a particular institutional set-up but in the micro-practices of individuals (Solomon & Steele, 2016; Isin & Ruppert, 2015). These practices are important for maintaining security but also shape the insecurities people experience (Guillaume & Huysmans, 2013; Selimovic, 2019).
Mundane practices such as maintaining firewalls or spam-filters are as important as legal regulations when it comes to securing networked technology.
These insights motivate the scope of this article in analysing informal, decentralised practices that shape privacy and security. The objective of this article is not to weigh the relative importance of institutionalised vs non-institutionalised practices. The aim is to provide a more thorough understanding of how the actions of users are shaped by more than formal rules and legislation.

Methodology and description of the field
From my theoretical discussion on the importance of a bottom-up perspective on decentred practices, it follows that I needed a methodological toolkit that allowed me to capture these practices. I combined document analysis, participant observation and informal interviews (Gillespie & Michelson, 2011). The organisers were open to answering my questions, and I was able to observe the CPs and ask questions. I also had the chance to ask more detailed questions before the CPs and at one meeting where the organisers planned the next CP. This allowed me to gather background information about the organisers, their views on CPs and how the CPs developed.
I attended three CPs which took place in two different cities, as well as one planning meeting for a CP. 2 The meetings lasted between one and four and a half hours and took place between July and November 2019. 3 I took only written notes while attending the meetings. Since my interviewees were sensitive about privacy, I was unable to record any interviews, but I took detailed notes which I expanded on immediately after the end of each CP. These field notes form the base of the results described below. I only use direct quotes for statements that I was able to write down directly during the CPs (see Emerson, Fretz, & Shaw, 2001). Several participants told me about being rather overwhelmed by the complexity of internet security (for an exploration of gender stereotypes in the hacker scene see Tanczer, 2016). They also described a feeling of anxiety and the need to 'start somewhere' because they lacked an overview of possible threats. 6

2. I also attended two hackerspaces in the hope of gaining access to people formerly involved in CPs. Even though these visits were helpful in gathering insights into the culture of the hacker scene (see Kubitschko, 2015; Coleman, 2010), they did not give me further access to people involved in CPs.

3. In order to protect the privacy of the participants, I refrain from mentioning the locations of the CPs.

The conduct of cryptoparties
An Australian woman working under the pseudonym of Asher Wolf initiated the first CP out of an interest in digital privacy (Poulsen, 2014;Radio Free Europe, 2012). It is interesting to note that she was not a 'hacker' or an expert but started the movement out of an interest in learning about privacy and security practices.
Today, CPs are organised around the world with the most regular parties occurring in Europe (CryptoParty, 2019d). CPs do not rely on one centralised organisation.
Many are organised by people who were previously active in the hacker scene, and most organisers work in IT.
4. One CP that I attended indeed had zero participants. Since people do not need to sign up beforehand, this can happen.
5. I am aware that my selection is not representative in quantitative terms. However, based on my interviews and the existing literature I could confirm that the CPs I visited were typical in a ' qualitative' sense.
6. Women also face challenges such as harsher and more violent forms of trolling (Herring et al., 2002).

One can generally distinguish between two kinds of CPs. Many CPs are organised by activists and advertised on a wiki, the central website with all information about how to organise a CP (CryptoParty, 2019b). Often, these CPs recur on a monthly or bi-weekly basis in the same venue, typically a public space such as a cafe, cultural centre or hackerspace. However, some are conducted by political parties, interest groups, academic conferences or other types of independent organisations. These CPs are not publicised to potential participants through the main website but through the specific network of that organisation.
The CPs themselves differ in their particular way of 'teaching' technological tools.
One CP might provide mostly one-on-one tutorials, whereas others split the whole group into smaller discussion groups, and others might focus on one specific topic. Participants can let an organiser know in advance if they need help with something specific. An organiser will then help with that issue. The small groups cover a range of issues, from basic 'safe surfing tools' to a more abstract introduction into 'how the internet works' to detailed explanations of email encryption or of how to use programmes such as Tor or Tails 7 to protect anonymity to a larger degree. Participants learn, for example, about add-ons like 'https-everywhere' or about the advantages of certain web browsers when it comes to privacy. The organisers call this 'digital self-defence' or the development of a 'security culture'. 8 A CP lasts a few hours, and the atmosphere is very informal and relaxed, allowing participants to ask their questions and raise specific concerns.
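To give a concrete sense of the kind of 'digital self-defence' taught in these small groups, the following sketch illustrates one commonly recommended practice: generating a random passphrase from a word list (diceware-style). The short word list and function name here are illustrative, not taken from any CP material; real sessions would point participants to a full list of several thousand words.

```python
# A minimal sketch of passphrase generation as taught at many privacy
# workshops. The tiny WORDS list is a placeholder for illustration only.
import secrets

WORDS = ["correct", "horse", "battery", "staple", "lava", "keyboard",
         "onion", "relay", "cookie", "browser", "party", "cipher"]

def passphrase(n_words=5, wordlist=WORDS):
    # secrets.choice draws from the OS's cryptographically secure RNG,
    # unlike random.choice, which is predictable to an attacker.
    return "-".join(secrets.choice(wordlist) for _ in range(n_words))

print(passphrase())  # e.g. 'onion-staple-party-relay-cipher'
```

With a real list of around 7,700 words, five randomly chosen words yield roughly 64 bits of entropy while remaining far easier to memorise than a string of random characters.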
My fieldwork shows that certain ideas are commonly mentioned (e.g., that 100% security is impossible) and that certain tools are frequently taught (e.g., selecting a safe browser for surfing the internet). These commonalities go back, at least in part, to a code of conduct which is published on the central wiki (CryptoParty, 2019a). All interview partners referred (at least implicitly) to the Code of Conduct. At two of the CPs that I attended, the Code of Conduct was explained at the beginning. 9 The Code specifies that harassment is not tolerated and that CPs should be open to the public. However, there are also more specific rules such as 'Other People's Keyboards Are Lava - Don't touch anyone's keyboard, but your own' (CryptoParty, 2019a). This rule has a pedagogical rationale: participants learn more if they have to do everything on their own. For privacy reasons, it is also considered a bad habit to use other people's devices.

7. Tor (The Onion Router) allows for anonymous surfing by bouncing a user's data and requests through a set of relay servers. References to the 'dark web' usually indicate browsing via Tor. Tails is a system that can be booted from, for example, a USB stick and relies on Tor for even greater privacy. However, it is more time intensive to set up: the installation process, done at one CP, took several hours.

The politics of cryptoparties Diffused politics
The previous section outlined the conduct of CPs in detail. In this section, I will detail specific aspects of CPs to analyse how we can understand these activities as politically significant practices that are relevant to internet governance.
CPs first developed in response to a very specific controversy around Australian legislation but later spread in the context of a global controversy about commercial and state-led mass surveillance (Poulsen, 2014). They saw renewed interest in the aftermath of the Snowden revelations. Indeed, some of my informants told me that they held CPs with several hundred participants immediately after the Snowden revelations. Today, internet security and privacy are part of most people's daily routines: entering passwords, shielding cameras, and deleting cookies are just a few of the most relevant practices. We can see that the context of CPs evolved from a very particular concern with a piece of legislation to a more diffuse understanding of where the problem lies, including the realisation that there is no "one-size-fits-all" recipe for picking the best privacy tool. One organiser illustrated this by emphasising that every participant has different needs when it comes to security measures 10 and that one needs to develop a security culture. 11 Security culture refers to the idea that security is always situational and always both affects and is affected by other people (Musiani & Ermoshina, 2017, p. 54).
What is striking, however, is that specific legislation and events are rarely mentioned. For example, I expected the Snowden revelations to be a core event for most participants, but when asked about them, they said that they were either already active at the time or only joined the CPs later. 14 A consistent reason given for activism at the CPs and the two hackerspaces I visited was a general concern that politicians, on the whole, do not have much tech expertise. The concern seems to focus on politicians as a group and the general political context, not on specific people or events. The desire to learn about technological tools is thus motivated by the larger societal context rather than being a reaction to a distinct experience. There is an observable set of diffused controversies and threat-scenarios around surveillance and privacy (for a discussion of the role of dispersion in surveillance society, see Huysmans, 2016). CPs react to a type of 'political situation' (Barry, 2012) in which digital practices of internet security and privacy become a matter of concern. Importantly, this political situation is not characterised by only one particular problem but by a constellation of security issues: government surveillance, data collection by private companies, phishing and targeted hacking attacks. CPs are a result of, and deeply embedded in, public controversies revolving around internet security, privacy and the roles of both global ICT companies and secret services.
The relevance of CPs for understanding internet governance lies in the way they illuminate the importance of mundane practices (and not only top-down steering) in the enactment of privacy and security on a broad scale. As discussed in the conceptual section of this paper, a bottom-up perspective gives us a more thorough understanding of the role CPs play in enacting a specific understanding of security and privacy. Activists and experts alike acknowledge that users need to account for their own personal threat-scenario. Hence there exist no universal ideal, technologies or practices, but only solutions appropriate to each individual situation (see Musiani & Ermoshina 2017, p. 69;Ermoshina & Musiani, 2018). Hence, internet security and privacy are not only seen as a function of legal regulation but also something that needs to be established anew by each individual in every situation.
It also becomes clear that CPs are spaces in which diffused politics are enacted.
14. In the context of the Snowden revelations, Gürses et al. (2018, p. 581) observed that while technologies were contested, the larger economic and socio-structural questions were hardly debated. Hence, they argue that ultimately the encryption debate after Snowden had depoliticising effects (for a different assessment see: Monsees, 2020).
Rather than constructing a centralised threat-scenario (the state! Facebook!), what emerges is a diffuse and decentred image both of prevailing threats and of their solutions.

'Experts' and 'participants'
One core issue for CP organisers is the relationship between those who teach tools during CPs and those who seek to learn them. The general idea underlying CPs is that anybody can organise one, and my conversations with the organisers revealed that they would prefer this to be the case. In practice, very few people organise CPs, and they tend to be the ones with expertise in the field. This is relevant since the initial intention for CPs was not that a few 'experts' teach non-experts but that citizens come together in order to learn together. Asher Wolf, the founder of CPs, was not an expert herself. Those who teach tools are called 'angels' (CryptoParty, 2019c), a term that emphasises their helpful, friendly manner rather than characterising them as 'geeky' experts. Currently, CPs are not as egalitarian as originally imagined. In reality, the organisers guide participants through the implementation and use of technology, sometimes even in a lecture-style format. 15 On a more fundamental level, founder Asher Wolf quit CPs because of the persistent misogyny that, she felt, devalued the perspectives of women and laypeople (mati & Wolf, 2012). Less drastically, Kannengießer observes that 'there are strong hierarchies persisting between "teachers" and "students"' (Kannengießer, 2019, p. 7). The tension between the ideal of a self-organised communal effort and the actual practice of learning in more hierarchical ways is crucial for understanding the rationale of CPs.
Whereas CPs cannot function without some kind of hierarchy, the organisers explicitly work against their own status in order to create an open space, resisting the tendency CPs have of defaulting to experts. One informant told me he deliberately intends cryptoparties to 'not look too professional'. 16 Another man, explaining the tenets of email encryption to a small group of people, fostered a discussion by deliberately limiting his lecturing. 17 One episode from the last CP I attended illustrates nicely how the original idea of CPs as a communal space of learning persists: a woman who had only attended a few CPs and otherwise did not have much prior knowledge announced in the opening round that she would be leading a small group on some issues she was familiar with. She said she could teach how to create secure passwords and some basic knowledge about how the internet works. In her own words, this was 'pure empowerment'. 18 This episode occurred at a CP that was only open to women, trans and non-binary people, and contributes anecdotal evidence that such spaces can sustain the original egalitarian ideal.

15. CP 1 and CP planning meeting.

16. CP 2.

17. CP 2.

In the previous section, I showed how CPs contribute to the emergence of decentred threat-scenarios and simultaneously offer a solution. In this section, I looked at how ideas about relational risks also feature in how CP participants relate to each other. Again, the political relevance of CPs does not primarily lie in the way in which they feed back into governmental decision-making processes or their impact on new legal regulation. Rather, their relevance lies in the way they '[embed] concepts such as security and privacy' (Ermoshina & Musiani, 2018, p. 18) in a wider context, thereby influencing both our perception of these concepts and the practices we deem appropriate in enacting them. The next section zooms in on the issue of privacy.

Cryptoparties without encryption?
While privacy is consistently one of the core goals of CPs, the practices and tools used to achieve and improve it have changed over time. In order to understand the political significance of these technological changes, a short detour into the history of privacy and encryption is necessary.
Cryptoparties, as the name suggests, originally revolved mainly around encryption.
PGP (Pretty Good Privacy; also GnuPG) is the traditional way to encrypt email, based on strong public-key cryptography. 19 It has been politically contested ever since the US government tried to constrain its usage. The objective in regulating encryption is to determine who has access to what kind of information. Cryptography was first a military technology, but its applications multiplied with the emergence of the internet (Kahn, 1997; Singh, 2000). Governments around the world, but especially the US, tried to regulate the use of strong encryption (Diffie & Landau, 2007). Crypto Wars is the umbrella term for the controversies around who gets to decide what kind of encryption is available for public use (FindLaw Attorney Writers, 2012; Levy, 1994). The primary question in these early debates was whether encryption should be strong enough to prevent government access to digital communication (Diffie & Landau, 2007; Levy, 2002). Diana Saco has shown that activists fighting for stronger encryption were part of a libertarian hacker scene that was interested in keeping the state out of the internet (Saco, 2002). Ultimately, the use and spread of email encryption programmes such as PGP was legalised. Even though regulations loosened, the hopes of activists for a more widespread usage of encryption by end users did not materialise (Diffie & Landau, 2007). Today, there is renewed interest in encryption and data protection, due mainly to the revelations by Edward Snowden (Schulze, 2017). 20 The resulting debates and pressure by end users have led to the establishment of new products and services such as encrypted messaging services.
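The core idea of the public-key cryptography underlying PGP can be illustrated with a toy example: anyone can encrypt a message using an openly shared public key, but only the holder of the matching private key can decrypt it. The sketch below uses textbook RSA with deliberately tiny numbers; it is a teaching aid only, not PGP itself, and would never be safe for real use (no padding, trivially small keys).

```python
# Toy illustration of the public-key principle behind PGP-style encryption.
# Textbook RSA with tiny numbers: for teaching only, never for real use.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p, q, e=17):
    # Public key (n, e) can be shared with anyone; exponent d stays private.
    n = p * q
    phi = (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)
    assert g == 1, "e must be coprime to phi"
    return (n, e), (n, d % phi)

def encrypt(message, public_key):
    n, e = public_key
    return pow(message, e, n)

def decrypt(ciphertext, private_key):
    n, d = private_key
    return pow(ciphertext, d, n)

public, private = make_keys(61, 53)   # n = 3233, far too small in practice
ciphertext = encrypt(42, public)
assert decrypt(ciphertext, private) == 42
```

Real PGP/GnuPG adds padding, key sizes of thousands of bits, hybrid encryption and a web of trust on top of this principle, which is part of why end users find it so hard to use correctly.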
This shift to a broader concern with privacy and data protection mirrors the conduct of CPs. Whereas early CPs focused heavily on email encryption and the use of PGP as an end user solution (CryptoParty, 2013), participants in the CPs I visited showed concern for a variety of vulnerabilities: the collection of data by corporate actors, secure internet banking and targeted hacking attacks. As discussed in the previous two sections, this shift dovetails with the way in which CPs enact decentred threat-scenarios. The changing societal context goes hand in hand with the changing availability of products. But there also seems to be a more technical reason why PGP and email encryption are no longer a core privacy technology. If used incorrectly, PGP is harmful, and hence it is not always taught at CPs. Indeed, email encryption no longer constitutes the main part of CPs. Only two organisers still consider PGP the best (and only) tool to send email securely. According to them, despite PGP being complicated, it is still the most valuable tool for privacy in a digital environment. Even hackers consider PGP too complicated (Whitten & Tygar, 1999). During my visits to two hackerspaces, only one person told me that they still used PGP.

19. For a more in-depth description of the principle of public key cryptography, see Monsees (2020, pp. 61-63).

20. That is why I expected the Snowden revelations to be identified as a crucial event by the activists. However, the importance of his revelations was played down in the interviews.

This section showed how the political situation, the assessment of encryption technology and the diverse needs of participants all require comprehensive tools to enhance privacy and security. In contrast to the earlier battles of the 1990s, the current issues can no longer be understood as a simple controversy about one particular technology such as PGP. Coming back to the insights from the first empirical section, we can see how learning and adapting to an ever-changing political and technological landscape is a core feature of CPs.

Conclusion
The present article is one of the few empirical studies on cryptoparties to date. It showed how CPs are embedded in controversies around encryption and surveillance that date back to the Crypto Wars of the 1990s and are still present in the current discourse (Schulze, 2017). Cryptoparties started with a strong focus on teaching email encryption, but my empirical observation revealed that current CPs focus on a multiplicity of issues. This shift coincides with observable technological changes. Presently, encryption is much more likely to be embedded as part of other tools. The focus is less exclusively on email encryption (as it was in the controversies of the 1990s) and more on how encryption can be part of, for instance, messaging tools. Indeed, encryption is only one part of the solution when thinking about safe surfing, private messaging or protecting one's anonymity.
This also means that a narrow focus on institutional aspects and legal regulations might miss the security and privacy maintenance done by end users on an everyday basis. Understanding this change in the de facto use of tools and their spread requires studying the mundane practices of end users. The focus on the practice of CPs revealed the importance of the idea of establishing a 'security culture'. For the organisers, the aim is not only to teach specific tools but to increase awareness of the multiple vulnerabilities that users might encounter. The organisers want to show how a higher level of security is possible. Some participants were scared and overwhelmed, prompting the organisers to teach simple tools that still help to increase privacy and security. In line with previous research, it becomes clear that the idea is not to teach tools that establish security once and for all but to make participants aware of their own threat-model and the multiplicity of adversaries (see also Ermoshina & Musiani, 2018). This became especially visible in the small groups that discussed 'how the internet works'. Rather than teaching one specific tool, the idea was to increase knowledge about technology and create awareness of one's specific threat-model.
This speaks to a similar observation William H. Dutton has made about the need for a 'security mindset'. According to him, 'In cyber security, the risks are more difficult to communicate, given the multiplicity of risks in particular circumstances' (Dutton, 2017, p. 3), requiring us to rethink how to communicate about these threats. In the cyber context, threats are more diffuse and often not directly felt. The core task is, then, to develop a 'mindset' comprising beliefs, attitudes and values concerning cyber security. While I do not think that Dutton's solution of using PGP everywhere is attainable, for the reasons described above, his plea for more encompassing research and policies for sensitising end users is certainly valid. Both future policies and citizen engagement practices can learn from CPs when negotiating the difficult terrain of teaching complex technologies in a political situation where threats to privacy and intrusions come from everywhere. The openness and adaptability of CPs are certainly helpful in an environment characterised by high complexity. CPs that focus on women, transgender and non-binary participants are especially able to create an open environment where a diverse ensemble of laypeople feel welcome. Mirroring these insights, it becomes clear that both the conduct and the study of internet governance encompass micro-practices and their evolution, and increasingly move beyond a focus on institutions.