Leveraging interdisciplinary methods for evidence collection in enforcement: Dark patterns as a case study
Abstract
“Dark patterns” are manipulative, deceptive design practices deployed in online services to influence users’ decisions towards undesired or negative outcomes. Interdisciplinary by nature, dark patterns implicate concepts of autonomy and choice from law, human behaviour from psychology and the social sciences, and design and human-computer interaction (HCI) from technical fields and industry. A growing body of enforcement actions and regulatory fines worldwide, discussed within this article, seeks to minimise the impact of dark patterns. However, despite this regulatory momentum, it remains unknown to what extent scientific research methods and evidence types influence regulatory decisions, which is relevant for effective evidence-based enforcement.
As such, dark patterns present a case study for reflecting upon narrowing the academic-enforcement divide. Our team spans design, HCI, computer science, and law, and examines investigatory methodologies to derive insights for strengthening collaboration between scholars and regulators. This interdisciplinary work first considers investigatory methods from academia and industry, then those inferred from dark patterns enforcement cases, relating the methods used by both groups. We discuss challenges and opportunities for narrowing the gap between researchers and regulators, propose suggestions for both scholars and enforcers to tighten feedback loops, and additionally highlight informal investigation methods as an opportunity to strengthen collaboration.
This paper is part of The craft of interdisciplinary research and methods in public interest cybersecurity, privacy, and digital rights governance, a special issue of Internet Policy Review, guest-edited by Adam Molnar, Diarmaid Harkin, and Urs Hengartner.
Introduction
Both scholars and enforcers rely on evidence to support their claims. Scholarly research can inform enforcement practices implicitly, but can also serve as case-specific evidence. Further leveraging each discipline’s treatment of, and collection methods for, evidence towards stronger collaboration, however, remains an ongoing effort. In particular, questions remain as to what extent empirical methods—such as those frequently employed in human-computer interaction (HCI) scholarship and industry user experience (UX) research—can be more directly translated into enforcement actions, support effective evidence-based enforcement, or serve as evidentiary support in legal proceedings and the resulting case law. This lack of methodological clarity presents a valuable opportunity for HCI and legal scholars to reflexively analyse which interdisciplinary evidentiary methodologies and collaborative dynamics regulators and legal experts might regard as relevant and sufficiently authoritative.
This paper explores the methodological innovations, successes, and challenges of HCI empirical approaches for collecting evidence to identify and characterise unfair, deceptive, or anti-consumer design practices in user-facing technologies. Of particular interest are methods that could inform and support regulators, legal practitioners, and other relevant stakeholders in the enforcement of rules governing user-facing designs. These design practices are often referred to as “dark patterns” (Brignull, 2010; Gray et al., 2018). Dark patterns are manipulative, deceptive design practices deployed in online services aimed at influencing the decisions of users (Mathur et al., 2021; Gray et al., 2018) about their purchases (Mathur et al., 2019), use of time or attention (Roffarello, 2023), and disclosure of personal data (Kubicek et al., 2024; Soe et al., 2020; Tran et al., 2024). These design practices bring a unique potential for influencing user behaviour, undermining user agency, leading to privacy loss, and disparately impacting vulnerable or disempowered communities, among other harms.
Research in HCI has been central to the study of dark patterns, contributing foundational knowledge on the nature and types of dark patterns, developing an ontology of dark patterns (Gray et al., 2024) that harmonises regulatory and academic taxonomies, and conducting empirical studies (such as in situ analyses, participant studies, and broader measurement studies, as will be further described in this article) that demonstrate the prevalence and impact of dark patterns in specific contexts. Such scholarship has been involved with community organising efforts (such as in-person workshops (Schraffenberger et al., 2024; Acar et al., 2025; FTC, 2022a) and ongoing remote community engagement like a centralised Slack channel), bringing together stakeholders across multiple academic disciplines, industry, and enforcement to deepen understanding and collectively address the societal risks posed by dark patterns.
Regulatory decisions and legal frameworks have increasingly focused on dark patterns in recent years. Several new pieces of legislation, such as the EU Digital Services Act (DSA) in Recital 67 and Article 25 (Regulation 2022/2065), the Digital Markets Act (DMA) in Article 13 (Regulation 2022/1925), the Data Act in Recital 38 (Regulation 2023/2854), and the Artificial Intelligence Act (AI Act) in Article 5 (Regulation 2024/1689), include provisions prohibiting dark patterns or related designs (Car & Cassetti, 2025). For example, the AI Act prohibits “subliminal” or “purposefully manipulative or deceptive techniques” within AI systems that impair user decision-making (Regulation 2024/1689, Articles 5(1)(a, b)). Most notably, the DSA provides a definition of dark patterns, including detailed specifications for certain types of prohibited behaviours (Recital 67; Article 25(3)). In the United States, states have begun to explicitly prohibit dark patterns, as in the California Consumer Privacy Act [CCPA] (2018) as modified by the California Privacy Rights Act [CPRA] in §1798.185.19(c)(iii) (2020) and the Colorado Privacy Act [CPA] (2021, §6-1-1301(5)(c)). Enforcers have also brought cases against dark patterns: the US Federal Trade Commission increasingly applies its extant consumer protection authority to dark patterns (e.g. FTC v. Vonage, 2022), and EU member states also increasingly enforce against dark patterns via data protection (Deliberation No. SAN-2021-023, 2021; Deliberation No. SAN-2021-024, 2021) or consumer protection (Decision ACM/22/179622, 2024) authorities.
Though the legal and academic domains naturally take different methodological approaches to suit differing needs (e.g. single- or limited-party investigatory scope for cases, versus sample size breadth that scholarship often seeks for generalisability), examining these approaches may reveal commonalities and opportunities. As such, we present the topic of dark patterns as a case study for illustrating extant and potential interdisciplinary efforts towards technology regulation and enforcement, and to encourage further collaboration between academic and regulatory practice.
This article proceeds as follows. First we introduce dark patterns scholarship and extant collaborative efforts. Next we describe our methods, bringing together academic and legal bodies of knowledge towards understanding evidentiary approaches in both; we then map these investigatory methods across disciplines. More specifically, we analyse HCI research methods for detecting, identifying, or characterising dark patterns from two sources: (1) we examine existing methodologies from the dark patterns literature, building upon a prior systematic literature review (Gray et al., 2023); (2) we draw from the lessons and experience of common user experience research (UXR) methods employed in industry settings to provide evidence for design decisions. For the legal context, we draw from EU and US dark patterns case law, then examine the evidence and methods employed by regulators to substantiate the presence of these patterns within the context of each case. Mapping HCI methods to these cases, we then discuss how the evidentiary methods used in case law converge with or diverge from those used in HCI academic and practitioner methods. As such, we perform interdisciplinary research across law and HCI, revealing challenges and opportunities that arise from sectoral and procedural differences between scholarship and law. Finally, we discuss the challenges of utilising scholarly evidence in case law, propose recommendations for the use of scholarly methods in formal investigations, and identify opportunities for using scholarly methods and evidence in semi-formal investigations for regulators to assess organisations’ behaviours.
Dark patterns scholarship
To measure and observe dark patterns, empirical research has evaluated users’ reactions to deceptive designs (Bongard-Blanchy et al., 2021; Gray et al., 2021b; Luguri & Strahilevitz, 2021) and quantified the impact they have on users’ decisions (Bermejo Fernandez et al., 2021; Bielova et al., 2024; Graßl et al., 2021; Habib et al., 2022; Machuletz & Böhme, 2020). A systematic review of empirical studies relating to dark patterns by Gray and colleagues (2023) notes that a diverse range of domains or use contexts are impacted by these manipulative, deceptive, or coercive practices, including mobile apps and websites (Di Geronimo et al., 2020; Gunawan et al., 2021; Mathur et al., 2019), voice assistants (Owens et al., 2022), internet of things (IoT) devices (Kowalczyk et al., 2023), mixed reality (Krauss et al., 2025), social networks (Mildner & Savino, 2021; Mildner et al., 2023; Schaffner et al., 2022), games (Aagaard et al., 2022; Niknejad et al., 2024), AI assistants (Millers et al., 2024), social robots (Lacey & Caudwell, 2019), and privacy control mechanisms like consent banners (Bouhoula et al., 2024; Gray et al., 2021b; Nouwens et al., 2020; Toth et al., 2022).
Dark patterns scholarship increasingly brings together members of many fields in transdisciplinary dialogue, both within academia (between academic disciplines) and across academia, industry, and different legal domains. An updated ontology of dark patterns (Gray et al., 2024) built by an interdisciplinary team harmonises numerous academic and regulatory taxonomical sources. A recent taxonomy of harms caused by dark patterns consolidates both regulatory and HCI literature (Santos et al., 2025). In particular, dark patterns bring together legal and HCI scholars to align regulatory guidance with the design realities of constructing digital systems (Bielova et al., 2024b; Gray et al., 2021b; Bielova, 2022). Other such work additionally discusses privacy dark patterns in light of potential legal remedies (Gunawan et al., 2022).
The use of research methods for collecting evidence of dark patterns in legal actions has already been noted by leading dark patterns scholars, including expert witness work using heuristic analysis (King & Stephan, 2021; King, 2012, 2016) in two cases in the United States (FTC v. Amazon.com, 2014; FTC v. Commerce Planet, 2011) and other work using qualitative content analysis, as noted in an expert report for State of Arizona v. Google LLC (Gray, 2022).
Academic-enforcement collaboration
Avenues for mobilising scholarly work for enforcement settings (and vice versa) cover a spectrum of processes and activities. Co-productive research models, or coproduction, are one avenue (Innes et al., 2019; Large, 2024; McCabe et al., 2023; Yu, 2020). Other avenues for academic-enforcement engagement range from “applied” research using regulation and enforcement as the topic in question (e.g. compliance-oriented audits in GDPR studies (Kubicek et al., 2024; Soe et al., 2020; Tran et al., 2024), non-exhaustively), to extensively citing scholarship in informal guidance (Journal 2021/C 526/01), to submitting scholarship in response to calls for public rulemaking commentary (e.g. Notices of Proposed Rulemaking in the US and Public Consultations in the EU), to the explicit use of scholarly evidence in enforcement.
Regarding the latter use of evidence in formal proceedings, the EU Better Regulation Toolbox describes “evidence” in reference to “multiple sources of data”. It includes “information and knowledge, including quantitative data such as statistics and measurements, qualitative data such as opinions, stakeholder input, conclusions of evaluations, as well as scientific and expert advice” (European Commission, 2023, p. 8). However, enforcement authorities tend to adopt a narrower evidentiary scope. Case law customarily cites prior rulings as the foundation for interpretation and decisions, ensuring consistency and predictability in the legal system, and courts and enforcers engage with scholarly work only in a limited way. In particular, court procedural rules condition the weight given to scientific works. The case law analysis methodology of the Supreme Court of Estonia notes that “numeric data collected and used for the analysis” is considered “illustrative material”; further, “if necessary and possible... Appropriate popular scientific materials may be used in moderation” (Riigikohus [Supreme Court of Estonia], n.d.). Similarly, the Statute of the International Court of Justice considers “the teachings of the most highly qualified publicists of the various nations” as “subsidiary means for the determination of rules of law” (United Nations, 1945, Article 38(1)(d)). These citations demonstrate that scientific research is technically suitable for citation in judicial and administrative cases, though limited to subsidiary or illustrative roles, at least in the EU. In the US, scientific methods are incorporated as the basis of expert testimony (Fed. R. Evid. 702), with the inclusion of scientific evidence itself still remaining a challenge (Albright, 2023).
For evaluating expertise, the Daubert standard subsumes the prior Frye standard for determining the admissibility of scientific evidence and testimony: Daubert strengthens reliance on methodology and the underlying science rather than a test for general in-field acceptability as was initially set by Frye (Daubert v. Merrell Dow Pharmaceuticals, 1993; Frye v. United States, 1923).
Methods
We consulted two bodies of knowledge in this work: methods derived from academic scholarship and UX research (UXR), and decided dark patterns enforcement cases. A diagram detailing these sources is presented in Figure 1. Regarding positionality, scholarly or research-oriented literature was first examined by HCI and/or computing scholars in our team, whereas case law was examined by legal scholars. However, as our team consists of researchers with prior interdisciplinary experience, we iteratively discussed analyses across the full team.

Understanding HCI and UX Methods
We start with the Gray et al. (2023) data set of 79 dark patterns-related studies published from 2014 to 2022, which contributes to the detection or characterisation of dark patterns and spans a wide range of fields, including human-computer interaction, computing, game studies, privacy and security, and law, among others. Gray et al. (2023) provide a preliminary categorisation of empirical methods; building upon these, we further cluster the identified scholarly methods into four categories. For each category, we articulate and label its primary purposes (the investigatory goals served by those methods); a non-exhaustive list of evidence types commonly derived with that method type (which may correspond both to compiled data sets for a given study and to analytical output qualifying as new evidence in its own right); the unique strengths that the methodology group lends to scientific inquiry; and, finally, the known limitations of the methodology type (typically the inverse of its strengths).
We treat these categories as distinct to facilitate our own methodological reasoning, while acknowledging the rising popularity of mixed-methods and interdisciplinary approaches for contextualising dark patterns research findings (like those in this paper). We believe the clarity these distinctions provide serves this paper’s objective of working towards multi-disciplinary alignment and understanding.
Finally, we consider formative texts in industry UX research (UXR) methodology, given that dark patterns are inherently a UX subject. UXR represents a growing area of industry work focused on the design and implementation of studies that can lead to product insights, with HCI practices commonly performed by UXR specialists in industry. We extract methods from a smaller set of two longer source documents that describe UXR methods, considering them against the scope of dark patterns scholarship: the book UX Research: Practical Techniques for Designing Better Products (Nunnally & Farkas, 2016) and the article When to use which user-experience research methods (Rohrer, 2022).
Analysing legal case studies
The law and policy side of the dark patterns literature spans regulation, governmental reports, regulatory guidelines, press releases, case law, and more. To evaluate insights regarding methods of evidence collection for dark patterns, we restrict our search to case law, given its inherently investigatory nature. We begin with the data set of case law from deceptive.design/cases (Brignull et al., 2023), a popular and centralised repository of dark patterns-related case decisions managed by an interdisciplinary team of known dark patterns scholars. This set spans 27 unique jurisdictions across the United States and the European Union, including regulatory enforcement agencies such as the EU Data Protection Authorities (DPAs), the US Federal Trade Commission (FTC), and consumer or competition authorities that govern similar design issues. From these cases, we identify a few case studies that represent the dark patterns regulatory momentum. That is, we filter cases that either mention dark patterns by name in-text, or directly cite dark patterns scholarship or studies articulating evidence for related dark pattern practices in-text.
The original deceptive.design/cases data set includes only res judicata (closed, decided) cases. For methodological consistency (e.g. maintaining “final”-status consistency between case texts and formally-published scholarly articles), we find res judicata cases appropriate and do not include undecided cases in our updated search. Our final set of legal case studies includes four EU cases across three member states’ jurisdictions, and three US cases across two federal and one district court jurisdictions. These cases are demonstrative of the dark patterns regulatory momentum, but certainly not exhaustive.
For all cases, we obtain official case texts or related English translations. Since investigatory methodologies are not often described in the same manner as in scholarship, we assess the case text for factual descriptions or concrete details regarding evidence that might be useful to, or commonly used by, scholars as data. We then relate cases to scholarly analysis methods as inferred from the text and the way evidence (i.e. data) is described.
Mapping investigatory methods across disciplines
Empirical methods for dark patterns research
Methods from academic literature
We characterise four key clusters of methods used to identify or describe dark patterns in the academic literature. Table 2 presents these methods alongside the evidence types each method group produces and non-exhaustive strengths and limitations for contextualisation across method types; this table provides a reference for mappings made throughout the remainder of the work.
Qualitative content analysis (Mayring, 2000; Schreier, 2012; Hsieh & Shannon, 2005) identifies the presence of dark patterns, including their type and function within the user journey, e.g. Gray et al. (2018) and Di Geronimo et al. (2020). This method focuses on manual detection of dark patterns through expert review, based on a shared list of dark pattern types, characteristics, and/or exemplars. Experimental design (Scott MacKenzie, 2013; Cairns, 2014; Lazar, 2017) identifies generalisable causal mechanisms relating to dark patterns that have the potential to impact user behaviour, e.g. Luguri & Strahilevitz (2021) and Bhoot (2020). This method focuses on the isolation and evaluation of specific design choices, which are then tested using an experimental and a control group to identify the degree to which one or more design choices impact user behaviour. Observation, Interview, Survey, and Case study (Lazar, 2017; Olson & Kellogg, 2014) methods identify or problematise user behaviour or awareness of dark patterns, and focus on describing how users perceive, act upon, or otherwise use their knowledge of digital systems to address or circumvent dark patterns, e.g. Gray et al. (2021c) and Bielova et al. (2024a). Finally, Web measurement (Mislove et al., 2007) identifies UI or interaction instances of dark patterns based on how these patterns manifest in code or how they influence user behaviour, e.g. Mathur et al. (2019). This method focuses on capturing web data as rendered through a browser, including code elements, their display, and how these elements are impacted by user interactions.
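Expert coding of the kind described above is commonly validated through inter-rater agreement when two or more reviewers independently label the same interfaces. As an illustration only, the following sketch computes Cohen’s kappa over hypothetical dark pattern codes assigned by two reviewers; the labels and data are invented for this example, not drawn from any cited study.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two coders who labelled
    the same items (here: one dark pattern code per screenshot)."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    # Expected agreement if both coders labelled at random,
    # following their own observed label frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two reviewers to ten screenshots
coder1 = ["nagging", "none", "sneaking", "none", "nagging",
          "none", "sneaking", "nagging", "none", "none"]
coder2 = ["nagging", "none", "sneaking", "nagging", "nagging",
          "none", "none", "nagging", "none", "none"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.67
```

In practice, such a statistic would be reported alongside the shared vocabulary and reviewer expertise that make up the audit trail for this method.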
| Method type | Purpose | (Non-exhaustive) evidence types | Strengths | Limitations |
|---|---|---|---|---|
| Qualitative content analysis (36) | Expert identification, evaluation, detection, and classification | Live screenshots and video recordings with textual description; images/videos labelled by experts | Human expertise; flexibility (e.g. via inductive approaches); ability to address patterns in both static and dynamic form; in-situ identification | Requires expert reviewer knowledge of dark patterns; human resource-intensive; difficult to scale |
| Experimental design (14) | Identifying causal and correlational inferences that impact user behaviour by comparing user behaviour of control and experimental groups | Screenshots, video recordings, HTML, organisational policies; designs compared across two groups with statistical analysis reporting the differences | Statistical power; fine-grained control of conditions | Difficult to establish ecological validity in some cases; hard to identify how multiple dark patterns relate to each other interactively over time; may require substantial funding for large sample-size human subject experiments |
| Observation, Interview, Survey, and Case study (30) | Identify user perceptions and awareness | Qualitative and quantitative user data; user interactions; statistics on the analysed data | Human expertise; potential for ecological/in-situ validity; able to identify conditions or ecological factors that may impact experiences of dark patterns | Focus on insight-gathering rather than generalisability; difficult to compare across cases or experiences |
| Web measurement (5) | Large-scale automated detection | HTML; live screenshots; statistics on each detected dark pattern in a given data set | Objective review of code that drives the user experience; ability to trace data trails that are visible and invisible to the end user | Difficult to definitively trace or map certain types of dark patterns; high variation of code makes it difficult to detect all patterns at scale; code artifacts or tools require ongoing maintenance |
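To make the web-measurement category concrete, automated detection often reduces to inspecting rendered page code for suspect elements. The sketch below, using only Python’s standard library, flags pre-ticked checkboxes (a common “preselection” pattern in consent and sign-up interfaces); the HTML snippet is a hypothetical example, not an excerpt from any audited site.

```python
from html.parser import HTMLParser

class PreselectedCheckboxFinder(HTMLParser):
    """Flags checkbox inputs that arrive pre-ticked, a code-level
    signal of the 'preselection' pattern in consent interfaces."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)  # e.g. [('type', 'checkbox'), ('checked', None)]
        if tag == "input" and attrs.get("type") == "checkbox" and "checked" in attrs:
            self.flagged.append(attrs.get("name", "<unnamed>"))

# Hypothetical consent form markup
page = """
<form>
  <input type="checkbox" name="terms">
  <input type="checkbox" name="marketing" checked>
  <input type="checkbox" name="analytics" checked>
</form>
"""

finder = PreselectedCheckboxFinder()
finder.feed(page)
print(finder.flagged)  # ['marketing', 'analytics']
```

Crawler-based measurement studies operate on pages rendered at scale in real browsers; this static sketch only illustrates the kind of code-level signal such tooling looks for.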
Methods from UX research
Next we consider our UXR sources, which use categorisation schemes to structure methods into smaller groups. The two texts generally agree on which methods are typical in early stages (prior to product creation or substantive modification), during active design/development (to direct or organise product decisions), and during evaluation or validation (to ensure the product functions as expected).
Early-stage methods are generative tools oriented towards providing insights on potential design directions, including landscape analysis, contextual inquiry, diary studies, stakeholder workshops, and participatory design. These methods typically rely on a small number of participants, but generate deep qualitative insights that can enable the UX researcher to identify new product directions or gaps. Active design/development-stage methods are formative tools that enable product teams to iterate upon a product in active development, and may rely on a range of qualitative and quantitative methods. For instance, quantitative methods may include taxonomy review, eye tracking, A/B testing, card sorting, and unmoderated product testing. Qualitative methods may include moderated or unmoderated product testing, contextual inquiry, usability testing, and heuristic analysis. Evaluation- or validation-stage methods (Nunnally & Farkas, 2016; Rohrer, 2022) are summative tools that enable a product team to continue to learn about a product based on real-world use through both qualitative and quantitative methods. Quantitative methods may draw from a range of system analytics, such as user flows, demographic characteristics, and customer feedback. Qualitative methods may include expert inspection methods such as heuristic evaluation or cognitive walkthroughs, alongside moderated or unmoderated product testing or usability testing to identify emergent bugs in existing or new user communities.
Synthesis of scholarly and practical empirical methods
We next create a synthesis specific to the identification of dark patterns, describing relevant validation procedures or audit trails that make discovered evidence meaningful and potentially probative. We summarise these method categories below:
1. Expert evaluation (can be completed by one or more trained professional reviewers)
• Qualitative content analysis can identify the presence of dark patterns, and can be validated through an audit trail that includes: a shared vocabulary of dark pattern types and/or characteristics; acknowledgement of how new dark patterns relate to known exemplars; and an acknowledgement of the expertise of the individual or individuals conducting the review. This method is commonly used by academic researchers.
• Heuristic analysis can identify the presence of dark patterns in a generative stance (looking for design opportunities) or evaluative stance (identifying “bugs” or potential usability flaws), and can be validated through an audit trail that includes: an acknowledgment of the expertise of those conducting the review; use of an existing and validated set of heuristics; and identification of how dark patterns violate an existing heuristic standard. This method is commonly used by UX researchers working in industry and less commonly by academic researchers.
2. User studies (must allow end users to engage with a final or potential interactive system)
• Evocative user studies can identify open-ended design opportunities (e.g., participatory design, field study, focus groups, diary study) where dark patterns could be included or removed. Because the goal of this method is generative and oriented towards creativity, no audit trail characteristics are relevant; however, prompts used to trigger user responses may frame responses in a non-deterministic manner, identifying anticipatory harms or potential motivations. This method is commonly used by UX researchers and less frequently by academic researchers.
• Validation oriented user studies can identify user comprehension or sensemaking in relation to key aspects of a system in moderated (e.g., contextual inquiry, concept testing, usability testing) or unmoderated form (e.g., survey, usability testing, field studies). This method can be validated through an audit trail that includes: a set of shared user tasks or goals used to prompt user interaction; use of an interactive system that includes design choice(s) being evaluated; and collection of multiple sources of data, commonly including video, audio, interactive decisions, and demographic characteristics. Outcomes of this method should demonstrate trends or insights relating to users’ understanding of the system or impacts on user agency to be probative. This method is commonly used by UX researchers and less frequently by academic researchers.
• Experimental user studies can identify a causal link between specific design choices that incorporate dark patterns in a particular use context and changes in user behaviour. Many methods can support experimental studies, including collection of behavioural metrics (e.g., eye tracking) or responses to goals or scenarios in an unmoderated system (e.g., survey, remote or in-person observation). This method can be validated through an audit trail that includes: an ecologically-meaningful framing of one or more design choices; a selection of experimental and control group conditions that addresses potential study confounds; and a randomly distributed sample of users that reflects the population being characterised. Outcomes of this method should have statistical significance to be deemed probative. This method is commonly used by academic researchers.
3. Log interaction analysis (must collect trace data through a user’s interaction with a system or other code-auditable means)
• User flow evaluation can identify common pathways through a system through analysis of log data that is captured by a live system (e.g., pages accessed, timestamps, unique identifiers). This approach can be used to optimise flows or identify areas for design improvement. This method can be validated through an audit trail that includes: a set of predefined stakeholder goals; an analysis of performance metrics (e.g., time on task, number of clicks); and an analysis of the sample(s) of users reflected in performance outcomes to address any confounds. Outcomes of this method may be methodologically triangulated with findings from other sources (e.g., user studies) to identify causal factors that might lead to final design decisions. This method is commonly used by UX researchers.
• A/B testing can identify the relative performance of two or more design choices in comparison to pre-defined goals as experienced on a live system (e.g., user retention, selection of choices, conversion to signups). This approach can be used in a fine-grained manner to test the performance of small design choices (e.g., colours, positioning of buttons) or entirely different experiences of a platform. This method can be validated through an audit trail with: a set of predefined stakeholder goals; a comparison of performance metrics; and an analysis of the sample(s) of users reflected in performance outcomes to address any confounds. Outcomes of this method may have statistical significance, or metrics may be methodologically triangulated with findings from other sources (e.g., user studies) to identify causal factors that might lead to final design decisions. This method is commonly used by UX researchers.
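The statistical-significance criterion mentioned for experimental user studies and A/B testing can be illustrated with a standard two-proportion z-test. The sketch below, in Python’s standard library, compares hypothetical conversion counts for two interface variants; the figures are invented for illustration, not taken from any study or case discussed here.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion
    rates, using the pooled normal approximation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: variant A (neutral design) vs. variant B
# (e.g. a pre-ticked box), 1,000 users each
z, p = two_proportion_z(120, 1000, 165, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = -2.88, p well below 0.05
```

A probative analysis would additionally report effect sizes and address the sampling confounds noted in the audit trails above; dedicated statistics packages would normally replace this hand-rolled approximation.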
This synthesis shows that a range of methods and evidence types already exists for identifying and characterising dark patterns.
Methods inferred in notable dark patterns cases
In this section we summarise our mapping of the studied prominent legal cases to the methods used for identifying dark patterns in academic literature and industry practice. This analysis covers two jurisdictions: the EU, which follows a civil law system and is represented in our set of cases by administrative decisions issued by data protection authorities, and the US, a common law system, represented by decisions issued by the FTC. It is important to note that the EU decisions originate from administrative bodies responsible for enforcing the GDPR (Regulation 2016/679). These bodies are not courts and operate under different procedural rules than the judiciary; in some respects, they possess greater flexibility, as they are not bound by the stringent evidentiary standards that govern judicial proceedings. We consider administrative decisions since, so far, no judicial decision has directly mentioned dark patterns.
Below, we analyse each decision with respect to the following aspects: the subject matter of the case with reference to dark patterns, the extent to which scholarly publications and/or other types of evidence were referenced in its decision, and the methods inferred to have been used in identifying dark patterns. Table 3 depicts our mapping.
| Case law | Category of methods used | Specific method used |
|---|---|---|
| EU case law | | |
| Case 9870014 of Garante (Italian DPA) against Ediscom, S.p.A. (2023) | Expert evaluation | Qualitative content analysis |
| PS/00080/2023 of the AEPD (Spanish DPA) against Chatwith.IO (2023) | Expert evaluation | Qualitative content analysis |
| Deliberation No. SAN-2021-023 (2021a) of CNIL (French DPA) against Google | Expert evaluation | Qualitative content analysis |
| | User studies | (Cited) Web measurement |
| Deliberation No. SAN-2021-024 (2021b) of CNIL against Facebook | Expert evaluation | Qualitative content analysis |
| | User studies | (Cited) Experimental user studies |
| US case law | | |
| FTC v. Vonage (2022) | Expert evaluation | Qualitative content analysis |
| | Log interaction analysis | User flow evaluation |
| FTC v. Epic Games (2022) | Expert evaluation | Qualitative content analysis |
| | Log interaction analysis | User flow evaluation |
| D.C. v. Google LLC (2022) | Expert evaluation | Qualitative content analysis |
EU cases
Case 9870014 of Garante per la protezione dei dati personali (Italian DPA) against Ediscom SpA
Subject matter: In the EU, the first regulatory decision to explicitly refer to dark patterns was issued by the Italian DPA against Ediscom SpA (Case 9870014, 2023) regarding Ediscom’s promotional campaigns via its websites. The decision discusses several practices related to dark patterns (e.g., lack of relevant information on one of the websites) but only consent settings are explicitly named as such.
Evidence used: The decision resorts to the formal qualification of dark patterns in the EDPB guidelines on dark patterns for "greater clarity" (European Data Protection Board [EDPB], 2022). The case documents contained visual evidence (such as screenshots) of the violating practices, and additionally mentioned company documentation as well as whistleblower and user reports. Such visualisations can help demonstrate dark patterns across legal literature and provide concrete evidentiary examples for future decision-making involving the same or similar types of interfaces or violations. Scholarly works were not referenced.
Methods: From the case, we infer that Qualitative content analysis was used to assess interfaces for dark pattern practices, reconciled against internal and user-reported sources. A form of User flow evaluation is also inferred from the structure in which the case describes the steps of a user’s consent flow.
PS/00080/2023 of the AEPD (Spanish DPA) against Chatwith.IO
Subject matter: The Spanish case PS/00080/2023 (2023), brought against website operator Chatwith.IO, explicitly addresses the use of dark patterns in relation to users' right to object to the processing of their personal data by third parties.
Evidence used: This case directly references concrete dark patterns from the European Data Protection Board (EDPB) guidelines (particularly “overloading” and “skipping”) (EDPB, 2022). Other evidence includes the company’s legal documentation (e.g., privacy or cookie policies) to assess data collection practices. The case directly transcribes a detailed analysis of the consent banner UI and the subsequent options presented to users—the control panel, purposes for data collection, and the full list of third parties. No academic sources were used.
Methods: We infer that Qualitative content analysis was used to assess interfaces, with a form of User flow evaluation.
Deliberation No. SAN-2021-023 of CNIL against Google
Subject matter: The French case SAN-2021-023 (2021a) brought against Google LLC and Google Ireland Limited investigated practices making it harder to refuse cookies than to accept them.
Evidence used: This case first includes prior rulings and detailed design instructions (for platforms to follow in order to comply with prior decisions) from previous cases and guidelines. The regulator also conducted on-site audits for fact-checking. The decision describes the nature of providing consent to Google services at the time, discussing the relative simplicity of the two consent options, with no screenshots provided. The regulator cites an industry study called “Privacy Barometer” (Commanders Act, 2021) for quantitative evidence of the impact of consent-based design choices and their effect on user autonomy (CNIL, 2021a, para. 134, 135). Additionally, it cites a 2021 Kantar study measuring French users’ refusals of cookie storage. These studies were used to establish tangible causal effects of explicit refusal options.
Methods: This case’s efforts to capture the number of actions required for cookie acceptance in Google services map to Qualitative content analysis, but the decision uses results from cited Web measurement industry studies. We map the industry studies to measurement methods as inferred from the results and descriptions of the studies; however, we see an issue with under-reporting of methods in such reports.
Deliberation No. SAN-2021-024 of CNIL against Facebook
Subject matter: The French case SAN-2021-024 (2021b) was brought against Facebook Ireland Limited for failing to make refusing consent as simple as the mechanism provided for accepting it.
Evidence used: The scholarly study by Nouwens et al. (2020) on dark patterns was mentioned (Déliberation SAN-2021-024, 2021b, para. 98) to support the regulator's interpretation of the GDPR requirement for a “reject” button next to the “accept” button on the first layer of the consent banner. The regulator provided its own analysis of Facebook’s consent banner interface, citing Nouwens et al. (2020) and noting that moving the “reject” button to the second layer increased the cookie acceptance rate by 23.1%. The regulator describes its own online check stepping through Facebook’s consent mechanism, describing the user interaction journey in detail (Déliberation SAN-2021-024, 2021b, para. 108-109). Screenshots are not provided with these descriptions.
Methods: This case primarily relied on Qualitative content analysis, but uses results from cited Experimental user studies in its decision.
US cases
In US FTC cases, interfaces and designs have yet to be tied to any specific dark patterns by taxonomical labels. Rather, complaint documents discuss dark pattern mechanics broadly (e.g., “adding friction for friction’s sake” when quoting a UX designer), or detail what user behaviours the dark patterns interfere with (e.g., purchasing items with minimal stopgaps) (Epic Games, In the matter of, 2022).
FTC v. Vonage
Subject matter: Vonage (Federal Trade Commission, 2022c; FTC v. Vonage, 2022) was charged with one violation of Section 5 of the FTC Act (for unfairly charging consumers without consent) and three violations of ROSCA (for failing to provide required disclosures, failing to obtain express informed consent before charges, and failing to provide simple mechanisms to stop recurring charges) (15 U.S.C. §45(a); 15 U.S.C. §§ 8401-8405). The term “dark patterns” is explicitly mentioned in the complaint document.
Evidence used: Cited evidence included consumer complaints sent directly to Vonage in text format or over the phone, as well as staff records and external consumer website complaints—thus both testimony and physical or visual evidence are included. This includes screenshots of Vonage interfaces and excerpts from internal company training materials, customer call records (presumably as transcripts), internal emails, and more. Of interest here, and unique to the enforcement context (versus academic contexts), are internal correspondences and call records.
Methods: We map this case to Qualitative content analysis when visual evidence was analysed, and broadly to User flow evaluation when other materials like customer call records were used to analyse users’ goals and journey through the system.
Epic Games, In the matter of
Subject matter: Epic Games was charged with counts of Unfair Billing and Unfair Account Denial, each of which was aided by several dark pattern designs in multiple iterations over a few years (Epic Games, In the matter of, 2022; Federal Trade Commission, 2022b). The initial complaint explicitly mentions Epic Games’ use of dark patterns.
Evidence used: The complaint discusses the problematic designs in great detail, providing screenshots as referential exhibits, with additional facts detailing steps in a user’s interaction journey for in-game purchases. Other described evidence includes consumer complaints as well as inquiries to Epic Games staff or internal communications (involving responses from UX designers and product managers). The use of asymmetric designs was verified by both longitudinal evidence reported by Fortnite users and staff acknowledgement of how business goals were implicated by the specific designs under scrutiny.
Methods: We also map this case to Qualitative content analysis, and potentially to User flow evaluation where it was used to identify asymmetries between customer and business goals.
D.C. v. Google LLC
Subject matter: A complaint filed by the D.C. Attorney General against Google directly mentions dark patterns within the context of deceptive practices for manipulating user settings choices regarding location data (D.C. v. Google LLC., 2022).
Evidence used: While the publicly available complaint filing redacted some details prior to settlement, the evidence gathered and presented to support the several allegations regarding location tracking includes detailed descriptions of the implicated interfaces and how they were utilised. Here, evidence spanned several iterations of Google’s interfaces and various changes made to asymmetrically promote consent over alternative options. From what can be gathered from unredacted content, the complaint drew from consumer-accessible interfaces and documents like privacy policies or other disclosures, with some mention of internal correspondence.
Methods: We map this case to Qualitative content analysis for assessing visual evidence.
Challenges, opportunities, and recommendations
Differences in evidentiary approaches between empirical research and enforcement investigations
Case-specific scope and resultant contextual depth
Cases inherently have narrow scope: typically fact-checking one organisation’s practices (an interface or user journey element, and/or a specific subset of its digital service or product). Claims and evidence are primarily supported by multiple sources, like copies, policies, records, and screenshots of the user interfaces provided by, or requested from, the companies. Cases may also be supported by analysis of longitudinal samples of these evidence sources over time, including consumer complaints on publicly available sites as noted in Vonage (2022) and Epic Games (2022). Any evidence gained and insights derived from cases are bound to the case-specific granularity and depth that triangulates the complainant, the defendant, and the enforcer. This approach differs from academic work, which tends to analyse data for breadth across organisations, or across the practices of a single organisation with many users, particularly in a replicable manner and from a more impartial stance, to draw broader conclusions. Thus, the scope of academic research is generally not confined to just one or a few aspects of a single organisation. Despite these tensions between empirical research and enforcement investigations, leveraging research for evidence-based enforcement allows us to trace how certain identification methods used in enforcement implicitly rely on similar identification logic as those developed in scholarly and industry contexts, even if not formally acknowledged. In this way, the comparison helps surface methodological overlaps, misalignments, and opportunities for cross-fertilisation between research and enforcement practice. Academic research could aggregate evidence from multiple cases for different insights, leveraging case-specific depth as well as scholarly breadth. This presents an opportunity for regulators to release case evidence as data for scholarly use where or when possible.
However, the inherent constraints of legal procedure often mandate the destruction of evidentiary data and related copies; this remains an open challenge in navigating what evidence might be acceptably retained and aggregated for scholarly use.
Access to private or internal evidence
Enforcers can access internal evidence that scholars typically cannot, like customer calls, chat records, letters exchanged between parties, training materials for customer representatives, and more. Regulators may request these as part of formal audits per case or through other investigatory actions, and organisations must comply with such requests when the necessary conditions are met. Other private evidence may also be given to authorities (though in such situations, companies’ voluntary provision is likely intended to demonstrate that dark patterns were not implicated). This evidence may include companies’ internal practice documents or data. Such evidence is neither available to consumers nor to academic researchers under normal circumstances, echoing similar challenges in studying black-boxed systems or algorithms (Gryz & Rojszczak, 2021). Although business incentives and legal requirements may keep such materials protected from external view (except when formally audited by an agency), the glaring difference in access to this information highlights a fundamental knowledge and methodological gap between academics and enforcers. Internal evidence, especially staff correspondence or materials targeted at employees, is uniquely capable of demonstrating unjust benefit and, to a further extent, intent to deceive or to conduct unfair practices. Intent has long been an interesting caveat foiling definitions of ‘darkness’ in academic research, with scholars disclaiming that the presence of dark patterns should not be interpreted as inherently demonstrative of malicious intent on behalf of the designer (Gray et al., 2018). However, as noted in cases like Epic Games (2022) and Vonage (2022), internal documentation sometimes reveals calculated strategy, explicit instructions from managers to designers, or business goals that are supported by the use of dark patterns.
Challenges for aligning investigatory methods
The two CNIL cases citing academic studies present some promise for the future inclusion of scholarly evidence within case law, but different jurisdictions may vary in their appreciation of such evidence. The “if necessary and possible” restrictions (or the use of only the “most highly qualified” scholarship) as “subsidiary” limitations on scholarship-as-evidence described earlier present obstacles for more closely aligned investigatory methods between the two fields. Other differences between academia and enforcement present further obstacles to the easy or direct transfer of knowledge and evidence between domains in both directions.
As inferred from case documents, case law and enforcement tend to rely on Expert evaluation techniques performed by enforcers who identify violations (and sometimes mention dark patterns) via Qualitative content analysis. However, these Expert evaluation techniques and their audit trails are not disclosed (or even named as such), making it difficult to assess methodological rigour using criteria common in academic research, which may be helpful for scholars seeking to design methods more aligned with enforcement practice. While findings from research studies are arguably likely to have probative value for regulatory decisions, these studies are scattered across various disciplines, such as computer science, social science, design, and law, and may not be immediately known to regulators or enforcers at the moment of decision-making. Quality criteria and leading scientific venues differ across domains, and those responsible for enforcement may not be keenly aware of these reputational variations. Peer review serves this quality-assurance function but is inherently tied to field-specific expertise. Thus, the effort to validate scholarship may be inefficient for regulators deciding upon a case, given the requisite domain expertise necessary for evaluation. Some methods used in scholarship or UXR come from fields outside of law and regulatory practice (e.g., heuristic analysis or web measurement). Translating such work into legally useful terms or necessary procedures may be difficult, and might not meet enforcement criteria for determining dark pattern practices. Similarly, to prove a case against a potentially non-compliant platform, regulators are bound to evaluate the evidence produced therein as a matter of procedure, even if scholarship may have identified the exact behaviour by other means.
Recommendations for improving the evidentiary impact of scholarship
Both fields naturally have differing primary objectives: scholars seek to create and share research evidence meeting scientific standards of rigour and to publish their work for broader societal or scholarly impact; enforcers analyse evidence to enforce the law and sanction rogue practices. These objectives warrant different approaches, but we argue the overlap might be better utilised, and gaps could be partially bridged through collaboration or improved information sharing.
Align research methods with regulatory contexts
Some organisations already incorporate scholarship, like the Spanish DPA extending the Gray et al. ontological framework (2024) and utilising its hierarchy (AEPD, 2024, p. 12). Consumer protection and advocacy organisations, like the Swedish consumer organisation, also cite ontological pattern names (Sveriges Konsumenter, 2024, p. 16). These efforts help align terminology and concepts across academia and law, but opportunities remain in better leveraging methods and tools across disciplines.
One way to narrow the gap would be to adapt scholarly methods for the specific context and practices used by legal practitioners within a regulatory authority. This would entail dark patterns scholarship (e.g., on privacy or e-commerce) designed to inform specific regulators (data or consumer authorities, respectively), or legal domains (data protection law, consumer protection or competition law), with methods accordingly supporting the needs of that type of regulatory body. As noted previously, some work already seeks to inform regulators and enforcers, like work that articulates and categorises normative perspectives of dark pattern darkness (Mathur et al., 2021), or the expanse of dark patterns scholarship that specifically implicates GDPR compliance (Matte et al., 2020; Nouwens et al., 2020; Soe et al., 2020; Utz et al., 2019), or autonomy violations for DSA compliance (Santos et al., 2024). We encourage the continuation of these efforts, noting that such work increasingly targets specific regulations.
Perhaps the next step along this trend would involve deeper acquaintance with the regulator's evidentiary needs and specific investigatory methods, and designing experimental protocols accordingly. One example concerns the nature of case facts. In our cited cases, case facts often described interface navigation steps in succession, sometimes with screenshots of problematic designs provided in filing documents. In Epic Games, the FTC even details the button mechanics on a gaming controller as pertaining to Fortnite’s purchase interfaces (2022). As such, more descriptive, contextual evidence that captures the dynamic interaction flow of a business practice may better suit the often narrative structure of case facts. Heuristic analysis, which is more commonly used in industry UX research than in scholarly research, may be an avenue for academics to explore further in support of regulatory efforts. In fact, more recent scholarship has begun to work towards developing such methods, analysing documents from an FTC complaint and subsequently building a “temporal analysis for dark patterns” methodology intended for scholarly and legal uses (Gray et al., 2025). Future work could continue this direct engagement and inspiration in both directions: from law to scholarship and vice versa.
Provide further clarification regarding cases’ investigatory methods
Enforcers’ documentation and analysis of case facts may not articulate sufficient or replicable context for scholars to easily interpret investigatory methods, particularly the manner in which information was collected (e.g., from consumer complaint documents, interaction flow walkthroughs, user interfaces, etc.). Decisions and complaints also follow different reporting styles across jurisdictions, which can make facts, collected evidence (e.g., screenshots), and the resultant legal reasoning hard for other disciplines to systematically distinguish. Variance in case documentation is an inherent part of jurisdictional differences, and may be impossible to align fully across jurisdictions or for scholarly utility (for example, if methodological/investigatory confidentiality is required by each regulator’s internal policy). However, interdisciplinary researchers seeking to better support enforcement proceedings may benefit from even partial or implicit disclosure of enforcement’s investigatory methodologies. Increasing methodological disclosure within regulatory decisions, to some extent, has potential advantages: scholars may replicate or scale investigations in ways that courts or regulators may not have resources for (including, perhaps, evaluating industry trends and consensus to identify market outliers); for regulators, published or otherwise validated work using more closely aligned methods may improve the utility of that evidence for enforcement investigations.
Strengthen and leverage semi-formal avenues for collaboration
Finally, given the inherent constraints of each discipline, a potentially underutilised middle ground for collaboration might be provided by semi-formal investigatory actions (though the extent to which such semi-formal actions may be used for formal enforcement versus general market analysis remains a broader debate outside the scope of this article). Such mechanisms already have some precedent: civil society organisations (NOYB, 2021) have conducted website audits themselves that triggered enforcement procedures and informed official guidelines (EDPB, 2023). In the EU or in international collaboration, these actions might be performed through “sweeps” and “audits,” while in the US they might be conducted through agency studies. Concerns identified during sweeps could not only result in follow-up work, such as outreach to organisations, but may also lead to the initiation of enforcement actions to address identified concerns. Table 4 portrays the methods used in known dark pattern sweeps and maps them to the methods used for identification of dark patterns. These investigations, insofar as their approaches are less constrained to one platform at a time, could utilise a variety of scholarly methods and are a promising avenue for regulators to use scientific methods, and thus for methodology transfer between academia and law, or even for direct collaboration between researchers and enforcement agencies. Formalising avenues for scholars to join sweeps, or otherwise creating processes for direct academic involvement, could benefit regulatory action. That is, earlier inclusion could leverage academics’ methodological and technical expertise towards tailored, synchronised methodological development or investigations, e.g., for novel detection techniques or tool-building (automated pattern detection, usability studies, A/B testing), or user studies measuring the impact and effect of phenomena like dark patterns.
Conversely, regulators may want to inform academic taxonomies or even guide study design to yield evidence or definitions in a format that may be immediately useful to ongoing investigations.
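As a concrete illustration of the kind of lightweight tool-building mentioned above, the Python sketch below is our own hypothetical example (the function name, keyword lists, and heuristic are assumptions, not an established or validated detector): it flags a consent banner whose acceptance option is rendered as a button while refusal is only available as a link, a common asymmetry discussed in the cases above.

```python
import re

# Hypothetical keyword heuristics for accept/reject labels (illustrative only).
ACCEPT_WORDS = re.compile(r"accept|agree|allow", re.I)
REJECT_WORDS = re.compile(r"reject|refuse|decline|deny", re.I)

def flag_asymmetric_banner(buttons: list[str], links: list[str]) -> bool:
    """Return True if acceptance is offered as a button while refusal is
    only offered as a plain link.

    `buttons` and `links` are visible label strings, e.g. as extracted by
    a crawler from a page's DOM; this is an illustrative sketch, not a
    validated detection method.
    """
    accept_button = any(ACCEPT_WORDS.search(b) for b in buttons)
    reject_button = any(REJECT_WORDS.search(b) for b in buttons)
    reject_link = any(REJECT_WORDS.search(l) for l in links)
    # Asymmetry: accepting is a one-click button, refusing is demoted to a link.
    return accept_button and not reject_button and reject_link

print(flag_asymmetric_banner(["Accept all"], ["Manage options", "Refuse"]))
```

A sweep-style investigation would of course need far more robust extraction and human review; the point is only that such heuristics can triage large samples of sites for expert evaluation.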
| Investigating party | Methods described in text | Mapping to scholarly methods |
|---|---|---|
| European Commission (EU Commission, 2022) | Findings from a behavioural policy study; website/app audit trail based on three dark pattern types | Experimental user studies, Qualitative content analysis, User studies |
| Global Privacy Enforcement Network (GPEN, 2024) | Findings from a policy study; questionnaire; website/app audit trail based on five indicators | User studies |
| International Consumer Protections Enforcement Network (ICPEN, 2024) | Findings from a policy study; questionnaire; website/app audit trail based on OECD indicators (OECD, 2022) | Qualitative content analysis |
| DPA of Lower Saxony (Der Landesbeauftragte für den Datenschutz Niedersachsen, 2023) | Consent banner UI analysis; technical analysis of trackers used; questionnaire; subscription models analysis | Qualitative content analysis |
| Article 29 Data Protection Working Party (A29 DPWP, 2015) | Statistical review of cookies; analysis of technical properties; manual review of cookie information and consent mechanisms | Qualitative content analysis |
| Australian Consumer Policy Research Centre (Consumer Policy Research Centre, 2024) | Website and app audit trail; analysis of privacy policies; word count | Experimental user studies, Qualitative content analysis |
Conclusion
Our interdisciplinary team of researchers spanning human-computer interaction, design, computer science, and law presents dark patterns as a case study by which we reason through methodologies across disciplines. We begin by analysing groups of methods from scholarly literature and discuss their purpose, resultant evidence types, strengths, and limitations, then synthesise these methods against broader UX research methods and frame scholarly methods along different stages in product development or release. We then consider key, decided dark patterns enforcement cases and provide inferred mappings to scholarly methods from the cases provided. We also discuss other evidentiary sources available to enforcers.
Finally, we reflect upon our analyses and mappings considering the ongoing challenges of academic-enforcement collaboration for dark patterns. We propose suggestions for both scholars and enforcers to tighten feedback loops, and highlight informal investigation methods as an opportunity to strengthen collaboration.
References
Aagaard, J., Knudsen, M. E. C., Bækgaard, P., & Doherty, K. (2022). A game of dark patterns: Designing healthy, highly-engaging mobile games. CHI Conference on Human Factors in Computing Systems Extended Abstracts, 1–8. https://doi.org/10.1145/3491101.3519837
Acar, G., Bielova, N., Shafiq, Z., & Borgesius, F. Z. (2025). Online privacy: Transparency, advertising, and dark patterns. Dagstuhl Seminar 25042. Dagstuhl. https://www.dagstuhl.de/en/seminars/seminar-calendar/seminar-details/25042
Agencia Española de Protección de Datos. (2024). Addictive patterns in the processing of personal data: Implications for data protection (Guides of the AEPD). https://www.aepd.es/guides/addictive-patterns-in-processing-of-personal-data.pdf
Albright, T. D. (2023). A scientist’s take on scientific evidence in the courtroom. Proceedings of the National Academy of Sciences, 120(41). https://doi.org/10.1073/pnas.2301839120
Article 29 Data Protection Working Party. (2015). Cookie sweep combined analysis report. https://ec.europa.eu/justice/article-29/documentation/opinionrecommendation/files/2015/wp229_en.pdf
Bermejo Fernandez, C., Chatzopoulos, D., Papadopoulos, D., & Hui, P. (2021). This website uses nudging: MTurk workers’ behaviour on cookie consent notices. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–22. https://doi.org/10.1145/3476087
Bhoot, A. M., Shinde, M. A., & Mishra, W. P. (2020). Towards the identification of dark patterns: An analysis based on end-user reactions. IndiaHCI ’20: Proceedings of the 11th Indian Conference on Human-Computer Interaction, 24–33. https://doi.org/10.1145/3429290.3429293
Bielova, N. (2022). Survey of academic studies measuring the effect of dark patterns on acceptance consent rate of users in consent banners. Laboratoire d’Innovation Numérique de la CNIL. https://www.cnil.fr/sites/default/files/atoms/files/full_2022-12-02_v2.pdf
Bielova, N., Litvine, L., Nguyen, A., Chammat, M., Toubiana, V., & Hary, E. (2024). The effect of design patterns on (present and future) cookie consent decisions. USENIX Security Symposium. https://www.usenix.org/system/files/usenixsecurity24-bielova.pdf
Bielova, N., Santos, C., & Gray, C. M. (2024). Two worlds apart! Closing the gap between regulating EU consent and user studies. Harvard Journal of Law & Technology, 37.
Bongard-Blanchy, K., Rossi, A., Rivas, S., Doublet, S., Koenig, V., & Lenzini, G. (2021). ”I am definitely manipulated, even when I am aware of it. It’s ridiculous!”—Dark patterns from the end-user perspective. Designing Interactive Systems Conference 2021, 763–776. https://doi.org/10.1145/3461778.3462086
Bouhoula, A., Kubicek, K., Zac, A., Cotrini, C., & Basin, D. (2024). Automated, large-scale analysis of cookie notice compliance. USENIX Security Symposium. https://www.usenix.org/system/files/sec23winter-prepub-107-bouhoula.pdf
Brignull, H. (2010). Deceptive patterns (previously dark patterns at darkpatterns.org). Deceptive Patterns. https://www.deceptive.design
Brignull, H., Leiser, M., Santos, C., & Doshi, K. (2023). Legal cases. Deceptive Patterns. https://www.deceptive.design/cases
Cairns, P. (2014). Experimental methods in human-computer interaction. In Interaction Design Foundation - IxDF (Ed.), The encyclopedia of human-computer interaction (2nd edn). https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/experimental-methods-in-human-computer-interaction
California Consumer Privacy Act (CCPA), Cal. Civ. Code. §1798.100 et Seq. (2018). https://leginfo.legislature.ca.gov/faces/codes_displayText.xhtml?division=3.&part=4.&lawCode=CIV&title=1.81.5
California Privacy Rights Act of 2020 (CPRA), Cal. Civ. Code. §1798.106, 121, 146 & 148, and 1798.199.45–90 & 1798.199.100 (2020).
Car, P., & Cassetti, F. (2025). Regulating dark patterns in the EU: towards digital fairness. European Parliamentary Research Service: At A Glance: Digital issues in focus. https://www.europarl.europa.eu/RegData/etudes/ATAG/2025/767191/EPRS_ATA(2025)767191_EN.pdf
Case 9870014. (2023). Provvedimento prescrittivo e sanzionatorio nei confronti di ediscom S.p.A. - 23 febbraio 2023 (9870014) [Prescriptive and sanctioning provision against Ediscom S.p.A. – 23 february 2023]. Garante per la protezione dei dati personali [Garante]. https://static-r.giuffre.it/PORTALE/310/provv.%2023%20febbraio%202023,%20n.%2051.pdf
Colorado Privacy Act [CPA], CO. Rev. Stat. §6-1-1301 (2021). https://leg.colorado.gov/sites/default/files/2021a_190_signed.pdf
Commanders Act. (2021). Baromètre privacy édition 2021 [Privacy barometer 2021]. CommandersAct. https://www.commandersact.com/en/barometre-privacy-2021/
Consumer Policy Research Centre. (2024). The cost of managing your privacy. https://cprc.org.au/wp-content/uploads/2024/07/CPRC_Cost-of-PrivacyReport_Final.pdf
Daubert v. Merrell Dow Pharmaceuticals, Inc., No. 509 U.S. 579 (1993).
D.C. v. Google LLC, Case No 2022-CA-000330 B [Complaint] (Superior Court of the District of Columbia 2022). https://oag.dc.gov/sites/default/files/2022-01/DCv.Google%281-24-22%29.pdf
Decision ACM/22/179622. (2024). Decision of the Netherlands Authority for Consumers and Markets regarding the imposition of fines and a binding instruction on Epic Games International S.à.r.l due to violations of the Dutch Unfair Commercial Practices Act. Netherlands Autoriteit Consument & Markt. https://www.acm.nl/system/files/documents/sanctiebesluit-fortnite_en.pdf
Deliberation No. SAN-2021-023. (2021). Deliberation of the restricted committee No. SAN-2021-023 of 31 December 2021 concerning GOOGLE LLC and GOOGLE IRELAND LIMITED [Courtesy translated title]. Commission Nationale de l’Informatique et des Libertés. https://www.cnil.fr/sites/cnil/files/atoms/files/deliberation_of_the_restricted_committee_no._san-2021-023_of_31_december_2021_concerning_google_llc_and_google_ireland_limited.pdf
Deliberation No. SAN-2021-024. (2021). Deliberation of the restricted committee No. SAN-2021-024 of 31 December 2021 concerning FACEBOOK IRELAND LIMITED. Commission Nationale de l’Informatique et des Libertés. https://www.cnil.fr/sites/default/files/atoms/files/deliberation_of_the_restricted_committee_no._san-2021-024_of_31_december_2021_concerning_facebook_ireland_limited.pdf
Der Landesbeauftragte für den Datenschutz Niedersachsen [Data Protection Commissioner of Lower Saxony]. (2023). Prüfung von Medienwebseiten in Niedersachsen abgeschlossen [Examination of media websites in Lower Saxony completed]. LfD Niedersachsen. https://www.lfd.niedersachsen.de/startseite/infothek/presseinformationen/prufungvon-medienwebseiten-in-niedersachsen-abgeschlossen-223637.html
Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI dark patterns and where to find them: A study on mobile applications and user perception. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376600
Directive 2005/29. (2005). Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) [UCPD]. European Parliament and Council. https://eur-lex.europa.eu/eli/dir/2005/29/oj/eng
Directive 2011/83. (2011). Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. European Parliament and Council. https://eur-lex.europa.eu/eli/dir/2011/83/oj/eng
Epic Games. (2022). In the matter of Epic Games, Inc. [Complaint] (FTC Matter No. 192-3203). https://www.ftc.gov/system/files/ftc_gov/pdf/1923203EpicGamesComplaint.pdf
European Commission. (2022). Sweep on dark patterns. https://commission.europa.eu/live-work-travel-eu/consumer-rights-and-complaints/enforcement-consumer-protection/sweeps_en#ref-2022--sweep-on-dark-patterns
European Commission. (2024). Digital fairness fitness check. https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13413Digital-fairness-fitness-check-on-EU-consumer-law_en
European Data Protection Board. (2022). Guidelines 3/2022 on dark patterns in social media platform interfaces: How to recognise and avoid them [Tech. rep.]. https://edpb.europa.eu/our-work-tools/documents/publicconsultations/2022/guidelines-32022-dark-patterns-social-media_en
European Data Protection Board. (2023). Report of the work undertaken by the Cookie Banner Taskforce. https://www.edpb.europa.eu/system/files/2023-01/edpb_20230118_report_cookie_banner_taskforce_en.pdf
European Union. (2021). Guidance on the interpretation and application of Directive 2005/29/EC of the European Parliament and of the Council concerning unfair business-to-consumer commercial practices in the internal market. Official Journal of the European Union, C 526(64). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=oj:JOC_2021_526_R
Federal Trade Commission. (2022a). Bringing dark patterns to light [Staff report]. https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf
Federal Trade Commission. (2022b). Fortnite video game maker Epic Games to pay more than half a billion dollars over FTC allegations of privacy violations and unwanted charges. https://www.ftc.gov/news-events/news/press-releases/2022/12/fortnite-video-game-maker-epic-games-pay-more-half-billion-dollars-over-ftc-allegations
Federal Trade Commission. (2022c). FTC action against Vonage results in 100 million to customers trapped by illegal dark patterns and junk fees when trying to cancel service. https://www.ftc.gov/news-events/news/press-releases/2022/11/ftc-actionagainst-vonage-results-100-million-customers-trapped-illegal-dark-patterns-junkfees-when-trying-cancel-service
Federal Trade Commission v. Amazon.com, No. 2:14-cv-01038-JCC (W.D. Wash. 2014). https://law.justia.com/cases/federal/district-courts/washington/wawdce/2:
Federal Trade Commission v. Commerce Planet, No. 8:09-cv-01324-CJC(RNBx) (C.D. Cal. 2011).
Federal Trade Commission v. Vonage, No. 3:22-cv-06435 [Complaint] (2022).
Frye v. United States, 293 F. 1013 (D.C. Cir. 1923).
Global Privacy Enforcement Network [GPEN]. (2024). GPEN sweep on deceptive design patterns. https://www.privacyenforcement.net/content/2024-gpen-sweep-deceptive-designpatterns
Graßl, P., Schraffenberger, H., Zuiderveen Borgesius, F., & Buijzen, M. (2021). Dark and bright patterns in cookie consent requests. Journal of Digital Social Research, 3(1), 1–38. https://doi.org/10.33621/jdsr.v3i1.54
Gray, C. M. (2022). Expert report of Colin M. Gray, Ph.D. Public redacted version (State of Arizona v. Google LLC) (Tech. Rep. No. CV2020-006219). https://www.azag.gov/sites/default/files/2025-05/Excerpt%20Supplemental%20Report%20of%20Colin%20M.%20Gray%20Ph.D..pdf
Gray, C. M., Chen, J., Chivukula, S. S., & Qu, L. (2021). End user accounts of dark patterns as felt manipulation. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–25. https://doi.org/10.1145/3479516
Gray, C. M., Chivukula, S. S., Melkey, K., & Manocha, R. (2021). Understanding “dark” design roles in computing education. Proceedings of the 17th ACM Conference on International Computing Education Research, 225–238. https://doi.org/10.1145/3446871.3469754
Gray, C. M., Mildner, T., & Gairola, R. (2025). Getting trapped in Amazon’s ‘Iliad Flow’: A foundation for the temporal analysis of dark patterns. Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, 1–10. https://doi.org/10.1145/3706598.3713828
Gray, C. M., Sanchez Chamorro, L., Obi, I., & Duane, J.-N. (2023). Mapping the landscape of dark patterns scholarship: A systematic literature review. Designing Interactive Systems Conference, 188–193. https://doi.org/10.1145/3563703.3596635
Gray, C. M., Santos, C., Bielova, N., Toth, M., & Clifford, D. (2021). Dark patterns and the legal requirements of consent banners: An interaction criticism perspective. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3411764.3445779
Gray, C. M., Santos, C. T., Bielova, N., & Mildner, T. (2024). An ontology of dark patterns knowledge: Foundations, definitions, and a pathway for shared knowledge-building. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–22. https://doi.org/10.1145/3613904.3642436
Gryz, J., & Rojszczak, M. (2021). Black box algorithms and the rights of individuals: No easy solution to the “explainability” problem. Internet Policy Review, 10(2). https://doi.org/10.14763/2021.2.1564
Gunawan, J., Choffnes, D., Hartzog, W., & Wilson, C. (2021). A comparative study of dark patterns across mobile and web modalities. Conference Companion Publication of the 2021 Computer Supported Cooperative Work and Social Computing. https://doi.org/10.1145/3479521
Gunawan, J., Santos, C., & Kamara, I. (2022). Redress for dark patterns privacy harms? A case study on consent interactions. Proceedings of the 2022 Symposium on Computer Science and Law, 181–194. https://doi.org/10.1145/3511265.3550448
Habib, H., Li, M., Young, E., & Cranor, L. (2022). “Okay, whatever”: An evaluation of cookie consent interfaces. CHI Conference on Human Factors in Computing Systems, 1–27. https://doi.org/10.1145/3491102.3501985
Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. https://doi.org/10.1177/1049732305276687
Innes, M., Davies, B., & McDermont, M. (2019). How co-production regulates. Social & Legal Studies, 28(3), 370–391. https://doi.org/10.1177/0964663918777803
International Consumer Protection and Enforcement Network. (2024). ICPEN Sweep finds majority of websites and mobile apps use dark patterns in the marketing of subscription services. ICPEN. https://www.icpen.org/news/1360
King, J. (2016, November 10). Expert report of Jennifer King, FTC v. Amazon.com, Inc., No. 2:14-cv-01038, 2015 WL 11252957 (W.D. Wash. Nov. 10, 2016).
King, J., & Stephan, A. (2021). Regulating privacy dark patterns in practice—Drawing inspiration from the California Privacy Rights Act. Georgetown Law Technology Review, 5.
Kowalczyk, M., Gunawan, J. T., Choffnes, D., Dubois, D. J., Hartzog, W., & Wilson, C. (2023). Understanding dark patterns in home IoT devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–27. https://doi.org/10.1145/3544548.3581432
Krauß, V., Saeghe, P., Boden, A., Khamis, M., McGill, M., Gugenheimer, J., & Nebeling, M. (2024). What makes XR dark? Examining emerging dark patterns in augmented and virtual reality through expert co-design. ACM Transactions on Computer-Human Interaction, 31(3), 1–39. https://doi.org/10.1145/3660340
Kubicek, K., Merane, J., Bouhoula, A., & Basin, D. (2024). Automating website registration for studying GDPR compliance. Proceedings of the ACM Web Conference 2024, 1295–1306. https://doi.org/10.1145/3589334.3645709
Kyi, L., Mhaidli, A., Santos, C. T., Roesner, F., & Biega, A. J. (2024). “It doesn’t tell me anything about how my data is used”: User perceptions of data collection purposes. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–12. https://doi.org/10.1145/3613904.3642260
Lacey, C., & Caudwell, C. (2019). Cuteness as a ‘dark pattern’ in home robots. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 374–381. https://doi.org/10.1109/HRI.2019.8673274
Large, D. (2024). The limits of co-production: Linking regulatory capacity to co-production of authoritative knowledge for environmental policy. Science and Public Policy, 51(5), 978–991. https://doi.org/10.1093/scipol/scae038
Lazar, J., Feng, J. H., & Hochheiser, H. (2017). Research methods in human computer interaction (2nd ed.). Morgan Kaufmann. https://www.sciencedirect.com/book/9780128053904/research-methods-in-human-computer-interaction
Luguri, J., & Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43–109. https://doi.org/10.1093/jla/laaa006
Machuletz, D., & Böhme, R. (2020). Multiple purposes, multiple problems: A user study of consent dialogs after GDPR. Proceedings on Privacy Enhancing Technologies. https://doi.org/10.2478/popets-2020-0037
MacKenzie, I. S. (2013). Human-computer interaction: An empirical research perspective (1st ed.). Morgan Kaufmann Publishers Inc.
Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–32. https://doi.org/10.1145/3359183
Mathur, A., Kshirsagar, M., & Mayer, J. (2021). What makes a dark pattern... Dark?: Design attributes, normative considerations, and measurement methods. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3411764.3445610
Matte, C., Bielova, N., & Santos, C. (2020). Do cookie banners respect my choice?: Measuring legal compliance of banners from IAB Europe’s Transparency and Consent Framework. 2020 IEEE Symposium on Security and Privacy (SP), 791–809. https://doi.org/10.1109/SP40000.2020.00076
Mayring, P. (2000). Qualitative content analysis. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 1(2). https://doi.org/10.17169/FQS-1.2.1089
McCabe, A., Parker, R., Osegowitsch, T., & Cox, S. (2023). Overcoming barriers to knowledge co-production in academic–practitioner research collaboration. European Management Journal, 41(2), 212–222. https://doi.org/10.1016/j.emj.2021.11.009
Mildner, T., & Savino, G.-L. (2021). Ethical user interfaces: Exploring the effects of dark patterns on Facebook. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 1–7. https://doi.org/10.1145/3411763.3451659
Mildner, T., Savino, G.-L., Doyle, P. R., Cowan, B. R., & Malaka, R. (2023). About engaging and governing strategies: A thematic analysis of dark patterns in social networking services. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–15. https://doi.org/10.1145/3544548.3580695
Mislove, A., Marcon, M., Gummadi, K. P., Druschel, P., & Bhattacharjee, B. (2007). Measurement and analysis of online social networks. Proceedings of the 7th ACM SIGCOMM Conference on Internet Measurement, 29–42. https://doi.org/10.1145/1298306.1298311
Monge Roffarello, A., Lukoff, K., & De Russis, L. (2023). Defining and identifying attention capture deceptive designs in digital interfaces. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–19. https://doi.org/10.1145/3544548.3580729
Niknejad, S., Mildner, T., Zargham, N., Putze, S., & Malaka, R. (2024). Level up or game over: Exploring how dark patterns shape mobile games. Proceedings of the International Conference on Mobile and Ubiquitous Multimedia, 148–156. https://doi.org/10.1145/3701571.3701604
Nouwens, M., Liccardi, I., Veale, M., Karger, D., & Kagal, L. (2020). Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–13. https://doi.org/10.1145/3313831.3376321
NOYB – European Center for Digital Rights. (2021). Noyb files 422 formal GDPR complaints on nerve-wrecking “cookie banners”. https://noyb.eu/en/noyb-files-422-formal-gdpr-complaints-nerve-wrecking-cookie-banners
Nunnally, B., & Farkas, D. (2016). UX research: Practical techniques for designing better products. O’Reilly Media, Inc. https://dl.acm.org/doi/book/10.5555/3126014
OECD. (2022). Dark commercial patterns (OECD Digital Economy Papers No. 336). https://www.oecd.org/en/publications/dark-commercial-patterns_44f5e846-en.html
Olson, J. S., & Kellogg, W. A. (Eds.). (2014). Ways of knowing in HCI. Springer New York. https://link.springer.com/10.1007/978-1-4939-0378-8
Owens, K., Gunawan, J., Choffnes, D., Emami-Naeini, P., Kohno, T., & Roesner, F. (2022). Exploring deceptive design patterns in voice interfaces. Proceedings of the 2022 European Symposium on Usable Security, 64–78. https://doi.org/10.1145/3549015.3554213
PS/00080/2023. (2023). Resolución del procedimiento sancionador [Resolution of the sanctioning procedure against chatwith.io]. Agencia Española de Protección de Datos. https://www.aepd.es/documento/ps-00080-2023.pdf
Regulation 2016/679. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [GDPR]. European Parliament and Council. https://eur-lex.europa.eu/eli/reg/2016/679/oj/eng
Regulation 2024/1689. (2024). Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [AI Act]. European Parliament and Council. https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng
Regulation (EU) 2022/1925. (2022). Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act) [DMA]. European Parliament and Council. http://data.europa.eu/eli/reg/2022/1925/oj
Regulation (EU) 2022/2065. (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act) [DSA]. European Parliament and Council. https://eur-lex.europa.eu/eli/reg/2022/2065/oj/eng
Regulation (EU) 2023/2854. (2023). Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules on fair access to and use of data and amending Regulation (EU) 2017/2394 and Directive (EU) 2020/1828 (Data Act). European Parliament and Council.
Restore Online Shoppers’ Confidence Act (ROSCA), 15 U.S.C. §§ 8401-8405 (2010).
Riigikohus [Supreme Court of Estonia]. (n.d.). Case law analysis. Riigikohus. https://www.riigikohus.ee/en/case-law-analysis
Rohrer, C. (2022). When to use which user-experience research methods. Nielsen Norman Group. https://www.nngroup.com/articles/which-ux-research-methods/
Fed. R. Evid. 702 (Testimony by Expert Witnesses).
Santos, C., Bielova, N., Ahuja, S., Utz, C., Gray, C., & Mertens, G. (2024). Which online platforms and dark patterns should be regulated under article 25 of the DSA? [Tech. rep.]. https://ssrn.com/abstract=4899559
Santos, C., Morozovaite, V., & De Conca, S. (2025). No harm no foul: How harms caused by dark patterns are conceptualised and tackled under EU data protection, consumer and competition laws. Information & Communications Technology Law, 34(3), 329–375. https://doi.org/10.1080/13600834.2025.2461958
Schaffner, B., Lingareddy, N. A., & Chetty, M. (2022). Understanding account deletion and relevant dark patterns on social media. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), 1–43. https://doi.org/10.1145/3555142
Schraffenberger, H., Gellert, R., Gray, C. M., Rossi, A., & Santos, C. (2024). Fair patterns for online interfaces. Lorentz Center Workshop. https://www.lorentzcenter.nl/fair-patterns-for-online-interfaces.html
Schreier, M. (2012). Qualitative content analysis in practice. SAGE Publications Ltd. https://methods.sagepub.com/book/qualitative-content-analysis-in-practice
Section 5(a) of the FTC Act, 15 U.S.C §45(a) (1914).
Soe, T. H., Nordberg, O. E., Guribye, F., & Slavkovik, M. (2020). Circumvention by design—Dark patterns in cookie consent for online news outlets. Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society, 1–12. https://doi.org/10.1145/3419249.3420132
Sveriges Konsumenter. (2024). Are you sure you want to leave us? Deceptive design patterns in the cancellation processes of 20 digital services in Sweden (Sveriges Konsumenter Reports). https://www.sverigeskonsumenter.se/media/mgkdpb3g/are-you-sure-you-want-to-leave-us.pdf
Testimony of Plaintiff’s expert witness, Jennifer King, FTC v. Commerce Planet, Inc., No. SACV 09-1324, 2012 WL 12533790 (C.D. Cal. July 17, 2012).
Toth, M., Bielova, N., & Roca, V. (2022). On dark patterns and manipulation of website publishers by CMPs. Proceedings on Privacy Enhancing Technologies, 2022(3), 478–497. https://doi.org/10.56553/popets-2022-0082
Tran, V. H., Mehrotra, A., Chetty, M., Feamster, N., Frankenreiter, J., & Strahilevitz, L. (2024). Measuring compliance with the California Consumer Privacy Act over space and time. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1–19. https://doi.org/10.1145/3613904.3642597
United Nations. (1945). Statute of the International Court of Justice. https://www.icj-cij.org/statute
Utz, C., Degeling, M., Fahl, S., Schaub, F., & Holz, T. (2019). (Un)informed consent: Studying GDPR consent notices in the field. Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, 973–990. https://doi.org/10.1145/3319535.3354212
Von der Leyen, U. (2024). Mission letter to Michael McGrath, Commissioner-designate for Democracy, Justice, and the Rule of Law. https://commission.europa.eu/document/download/907fd6b6-0474-47d7-99da-47007ca30d02_en?filename=Mission%5C%20letter%5C%20-%5C%20McGRATH.pdf
Yu, H. H. (2020). Effective academic–practitioner collaboration on gender research in federal law enforcement: The value of a coproduction process. International Review of Administrative Sciences, 86(3), 567–581. https://doi.org/10.1177/0020852318801499