-
The EU ePrivacy Directive requires consent before using cookies or other tracking technologies, while the EU General Data Protection Regulation (“GDPR”) sets high-level and principle-based requirements for such consent to be valid. However, the translation of such requirements into concrete design interfaces for consent banners is far from straightforward. This situation has given rise to the use of manipulative tactics in user experience (“UX”), commonly known as dark patterns, which influence users’ decision-making and may violate the GDPR requirements for valid consent. To address this problem, EU regulators aim to interpret GDPR requirements and to limit the design space of consent banners within their guidelines. Academic researchers from various disciplines address the same problem by performing user studies to evaluate the impact of design and dark patterns on users’ decision-making. Regrettably, the guidelines and user studies rarely impact each other. In this Essay, we collected and analyzed seventeen official guidelines issued by EU regulators and the EU Data Protection Board (“EDPB”), as well as eleven consent-focused empirical user studies, which we thoroughly examined from a User Interface (“UI”) design perspective. We identified numerous gaps between consent banner designs recommended by regulators and those evaluated in user studies. By doing so, we contribute to both the regulatory discourse and future user studies. We pinpoint EU regulatory inconsistencies and provide actionable recommendations for regulators. For academic scholars, we synthesize insights on design elements discussed by regulators that require further user study evaluations. Finally, we recommend that the EDPB and EU regulators, alongside usability, Human-Computer Interaction (“HCI”), and design researchers, engage in transdisciplinary dialogue in order to close the gap between EU guidelines and user studies.
Free, publicly-accessible full text available May 17, 2025
-
Deceptive and coercive design practices are increasingly used by companies to extract profit, harvest data, and limit consumer choice. Dark patterns represent the most common contemporary amalgamation of these problematic practices, connecting designers, technologists, scholars, regulators, and legal professionals in transdisciplinary dialogue. However, a lack of universally accepted definitions across the academic, legislative, practitioner, and regulatory space has likely limited the impact that scholarship on dark patterns might have in supporting sanctions and evolved design practices. In this paper, we seek to support the development of a shared language of dark patterns, harmonizing ten existing regulatory and academic taxonomies of dark patterns and proposing a three-level ontology with standardized definitions for 64 synthesized dark pattern types across low-, meso-, and high-level patterns. We illustrate how this ontology can support translational research and regulatory action, including transdisciplinary pathways to extend our initial types through new empirical work across application and technology domains.
Free, publicly-accessible full text available May 11, 2025
-
Ethics as embodied by technology practitioners resists simple definition—particularly as it relates to the interplay of identity, organizational, and professional complexity. In this paper we use the linguistic notion of languaging as an analytic lens to describe how technology and design practitioners negotiate their conception of ethics as they reflect upon their everyday work. We engaged twelve practitioners in individual co-creation workshops, encouraging them to reflect on their ethical role in their everyday work through a series of generative and evaluative activities. We analyzed these data to identify how each practitioner reasoned about ethics through language and artifacts, finding that practitioners used a range of rhetorical tropes to describe their ethical commitments and beliefs in ways that were complex and sometimes contradictory. Across three cases, we describe how ethics was negotiated through language across three key zones of ecological emergence: the practitioner’s “core” beliefs about ethics, internal and external ecological elements that shaped or mediated these core beliefs, and the ultimate boundaries they reported refusing to cross. Building on these findings, we describe how the languaging of ethics reveals opportunities to definitionally and practically engage with ethics in technology ethics research, practice, and education.
-
Deceptive, manipulative, and coercive practices are deeply embedded in our digital experiences, impacting our ability to make informed choices and undermining our agency and autonomy. These design practices—collectively known as “dark patterns” or “deceptive patterns”—are increasingly under legal scrutiny and sanctions, largely due to the efforts of human-computer interaction scholars who have conducted pioneering research relating to dark patterns types, definitions, and harms. In this workshop, we continue building this scholarly community with a focus on organizing for action. Our aims include: (i) building capacity around specific research questions relating to methodologies for detection; (ii) characterizing harms; and (iii) creating effective countermeasures. Through the outcomes of the workshop, we will connect our scholarship to the legal, design, and regulatory communities to inform further legislative and legal action.
-
Design and technology practitioners are increasingly aware of the ethical impact of their work practices, desiring tools to support their ethical awareness across a range of contexts. In this paper, we report on findings from a series of six co-creation workshops with 26 technology and design practitioners that supported their creation of a bespoke ethics-focused action plan. Using a qualitative content analysis and thematic analysis approach, we identified a range of roles and process moves that practitioners and design students with professional experience employed and illustrate the interplay of these elements that impacted the creation of their action plan and revealed aspects of their ethical design complexity. We conclude with implications for supporting ethics in socio-technical practice and opportunities for the further development of methods that support ethical engagement and are resonant with the realities of practice.
Free, publicly-accessible full text available May 11, 2025
-
The development of Artificial Intelligence (AI) systems involves a significant level of judgment and decision making on the part of engineers and designers to ensure the safety, robustness, and ethical design of such systems. However, the kinds of judgments that practitioners employ while developing AI platforms are rarely foregrounded or examined to explore areas where practitioners might need ethical support. In this short paper, we employ the concept of design judgment to foreground and examine the kinds of sensemaking software engineers use to inform their decision-making while developing AI systems. Relying on data generated from two exploratory observation studies of student software engineers, we connect the concept of fairness to the foregrounded judgments to implicate their potential algorithmic fairness impacts. Our findings surface some ways in which the design judgment of software engineers could adversely impact the downstream goal of ensuring fairness in AI systems. We discuss the implications of these findings in fostering positive innovation and enhancing fairness in AI systems, drawing attention to the need to provide ethical guidance, support, or intervention to practitioners as they engage in situated and contextual judgments while developing AI systems.
-
Dark patterns are increasingly ubiquitous in digital services and regulation, describing instances where designers use deceptive, manipulative, or coercive tactics to encourage end users to make decisions that are not in their best interest. Research regarding dark patterns has also increased significantly over the past several years. In this systematic review, we evaluate literature (n=79) from 2014 to 2022 that has empirically described dark patterns in order to identify the presence, impact, or user experience of these patterns as they appear in digital systems. Based on our analysis, we identify key areas of current interest in evaluating dark patterns’ context, presence, and impact; describe common disciplinary perspectives and framing concepts; characterize dominant methodologies; and outline opportunities for further methodological support and scholarship to empower scholars, designers, and regulators.