Title: Accounting for Privacy Pluralism: Lessons and Strategies from Community-Based Privacy Groups
The emergent, dynamic nature of privacy concerns in a shifting sociotechnical landscape creates a constant need for privacy-related resources and education. One response to this need is community-based privacy groups. We studied privacy groups that host meetings in diverse urban communities and interviewed the meeting organizers to see how they grapple with potentially varied and changeable privacy concerns. Our analysis identified three features of how privacy groups are organized to serve diverse constituencies: situating (finding the right venue for meetings), structuring (finding the right format/content for the meeting), and providing support (offering varied dimensions of assistance). We use these findings to inform a discussion of "privacy pluralism" as a perennial challenge for the HCI privacy research community, and we use the practices of privacy groups as an anchor for reflection on research practices.
Award ID(s): 1814909
NSF-PAR ID: 10485021
Author(s) / Creator(s): ; ; ;
Publisher / Repository: ACM
Date Published:
ISSN: 1062-9432
ISBN: 9781450394215
Page Range / eLocation ID: 1 to 12
Format(s): Medium: X
Location: Hamburg, Germany
Sponsoring Org: National Science Foundation
More Like this
  1. Inspired by earlier academic research, iOS app privacy labels and the recent Google Play data safety labels have been introduced as a way to systematically present users with concise summaries of an app's data practices. Yet, little research has been conducted to determine how well today's mobile app privacy labels address people's actual privacy concerns or questions. We analyze a crowd-sourced corpus of privacy questions collected from mobile app users to determine to what extent these labels actually address users' privacy concerns and questions. While there are differences between iOS labels and Google Play labels, our results indicate that a substantial percentage of people's privacy questions are not answered or only partially addressed in today's labels. Findings from this work not only shed light on the additional fields that would need to be included in mobile app privacy labels but can also help inform refinements to existing labels to better address users' typical privacy questions.
  2. People who are marginalized experience disproportionate harms when their privacy is violated. Meeting their needs is vital for developing equitable and privacy-protective technologies. In response, research at the intersection of privacy and marginalization has acquired newfound urgency in the HCI and social computing community. In this literature review, we set out to understand how researchers have investigated this area of study. What topics have been examined, and how? What are the key findings and recommendations? And, crucially, where do we go from here? Based on a review of papers on privacy and marginalization published between 2010 and 2020 across HCI, Communication, and Privacy-focused venues, we make three main contributions: (1) we identify key themes in existing work and introduce the Privacy Responses and Costs framework to describe the tensions around protecting privacy in marginalized contexts, (2) we identify understudied research topics (e.g., race) and other avenues for future work, and (3) we characterize trends in research practices, including the under-reporting of important methodological choices, and provide suggestions to establish shared best practices for this growing research area.
  3. Through a series of ACM SIGCHI workshops, we have built a research community of individuals dedicated to networked privacy--from identifying the key challenges to designing privacy solutions and setting a privacy-focused agenda for the future. In this workshop, we take an intentional pause to unpack the potential ethical questions and concerns this agenda might raise. Rather than strictly focusing on privacy as a state that is always desired--where more privacy is viewed unequivocally as "better"--we consider situations where privacy may not be optimal for researchers, end users, or society. We discuss the current research landscape, including the recent updates to the ACM's Code of Ethics, and how researchers and designers can make more informed decisions regarding ethics, privacy, and other competing values in privacy-related research and designs. Our workshop includes group discussions, breakout activities, and a panel of experts with diverse insights discussing topics related to privacy and ethics. 
  4. The General Data Protection Regulation (GDPR) in the European Union contains directions on how user data may be collected and stored, and when it must be deleted. As similar legislation is developed around the globe, there is the potential for repercussions across multiple fields of research, including educational data mining (EDM). Over the past two decades, the EDM community has taken consistent steps to protect learner privacy within our research, whilst pursuing goals that will benefit their learning. However, recent privacy legislation may require our practices to change. The right to be forgotten states that users have the right to request that all their data (including deidentified data generated by them) be removed. In this paper, we discuss the potential challenges of this legislation for EDM research, including impacts on Open Science practices, data modeling, and data sharing. We also consider changes to EDM best practices that may aid compliance with this new legislation.
  5. Workplaces are increasingly adopting emotion AI, promising benefits to organizations. However, little is known about the perceptions and experiences of workers subject to emotion AI in the workplace. Our interview study with 15 US adult workers addresses this gap, finding that (1) participants viewed emotion AI as a deep violation of the privacy of workers' sensitive emotional information; (2) emotion AI may function to enforce workers' compliance with emotional labor expectations, and workers may engage in emotional labor as a mechanism to preserve privacy over their emotions; and (3) workers may be exposed to a wide range of harms as a consequence of emotion AI in the workplace. Findings reveal the need to recognize and define an individual right to what we introduce as emotional privacy, as well as raise important research and policy questions on how to protect and preserve emotional privacy within and beyond the workplace.