This content will become publicly available on October 1, 2026

Title: Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy
Differential privacy is a popular privacy-enhancing technology that has been deployed both by industry and government agencies. Unfortunately, existing explanations of differential privacy fail to set accurate privacy expectations for data subjects, which depend on the choice of deployment model. We design and evaluate new explanations of differential privacy for the local and central models, drawing inspiration from prior work explaining other privacy-enhancing technologies such as encryption. We reflect on the challenges in evaluating explanations and on the tradeoffs between qualitative and quantitative evaluation strategies. These reflections offer guidance for other researchers seeking to design and evaluate explanations of privacy-enhancing technologies.
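As a rough illustration of the two deployment models (this sketch is mine, not the paper's; the function names are hypothetical, and the mechanisms, randomized response for the local model and the Laplace mechanism for the central model, are standard textbook constructions chosen for illustration):

```python
import math
import random

def sample_laplace(scale: float) -> float:
    """Laplace(0, scale) sampled as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def local_randomized_response(bit: int, epsilon: float) -> int:
    """Local model: each data subject perturbs their own bit before it
    leaves their device, so no one ever holds the raw value."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

def central_noisy_count(bits: list[int], epsilon: float) -> float:
    """Central model: a trusted curator sees the raw data and adds
    Laplace(1/epsilon) noise to the aggregate (sensitivity-1) count."""
    return sum(bits) + sample_laplace(1.0 / epsilon)

data = [random.randint(0, 1) for _ in range(10_000)]
eps = 1.0

# Local model: aggregate the randomized reports, then debias.
reports = [local_randomized_response(b, eps) for b in data]
p = math.exp(eps) / (math.exp(eps) + 1)
local_estimate = (sum(reports) - len(data) * (1 - p)) / (2 * p - 1)

print(f"true count:    {sum(data)}")
print(f"local model:   {local_estimate:.0f}")   # far noisier
print(f"central model: {central_noisy_count(data, eps):.0f}")
```

At the same epsilon, the local estimate is far noisier than the central one, and only the central model requires trusting a curator with raw data; this difference is one reason the two models call for different privacy explanations.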
Award ID(s): 2429838, 2429841, 2429839
PAR ID: 10635037
Author(s) / Creator(s): ; ; ; ;
Publisher / Repository: Proc. Priv. Enhancing Technol.
Date Published:
Journal Name: Proceedings on Privacy Enhancing Technologies
Volume: 2025
Issue: 4
ISSN: 2299-0984
Page Range / eLocation ID: 653 to 678
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Describing Privacy Enhancing Technologies (PETs) to the general public is challenging but essential to convey the privacy protections they provide. Existing research has explored the explanation of differential privacy in health contexts. Our study adapts well-performing textual descriptions of local differential privacy from prior work to a new context and broadens the investigation to descriptions of additional PETs. Specifically, we develop user-centric textual descriptions for popular PETs in ad tracking and analytics, including local differential privacy, federated learning with and without local differential privacy, and Google's Topics. We examine the applicability of previous findings to these expanded contexts and evaluate the PET descriptions with quantitative and qualitative survey data (n=306). We find that adapting a process- and implications-focused approach to the ad tracking and analytics context facilitated user understanding about as well as it did in health contexts, and that our descriptions developed with this process+implications approach for the additional, understudied PETs help users understand those PETs' processes. We also find that incorporating an implications statement into PET descriptions did not hurt user comprehension but also did not achieve a significant positive effect, which contrasts with prior findings in health contexts. We note that the use of technical terms, as well as the machine learning aspect of PETs, led to confusion for some respondents even when the descriptions did not delve into specifics. Based on our findings, we offer recommendations and insights for crafting effective user-centric descriptions of privacy-enhancing technologies.
  2. Voluntary sharing of personal information is at the heart of user engagement on social media and central to platforms' business models. From the users' perspective, so-called self-disclosure is closely connected with both privacy risks and social rewards. Prior work has studied contextual influences on self-disclosure, from platform affordances and interface design to user demographics and perceived social capital. Our work takes a mixed-methods approach to understand the contextual information that might be integrated into the development of privacy-enhancing technologies. Through an observational study of several Reddit communities, we explore the ways in which topic of discussion, group norms, peer effects, and audience size are correlated with personal information sharing. We then build and test a prototype privacy-enhancing tool that exposes these contextual factors. Our work culminates in a browser extension that automatically detects instances of self-disclosure in Reddit posts at the time of posting and provides additional context to users before they post to support enhanced privacy decision-making. We share this prototype with social media users, solicit their feedback, and outline a path forward for privacy-enhancing technologies in this space.
  3. Differential privacy is a widely accepted formal privacy definition that allows aggregate information about a dataset to be released while controlling privacy leakage for individuals whose records appear in the data. Due to the unavoidable tension between privacy and utility, many works have tried to relax the requirements of differential privacy to achieve greater utility. One class of relaxation, which is gaining support outside the privacy community, is embodied by the definitions of individual differential privacy (IDP) and bootstrap differential privacy (BDP). Classical differential privacy defines a set of neighboring database pairs and achieves its privacy guarantees by requiring that each pair of neighbors be nearly indistinguishable to an attacker. The privacy definitions we study, however, aggressively reduce the set of neighboring pairs that are protected. To a non-expert, IDP and BDP can seem very appealing, as they echo the same types of privacy explanations associated with differential privacy while experimentally achieving dramatically better utility. However, we show that they allow a significant portion of the dataset to be reconstructed using algorithms that have arbitrarily low privacy loss under their privacy accounting rules. With the non-expert in mind, we demonstrate these attacks using the preferred mechanisms of these privacy definitions. In particular, we design a set of queries that, when protected by these mechanisms with high noise settings (i.e., with claims of very low privacy loss), yield more precise information about the dataset than if they were not protected at all. The specific attacks here can be defeated, and we give examples of countermeasures. However, the defenses are either equivalent to using differential privacy or ad hoc methods tailored specifically to the attack (with no guarantee that they protect against other attacks). Thus, the defenses underscore the deficiencies of these privacy definitions.
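The paper's specific queries and mechanisms are not reproduced here, but a textbook sketch (mine, with hypothetical names) illustrates the general failure mode the abstract warns about: if an accounting rule under-charges for released answers, the claimed privacy loss can stay arbitrarily low while averaging the answers recovers the data almost exactly.

```python
import random

def sample_laplace(scale: float) -> float:
    """Laplace(0, scale) sampled as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(true_count: int, epsilon: float) -> float:
    """A counting query (sensitivity 1) under the Laplace mechanism."""
    return true_count + sample_laplace(1.0 / epsilon)

true_count = sum(random.randint(0, 1) for _ in range(1_000))
eps = 0.01  # per-answer noise std ~141, so a single answer reveals little
print(f"true = {true_count}, one noisy answer = {noisy_count(true_count, eps):.0f}")

# An accounting rule that does not charge for repetition would report the
# privacy loss as eps = 0.01, yet the average of many independently noised
# answers concentrates tightly around the true count.
k = 100_000
avg = sum(noisy_count(true_count, eps) for _ in range(k)) / k
print(f"average of {k} answers = {avg:.1f}")
```

Under standard differential-privacy composition, the same sequence of queries would be charged roughly k times the per-answer epsilon; it is this kind of cumulative accounting that the relaxed definitions give up.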
  4.
    Differential privacy is at a turning point. Implementations have been successfully leveraged in private industry, the public sector, and academia in a wide variety of applications, allowing scientists, engineers, and researchers to learn about populations of interest without learning about specific individuals. Because differential privacy allows us to quantify cumulative privacy loss, these differentially private systems will, for the first time, allow us to measure and compare the total privacy loss due to these personal-data-intensive activities. Appropriately leveraged, this could be a watershed moment for privacy. Like other technologies and techniques that allow for a range of instantiations, implementation details matter. When meaningfully implemented, differential privacy supports deep data-driven insights with minimal worst-case privacy loss. When not meaningfully implemented, it delivers privacy mostly in name. Using differential privacy to maximize learning while providing a meaningful degree of privacy requires judicious choices with respect to the privacy parameter epsilon, among other factors. However, there is little understanding of the optimal value of epsilon for a given system, or for classes of systems, purposes, or data, or of how to go about determining it. To understand current differential privacy implementations and how organizations make these key choices in practice, we conducted interviews with practitioners to learn from their experiences of implementing differential privacy. We found no clear consensus on how to choose epsilon, nor is there agreement on how to approach this and other key implementation decisions. Given the importance of these implementation details, there is a need for shared learning among the differential privacy community. To serve these purposes, we propose the creation of the Epsilon Registry: a publicly available communal body of knowledge about differential privacy implementations that can be used by various stakeholders to drive the identification and adoption of judicious differentially private implementations.
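The abstract's observation that differential privacy makes cumulative privacy loss measurable rests on composition. As a minimal sketch in the spirit of the proposed Epsilon Registry (the EpsilonLedger class and its names are my own hypothetical illustration, not an artifact from the paper), basic sequential composition simply sums per-release epsilons:

```python
from dataclasses import dataclass, field

@dataclass
class EpsilonLedger:
    """Toy privacy-loss ledger: under basic sequential composition, the
    total epsilon of a pipeline is the sum of its per-release epsilons.
    (Advanced composition and modern accountants give tighter bounds.)"""
    budget: float
    releases: list[tuple[str, float]] = field(default_factory=list)

    def total(self) -> float:
        return sum(eps for _, eps in self.releases)

    def charge(self, purpose: str, epsilon: float) -> None:
        """Record a release, refusing it if the budget would be exceeded."""
        if self.total() + epsilon > self.budget:
            raise RuntimeError(f"release {purpose!r} would exceed the budget")
        self.releases.append((purpose, epsilon))

ledger = EpsilonLedger(budget=1.0)
ledger.charge("weekly usage histogram", 0.25)
ledger.charge("mean session length", 0.50)
print(f"cumulative privacy loss so far: epsilon = {ledger.total():.2f}")
```

A public registry of such ledgers, with the chosen epsilons and their justifications, is essentially what the abstract proposes at community scale.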
  5. This qualitative study examines the privacy challenges perceived by librarians who afford access to physical and electronic spaces and are in a unique position of safeguarding the privacy of their patrons. As internet “service providers,” librarians represent a bridge between the physical and internet world, and thus offer a unique sight line to the convergence of privacy, identity, and social disadvantage. Drawing on interviews with 16 librarians, we describe how they often interpret or define their own rules when it comes to privacy to protect patrons who face challenges that stem from structures of inequality outside their walls. We adopt the term “intersectional thinking” to describe how librarians reported thinking about privacy solutions, which is focused on identity and threats of structural discrimination (the rules, norms, and other determinants of discrimination embedded in institutions and other societal structures that present barriers to certain groups or individuals), and we examine the role that low/no-tech strategies play in ameliorating these threats. We then discuss how librarians act as privacy intermediaries for patrons, the potential analogue for this role for developers of systems, the power of low/no-tech strategies, and implications for design and research of privacy-enhancing technologies (PETs). 