Title: Do Different Groups Have Comparable Privacy Tradeoffs?
Personalized systems increasingly employ Privacy Enhancing Technologies (PETs) to protect the identity of their users. In this paper, we ask whether the cost-benefit tradeoff, the underlying economics of the privacy calculus, is fairly distributed, or whether some groups of people experience a lower return on investment for their privacy decisions.
Award ID(s):
1657774
PAR ID:
10222636
Author(s) / Creator(s):
Date Published:
Journal Name:
CHI 2018 Workshop on Moving Beyond a ‘One-Size Fits All’
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Personally Identifiable Information (PII) leakage can lead to identity theft, financial loss, reputation damage, and anxiety. However, individuals remain largely unaware of their PII exposure on the Internet, and whether informing them about the extent of that exposure triggers privacy protection actions requires further investigation. In this pilot study, grounded in Protection Motivation Theory (PMT), we examine whether receiving privacy alerts in the form of threat and countermeasure information prompts senior citizens to engage in protective behaviors. We also examine whether providing personalized information moderates the relationship between information and individuals' perceptions. We contribute to the literature by shedding light on the determinants of, and barriers to, adopting privacy protection behaviors.
  2. Despite recent widespread deployment of differential privacy, relatively little is known about what users think of differential privacy. In this work, we seek to explore users' privacy expectations related to differential privacy. Specifically, we investigate (1) whether users care about the protections afforded by differential privacy, and (2) whether they are therefore more willing to share their data with differentially private systems. Further, we attempt to understand (3) users' privacy expectations of the differentially private systems they may encounter in practice and (4) their willingness to share data in such systems. To answer these questions, we use a series of rigorously conducted surveys (n=2424). We find that users care about the kinds of information leaks against which differential privacy protects and are more willing to share their private information when the risks of these leaks are less likely to happen. Additionally, we find that the ways in which differential privacy is described in the wild haphazardly set users' privacy expectations, which can be misleading depending on the deployment. We synthesize our results into a framework for understanding a user's willingness to share information with differentially private systems, which takes into account the interaction between the user's prior privacy concerns and how differential privacy is described.
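For context on the guarantee these surveys probe: one standard way epsilon-differential privacy is achieved is the Laplace mechanism, sketched below for a counting query. This is a minimal illustration of the general technique, not code from any system the paper studies; the function name, the epsilon value, and the count are ours.

import numpy as np

def dp_count(true_count, epsilon):
    # Laplace mechanism: a counting query changes by at most 1 when one
    # person's record is added or removed (L1 sensitivity 1), so adding
    # Laplace noise with scale 1/epsilon gives epsilon-differential privacy.
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Illustrative release: "how many respondents reported X", with epsilon = 0.5.
noisy_count = dp_count(412, epsilon=0.5)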
  3. As concern over data privacy and existing privacy regulations grows, legal scholars have proposed alternative models for data privacy. This work explores the impact of one such model (the data fiduciary model, which would stipulate that data processors must use personal information only in ways that reflect the best interest of the data subject) through a pair of user studies. We first conduct an interview study with nine mobile app developers in which we explore whether, how, and why these developers believe their current data practices are consistent with the best interest of their users. We then conduct an online study with 390 users in which we survey participants about whether they consider the same data practices to be in their own best interests. We also ask both developers and users about their attitudes towards and their predictions about the impact of a data fiduciary law, and we conclude with recommendations about such an approach to future privacy regulations.
  4. Over the past few years, the two dominant app platforms made major improvements to their policies surrounding child-directed apps. While prior work repeatedly demonstrated that privacy issues were prevalent in child-directed apps, it is unclear whether platform policies can lead child-directed apps to comply with privacy requirements, when laws alone have not. To understand the effect of recent changes in platform policies (e.g., whether they result in greater levels of compliance with applicable privacy laws), we conducted a large-scale measurement study of the privacy behaviors of 7,377 child-directed Android apps, as well as a follow-up survey with some of their developers. We observed a drastic decrease in the number of apps that transmitted personal data without verifiable parental consent and an increase in the number of apps that encrypted their transmissions using TLS. However, improper use of third-party SDKs still led to privacy issues (e.g., inaccurate disclosures in apps’ privacy labels). Our analysis of apps’ privacy practices over a period of a few months in 2023 and a comparison of our results with those observed a few years ago demonstrate gradual improvements in apps’ privacy practices over time. We discuss how app platforms can further improve their policies and emphasize the role of enforcement in making such policies effective. 
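The TLS measurement described above reduces to a per-app check: what fraction of the app's observed network requests use encrypted transport. A toy sketch, assuming each app's captured traffic is available as a list of request URLs (the URLs and the tls_share helper are hypothetical, not the paper's pipeline):

from urllib.parse import urlparse

def tls_share(request_urls):
    # Fraction of captured requests sent over HTTPS rather than cleartext HTTP.
    if not request_urls:
        return 0.0
    https = sum(1 for url in request_urls if urlparse(url).scheme == "https")
    return https / len(request_urls)

# Hypothetical traffic captured from one child-directed app:
print(tls_share([
    "https://api.example.com/v1/events",
    "http://tracker.example.net/collect",
]))  # 0.5: half of this app's requests are unencrypted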
  5. How does protecting consumers' privacy affect the value of their personal data? We model an intermediary that uses consumers' data to influence prices set by a seller. When privacy is protected, consumers choose whether to disclose their data to the intermediary. When privacy is not protected, the intermediary can access consumers' data without their consent. We illustrate that protecting privacy has complex effects. It can increase the value of some consumers' data while decreasing that of others. It can have redistributive effects, by benefiting some consumers at the expense of others. Finally, it can increase average prices and reduce trade. 
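A stylized numeric example (ours, not the authors' formal model) illustrates why data access changes prices and why protection can cut both ways. Suppose two consumers value the good at $v_H = 10$ and $v_L = 4$, and the seller's cost is zero. Without consumer data, the seller posts a single price and earns $\max(10 \cdot 1,\ 4 \cdot 2) = 10$, so it charges 10 and the low-value consumer is priced out. With full access to the data, it charges each consumer her value and earns $10 + 4 = 14$: both consumers now trade, but all surplus shifts to the seller. Under privacy protection, the low-value consumer is served only if she discloses, while the high-value consumer gains by withholding, so protection raises the usefulness of one consumer's data while lowering that of the other.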