Title: “Did you know this camera tracks your mood?”: Understanding Privacy Expectations and Preferences in the Age of Video Analytics
Abstract: Cameras are everywhere, and are increasingly coupled with video analytics software that can identify our face, track our mood, recognize what we are doing, and more. We present the results of a 10-day in-situ study designed to understand how people feel about these capabilities, looking both at the extent to which they expect to encounter them as part of their everyday activities and at how comfortable they are with the presence of such technologies across a range of realistic scenarios. Results indicate that while some widespread deployments are expected by many (e.g., surveillance in public spaces), others are not, with some making people feel particularly uncomfortable. Our results further show that individuals’ privacy preferences and expectations are complicated and vary with a number of factors, such as the purpose for which footage is captured and analyzed, the particular venue where it is captured, and whom it is shared with. Finally, we discuss the implications of people’s rich and diverse preferences for opt-in or opt-out rights for the collection and use (including sharing) of data associated with these video analytics scenarios, as mandated by regulations. Because of the user burden associated with the large number of privacy decisions people could be faced with, we discuss how new types of privacy assistants could be configured to help people manage these decisions.
Award ID(s): 1914486
NSF-PAR ID: 10257023
Journal Name: Proceedings on Privacy Enhancing Technologies
Volume: 2021
Issue: 2
ISSN: 2299-0984
Page Range / eLocation ID: 282 to 304
Sponsoring Org: National Science Foundation
More Like this
  1. The rapid growth of facial recognition technology across ever more diverse contexts calls for a better understanding of how people feel about these deployments: whether they see value in them or are concerned about their privacy, and to what extent they have generally grown accustomed to them. We present a qualitative analysis of data gathered as part of a 10-day experience sampling study with 123 participants who were presented with realistic deployment scenarios of facial recognition as they went about their daily lives. Responses capturing their attitudes towards these deployments were collected both in situ and through daily evening surveys, in which participants were asked to reflect on their experiences and reactions. Ten follow-up interviews were conducted to further triangulate the data from the study. Our results highlight both the perceived benefits and the concerns people express when faced with different facial recognition deployment scenarios. Participants reported concerns about the accuracy of the technology, including possible bias in its analysis; privacy concerns about the type of information being collected or inferred; and, more generally, the dragnet effect resulting from widespread deployment. Based on our findings, we discuss strategies and guidelines for informing the deployment of facial recognition, particularly focusing on ensuring that people are given adequate levels of transparency and control.
  2. People often rely on their friends, family, and other loved ones to help them make decisions about digital privacy and security. However, these social processes are rarely supported by technology. To address this gap, we developed an Android-based mobile application ("app") prototype that helps individuals collaborate with people they know to make informed decisions about their app privacy permissions. To evaluate our design, we conducted an interview study with 10 college students while they interacted with our prototype. Overall, participants responded positively to the novel idea of using social collaboration as a means for making better privacy decisions. Yet we also found that users are less inclined to help others and may only be willing to partake in conversations that directly affect themselves. We discuss the potential for embedding social processes in the design of systems that support privacy decision-making, as well as some of the challenges of this approach.
  3. To account for privacy perceptions and preferences in user models and develop personalized privacy systems, we need to understand how users make privacy decisions in various contexts. Existing studies of privacy perceptions and behavior focus on overall tendencies toward privacy, but few have examined the context-specific factors in privacy decision making. We conducted a survey on Mechanical Turk (N=401) based on the theory of planned behavior (TPB) to measure how users’ perceptions of privacy factors and intent to disclose information are affected by three situational factors embodied in hypothetical scenarios: information type, recipients’ role, and trust source. Results showed a positive relationship between subjective norms and perceived behavioral control, and between each of these and situational privacy attitude; all three constructs are significantly positively associated with intent to disclose. These findings also suggest that situational factors predict participants’ privacy decisions through their influence on the TPB constructs.
  4. Shafiq, Zubair; Sherr, Micah (Eds.)
    The California Consumer Privacy Act and other privacy laws give people a right to opt out of the sale and sharing of personal information. In combination with privacy preference signals, especially Global Privacy Control (GPC), such rights have the potential to empower people to assert control over their data. However, many laws prohibit opt-out settings from being turned on by default. The resulting usability challenges for people exercising their rights motivate generalizable active privacy choice, an interface design principle for making opt-out settings usable without defaults. It is based on the idea of generalizing one individual opt-out choice towards a larger set of choices. For example, people may apply an opt-out choice on one site towards a larger set of sites. We explore generalizable active privacy choice in the context of GPC. We design and implement nine privacy choice schemes in a browser extension and explore them in a usability study with 410 participants. We find that generalizability features tend to decrease opt-out utility slightly. However, at the same time, they increase opt-out efficiency and make opting out less disruptive, which was more important to most participants. For the least disruptive scheme, selecting website categories to opt out from, 98% of participants reported not feeling disrupted, a 40 percentage point increase over the baseline schemes. 83% of participants understood the meaning of GPC, and they made their opt-out choices with intent and, thus, in a legally relevant manner. To help people exercise their opt-out rights via GPC, our results support the adoption of a generalizable active privacy choice interface in web browsers.
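The Global Privacy Control signal discussed in the last abstract is conveyed by browsers to websites as an HTTP request header, `Sec-GPC: 1`, per the GPC proposal. As a minimal sketch of how a site's server-side code might detect the signal (the function name, the headers dictionary, and the example values here are illustrative assumptions, not taken from any of the papers above):

```python
def honors_gpc(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control
    opt-out signal, i.e. the Sec-GPC header set to "1" as defined
    in the GPC proposal."""
    return headers.get("Sec-GPC", "").strip() == "1"


# Illustrative request headers from a browser with GPC enabled
request_headers = {"Host": "example.com", "Sec-GPC": "1"}

if honors_gpc(request_headers):
    # Treat this visitor as having opted out of the sale/sharing
    # of their personal information (e.g., as the CCPA requires
    # covered businesses to do for valid opt-out signals).
    do_not_sell = True
```

A real deployment would read the header from the web framework's request object rather than a plain dictionary, but the check itself is this simple, which is part of why the paper treats GPC as a practical vehicle for generalizable opt-out choices.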