Smart voice assistants such as Amazon Alexa and Google Home are becoming increasingly pervasive in our everyday environments. Despite their benefits, their miniaturized and embedded cameras and microphones raise important privacy concerns related to surveillance and eavesdropping. Recent work on the privacy concerns of people in the vicinity of these devices has highlighted the need for 'tangible privacy', where control and feedback mechanisms can provide a more assured sense of whether the camera or microphone is 'on' or 'off'. However, current designs of these devices lack adequate mechanisms to provide such assurances. To address this gap in the design of smart voice assistants, especially for disabling microphones, we evaluate several designs that do or do not incorporate tangible control and feedback mechanisms. By comparing people's perceptions of risk, trust, reliability, usability, and control for these designs in a between-subjects online experiment (N=261), we find that devices with tangible, built-in physical controls are perceived as more trustworthy and usable than those with non-tangible mechanisms. Our findings present an approach for tangible, assured privacy, especially in the context of embedded microphones.
- Award ID(s): 1814866
- PAR ID: 10498061
- Publisher / Repository: 25th ACM Conference On Computer-Supported Cooperative Work And Social Computing
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 6
- Issue: CSCW2
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 31
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Voice-activated commands have become a key feature of popular devices such as smartphones, home assistants, and wearables. For convenience, many people configure their devices to be ‘always on’ and listening for voice commands using a trigger phrase such as “Hey Siri,” “Okay Google,” or “Alexa.” However, false positives for these triggers often result in privacy violations, with conversations being inadvertently uploaded to the cloud. In addition, malware that can record one’s conversations remains a significant threat to privacy. Unlike with cameras, which people can physically obscure and be assured of their privacy, people do not have a way of knowing whether their microphone is indeed off and are left with no tangible defenses against voice-based attacks. We envision a general-purpose physical defense that uses a speaker to inject specialized obfuscating ‘babble noise’ into the microphones of devices to protect against automated and human-based attacks. We present a comprehensive study of how specially crafted, personalized ‘babble’ noise (‘MyBabble’) can be effective at moderate signal-to-noise ratios and can provide a viable defense against microphone-based eavesdropping attacks. (An illustrative sketch of mixing noise at a target signal-to-noise ratio appears after this list.)
- Smart speakers come with always-on microphones to facilitate voice-based interaction. To address user privacy concerns, existing devices offer a number of privacy features, e.g., mute buttons and local trigger-word detection modules. But it is difficult for users to trust that these manufacturer-provided privacy features actually work, given that there is a misalignment of incentives: Google, Meta, and Amazon benefit from collecting personal data, and users know it. What’s needed is perceptible assurance — privacy features that users can, through physical perception, verify actually work. To that end, we introduce, implement, and evaluate the idea of “intentionally-powered” microphones to provide users with perceptible assurance of privacy with smart speakers. We employed an iterative design process to develop Candid Mic, a battery-free, wireless microphone that can only be powered by harvesting energy from intentional user interactions. Moreover, users can visually inspect the (dis)connection between the energy harvesting module and the microphone. Through a within-subjects experiment, we found that Candid Mic provides users with perceptible assurance about whether the microphone is capturing audio or not, and improves user trust in using smart speakers relative to mute button interfaces.
- As devices with always-on microphones located in people’s homes, smart speakers have significant privacy implications. We surveyed smart speaker owners about their beliefs, attitudes, and concerns about the recordings that are made and shared by their devices. To ground participants’ responses in concrete interactions, rather than collecting their opinions abstractly, we framed our survey around randomly selected recordings of saved interactions with their devices. We surveyed 116 owners of Amazon and Google smart speakers and found that almost half did not know that their recordings were being permanently stored and that they could review them; only a quarter reported reviewing interactions, and very few had ever deleted any. While participants did not consider their own recordings especially sensitive, they were more protective of others’ recordings (such as those of children and guests) and were strongly opposed to the use of their data by third parties or for advertising. They also considered permanent retention, the status quo, unsatisfactory. Based on our findings, we make recommendations for more agreeable data retention policies and future privacy controls.
- Intelligent voice assistants may soon become proactive, offering suggestions without being directly invoked. Such behavior increases privacy risks, since proactive operation requires continuous monitoring of conversations. To mitigate this problem, our study proposes and evaluates one potential privacy control, in which the assistant requests permission for the information it wishes to use immediately after hearing it. To find out how people would react to runtime permission requests, we recruited 23 pairs of participants to hold conversations while receiving ambient suggestions from a proactive assistant, which we simulated in real time using the Wizard of Oz technique. The interactive sessions featured different modes and designs of runtime permission requests and were followed by in-depth interviews about people's preferences and concerns. Most participants were excited about the devices despite their continuous listening, but wanted control over the assistant's actions and their own data. They generally prioritized an interruption-free experience above more fine-grained control over what the device would hear.
- Due to the ubiquity of IoT devices, privacy violations can now occur across our cyber-physical-social lives. An individual is often not aware of the possible privacy implications of their actions and commonly lacks the ability to dynamically control undesired access to themselves or their information. Present approaches to privacy management lack immediacy of feedback and action, tend to be complex and non-engaging, are intrusive and socially inappropriate, and are inconsistent with users' natural interactions with the physical and social environment. This results in ineffective end-user privacy management. To address these challenges, I focus on designing tangible systems, which promise to provide high levels of stimulation, rich feedback, and direct, engaging interaction experiences. This is achieved by designing intuitive awareness mechanisms and control interactions, conceptualizing interaction metaphors, implementing tangible interfaces for privacy management, and demonstrating their utility within various real-life scenarios.
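To make concrete what "moderate signal-to-noise ratios" means in the babble-noise defense summarized in the first related item above, here is a minimal, illustrative sketch of mixing an obfuscating noise track into a speech signal at a chosen SNR. It is not the MyBabble implementation from that work; the NumPy-based mix_at_snr helper and the synthetic placeholder signals are assumptions for illustration only.

```python
import numpy as np

def mix_at_snr(speech: np.ndarray, babble: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `babble` so the speech-to-babble power ratio equals `snr_db`, then mix."""
    # Repeat or truncate the babble track to match the speech length.
    if len(babble) < len(speech):
        babble = np.tile(babble, int(np.ceil(len(speech) / len(babble))))
    babble = babble[: len(speech)]

    speech_power = np.mean(speech ** 2)
    babble_power = np.mean(babble ** 2)

    # SNR_dB = 10 * log10(P_speech / P_babble)  =>  required babble power:
    target_babble_power = speech_power / (10 ** (snr_db / 10))
    scale = np.sqrt(target_babble_power / babble_power)
    return speech + scale * babble

# Hypothetical usage with synthetic 1-second, 16 kHz signals standing in for
# recorded speech and a personalized babble track.
rng = np.random.default_rng(0)
speech = rng.standard_normal(16000)
babble = rng.standard_normal(16000)
obscured = mix_at_snr(speech, babble, snr_db=0.0)  # 0 dB: equal speech/babble power
```

At 0 dB the babble has the same average power as the speech; lowering the SNR (e.g., to -5 or -10 dB) makes the babble louder relative to the speech and the masking correspondingly stronger.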