Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- There has been growing recognition of the crucial role users, especially those from marginalized groups, play in uncovering harmful algorithmic biases. However, it remains unclear how users' identities and experiences might impact their ratings of harmful biases. We present an online experiment (N=2,197) examining these factors: demographics, discrimination experiences, and social and technical knowledge. Participants were shown examples of image search results, including ones that previous literature has identified as biased against marginalized racial, gender, or sexual orientation groups. We found that participants from marginalized gender or sexual orientation groups were more likely to rate the examples as more severely harmful; we did not observe a similar pattern for marginalized racial groups. Additional factors affecting users' ratings included discrimination experiences and having friends or family belonging to marginalized demographics. A qualitative analysis offers insights into users' bias recognition and why they see biases the way they do. We provide guidance for designing future methods to support effective user-driven auditing.
- Apple announced the introduction of app privacy details to their App Store in December 2020, marking the first ever real-world, large-scale deployment of the privacy nutrition label concept, which had been introduced by researchers over a decade earlier. The Apple labels are created by app developers, who self-report their app's data practices. In this paper, we present the first study examining the usability and understandability of Apple's privacy nutrition label creation process from the developer's perspective. By observing and interviewing 12 iOS app developers about how they created the privacy label for a real-world app that they developed, we identified common challenges for correctly and efficiently creating privacy labels. We discuss design implications both for improving Apple's privacy label design and for future deployment of other standardized privacy notices.
- Account sharing is a common, if officially unsanctioned, practice among workgroups, but so far understudied in higher education. We interview 23 workgroup members about their account sharing practices at a U.S. university. Our study is the first to explicitly compare IT and non-IT observations of account sharing as a "normal and easy" workgroup practice, as well as to compare student practices with those of full-time employees. We contrast our results with those in prior works and offer recommendations for security design and for IT messaging. Our findings that participants perceive account sharing as low risk, and that security is seen as secondary to other priorities, offer insights into the gap between technical affordances and social needs in an academic workplace such as this.
- We present Peekaboo, a new privacy-sensitive architecture for smart homes that leverages an in-home hub to pre-process and minimize outgoing data in a structured and enforceable manner before sending it to external cloud servers. Peekaboo's key innovations are (1) abstracting common data pre-processing functionality into a small and fixed set of chainable operators, and (2) requiring that developers explicitly declare desired data collection behaviors (e.g., data granularity, destinations, conditions) in an application manifest, which also specifies how the operators are chained together. Given a manifest, Peekaboo assembles and executes a pre-processing pipeline using operators pre-loaded on the hub. In doing so, developers can collect smart home data on a need-to-know basis; third-party auditors can verify data collection behaviors; and the hub itself can offer a number of centralized privacy features to users across apps and devices, without additional effort from app developers. We present the design and implementation of Peekaboo, along with an evaluation of its coverage of smart home scenarios, system performance, data minimization, and example built-in privacy features.
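To illustrate the manifest-and-operators idea described in the Peekaboo abstract, the following is a minimal sketch, not Peekaboo's actual API: the manifest fields, operator names, and data shapes here are all assumptions made for illustration. A manifest declares a chain of operators; the hub assembles and runs that chain so only the minimized result could leave the home.

```python
# Illustrative sketch of a manifest-driven pre-processing pipeline.
# All names (manifest fields, operator names) are hypothetical, not Peekaboo's.

# Hypothetical manifest: declares destination, condition, and operator chain.
manifest = {
    "destination": "https://example-cloud.invalid/occupancy",  # where results may go
    "condition": "daytime",                                    # when collection is allowed
    "pipeline": [
        {"op": "detect", "args": {"target": "person"}},  # keep only person detections
        {"op": "count", "args": {}},                     # reduce to a single number
    ],
}

# A small, fixed set of chainable operators pre-loaded on the hub (assumed names).
OPERATORS = {
    "detect": lambda frame, target: [d for d in frame if d["label"] == target],
    "count": lambda items: len(items),
}

def run_pipeline(manifest, raw_frame):
    """Assemble and execute the declared operator chain over raw sensor data."""
    data = raw_frame
    for step in manifest["pipeline"]:
        data = OPERATORS[step["op"]](data, **step["args"])
    return data  # only this minimized value would be sent to the destination

# Example: a camera frame with three detections is reduced to an occupancy count.
frame = [{"label": "person"}, {"label": "cat"}, {"label": "person"}]
print(run_pipeline(manifest, frame))  # prints 2
```

Because the pipeline is declared in data rather than arbitrary code, a third party could audit the manifest to verify what leaves the home, which is the enforceability property the abstract describes.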
- Since December 2020, the Apple App Store has required all developers to create a privacy label when submitting new apps or app updates. However, there has not been a comprehensive study on how developers responded to this requirement. We present the first measurement study of Apple privacy nutrition labels to understand how apps on the U.S. App Store create and update privacy labels. We collected weekly snapshots of the privacy label and other metadata for all the 1.4 million apps on the U.S. App Store from April 2 to November 5, 2021. Our analysis showed that 51.6% of apps still did not have a privacy label as of November 5, 2021. Although 35.3% of old apps had created a privacy label, only 2.7% of old apps created a privacy label without app updates (i.e., voluntary adoption). Our findings suggest that inactive apps have little incentive to create privacy labels.
- In this paper, we studied people's smart home privacy-protective behaviors (SH-PPBs) to gain a better understanding of their privacy management do's and don'ts in this context. We first surveyed 159 participants and elicited 33 unique SH-PPB practices, revealing that users heavily rely on ad hoc approaches at the physical layer (e.g., physical blocking, manual powering off). We also characterized the types of privacy concerns users wanted to address through SH-PPBs, the reasons preventing users from doing SH-PPBs, and privacy features they wished they had to support SH-PPBs. We then storyboarded 11 privacy protection concepts to explore opportunities to better support users' needs, and asked another 227 participants to critique and rank these design concepts. Among the 11 concepts, Privacy Diagnostics, which is similar to security diagnostics in anti-virus software, was far preferred over the rest. We also saw rich evidence of four important factors in designing SH-PPB tools, as users prefer (1) simple, (2) proactive, (3) preventative solutions that can (4) offer more control.
- User adoption of security and privacy (S&P) best practices remains low, despite sustained efforts by researchers and practitioners. Social influence is a proven method for guiding user S&P behavior, though most work has focused on studying peer influence, which is only possible with a known social graph. In a study of 104 Facebook users, we instead demonstrate that crowdsourced S&P suggestions are significantly influential. We also tested how reflective writing affected participants' S&P decisions, with and without suggestions. With reflective writing, participants were less likely to accept suggestions, both social and Facebook default suggestions. Of particular note, when reflective-writing participants were shown the Facebook default suggestion, they not only rejected it but also (unknowingly) configured their settings in accordance with expert recommendations. Our work suggests that both non-personal social influence and reflective writing can positively influence users' S&P decisions, but that the two can interact negatively.
An official website of the United States government