Social media applications have benefited users in several ways, including ease of communication and quick access to information. However, they have also introduced several privacy and safety risks. These risks are particularly concerning in the context of interpersonal attacks, which are carried out by abusive friends, family members, intimate partners, co-workers, or even strangers. Evidence shows interpersonal attackers regularly exploit social media platforms to harass and spy on their targets. To help protect targets from such attacks, social media platforms have introduced several privacy and safety features. However, it is unclear how effective they are against interpersonal threats. In this work, we analyzed ten popular social media applications, identifying 100 unique privacy and safety features that provide controls across eight categories: discoverability, visibility, saving and sharing, interaction, self-censorship, content moderation, transparency, and reporting. We simulated 59 different attack actions by a persistent attacker — aimed at account discovery, information gathering, non-consensual sharing, and harassment — and found many were successful. Based on our findings, we proposed improvements to mitigate these risks.
Privacy threats in intimate relationships
Abstract: This article provides an overview of intimate threats: a class of privacy threats that can arise within our families, romantic partnerships, close friendships, and caregiving relationships. Many common assumptions about privacy are upended in the context of these relationships, and many otherwise effective protective measures fail when applied to intimate threats. Those closest to us know the answers to our secret questions, have access to our devices, and can exercise coercive power over us. We survey a range of intimate relationships and describe their common features. Based on these features, we explore implications for both technical privacy design and policy, and offer design recommendations for ameliorating intimate privacy risks.
- Award ID(s): 1916096
- PAR ID: 10192797
- Date Published:
- Journal Name: Journal of Cybersecurity
- Volume: 6
- Issue: 1
- ISSN: 2057-2085
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Consumer Internet of Things (IoT) devices, from smart speakers to security cameras, are increasingly common in homes. Along with their benefits come potential privacy and security threats. To limit these threats, a number of commercial services (IoT safeguards) have become available. These safeguards claim to protect against IoT privacy risks and security threats, but their effectiveness, and the privacy risks they themselves introduce, remain key open questions. In this paper, we investigate the threat detection capabilities of IoT safeguards for the first time. We develop and release an approach for automated safeguard experimentation that reveals their response to common security threats and privacy risks. We perform thousands of automated experiments with popular commercial IoT safeguards deployed in a large IoT testbed. Our results indicate not only that these devices may be ineffective in preventing risks, but also that their cloud interactions and data collection operations may introduce privacy risks for the households that adopt them.
Set Cover is a fundamental problem in combinatorial optimization that has been studied for decades due to its applications across multiple domains. In many of these domains, the input data consists of locations, relationships, and other sensitive information about individuals, which may be leaked through the set cover output. Attempts have been made to design privacy-preserving algorithms that solve the Set Cover problem under privacy constraints. Under differential privacy, the Set Cover problem has been proved to have strong impossibility results, and no explicit form of the output can be released to the public. In this work, we observe that these hardness results dissolve when we turn to the Partial Set Cover problem, where we only need to cover a constant fraction of the elements. We show that this relaxation enables us to avoid the impossibility results and give the first algorithm that outputs an explicit set cover with non-trivial utility guarantees under differential privacy. Using our algorithm as a subroutine, we design a differentially private bicriteria algorithm for a recently proposed facility-location problem for vaccine distribution that generalizes k-supplier with outliers. Our analysis shows that relaxing the covering requirement also allows us to circumvent the inherent hardness of k-supplier and give the first nontrivial guarantees.
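To make the Partial Set Cover relaxation concrete, the sketch below shows a plain (non-private) greedy heuristic that stops once a target fraction of the universe is covered. This is only an illustration of the relaxed objective; it is not the paper's differentially private algorithm, and the function name and structure are assumptions for exposition.

```python
def greedy_partial_set_cover(universe, sets, fraction=0.9):
    """Greedily choose sets until `fraction` of `universe` is covered.

    Illustrative, non-private sketch of the Partial Set Cover objective:
    unlike full Set Cover, we may stop once a constant fraction of the
    elements is covered.
    """
    target = fraction * len(universe)
    covered = set()
    chosen = []
    remaining = {i: set(s) for i, s in enumerate(sets)}
    while len(covered) < target and remaining:
        # Pick the set covering the most still-uncovered elements.
        best = max(remaining, key=lambda i: len(remaining[i] - covered))
        if not remaining[best] - covered:
            break  # no set adds new elements; cannot make progress
        covered |= remaining.pop(best)
        chosen.append(best)
    return chosen, covered
```

Because the loop may terminate before every element is covered, the uncovered remainder is exactly the slack that, per the abstract, lets a private algorithm sidestep the impossibility results that hold for exact Set Cover.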
Security design choices often fail to take into account users' social context. Our work is among the first to examine security behavior in romantic relationships. We surveyed 195 people on Amazon Mechanical Turk about their relationship status and account sharing behavior for a cross-section of popular websites and apps (e.g., Netflix, Amazon Prime). We examine differences in account sharing behavior at different stages in a relationship and for people in different age groups and income levels. We also present a taxonomy of sharing motivations and behaviors based on the iterative coding of open-ended responses. Based on this taxonomy, we present design recommendations to support end users in three relationship stages: when they start sharing access with romantic partners; when they are maintaining that sharing; and when they decide to stop. Our findings contribute to the field of usable privacy and security by enhancing our understanding of security and privacy behaviors and needs in intimate social relationships.
Older adults are increasingly becoming adopters of digital technologies, such as smartphones; however, this population remains particularly vulnerable to digital privacy and security threats. To date, most research on technology use among older adults focuses on helping individuals overcome their discomfort or lack of expertise with technology to protect them from such threats. Instead, we are interested in how communities of older adults work together to collectively manage their digital privacy and security. We surveyed 67 individuals across two older adult communities (59 older adults and eight employees or volunteers) and found that a community's collective efficacy for privacy and security, that is, the group's mutual belief in its ability to achieve this shared goal, was significantly correlated with individuals' self-efficacy, power usage of technology, and sense of community belonging. Using social network analysis, we further unpacked these relationships to show that many older adults interact with others who have similar technological expertise, and that closer-knit older adult communities with low technology expertise (i.e., low power usage and self-efficacy) may increase their collective efficacy for privacy and security by embedding facilitators (e.g., employees or volunteers) with more technical expertise within their communities. Our work demonstrates how both peer influence and outside expertise can be leveraged to support older adults in managing their digital privacy and security.