Title: Let’s SOUP up XR: Collected thoughts from an IEEE VR workshop on privacy in mixed reality
This paper presents insights from the PrXR workshop conducted at IEEE VR 2021. We identified several topic areas related to privacy and security risks for virtual, augmented, and mixed-reality (XR) applications. Risks are presented from the perspective of the XR community. We attempt to thematically group the workshop findings and highlight the challenges brought up by the participants. The identified research topics serve as a roadmap to push forward privacy and security research in the context of XR.
Award ID(s):
2026540
PAR ID:
10354958
Author(s) / Creator(s):
Date Published:
Journal Name:
VR4Sec: Security for VR and VR for Security, SOUPS 2021 Workshop
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Trustworthy data repositories ensure the security of their collections. We argue they should also ensure the security of researcher and human subject data. Here we demonstrate the use of a privacy impact assessment (PIA) to evaluate potential privacy risks to researchers, using the ICPSR’s Open Badges Research Credential System as a case study. We present our workflow and discuss potential privacy risks and possible mitigations. [This paper is a conference pre-print presented at IDCC 2020 after lightweight peer review.]
  2. Consumer Internet of Things (IoT) devices, from smart speakers to security cameras, are increasingly common in homes. Along with their benefits come potential privacy and security threats. To limit these threats, a number of commercial services (IoT safeguards) have become available. These safeguards claim to provide protection against IoT privacy risks and security threats. However, the effectiveness and the associated privacy risks of these safeguards remain key open questions. In this paper, we investigate the threat detection capabilities of IoT safeguards for the first time. We develop and release an approach for automated safeguards experimentation to reveal their response to common security threats and privacy risks. We perform thousands of automated experiments using popular commercial IoT safeguards deployed in a large IoT testbed. Our results indicate not only that these devices may be ineffective in preventing risks, but also that their cloud interactions and data collection operations may introduce privacy risks for the households that adopt them.
  3. With 'smart' technology becoming more prevalent in homes, computing is increasingly embedded into everyday life. The benefits are well-advertised, but the risks associated with these technologies are not as clearly articulated. We aim to address this gap by educating community members on some of these risks and providing actionable advice to mitigate them. To this end, we describe our efforts to design and implement a hands-on workshop for the public on smart-home security and privacy. Our workshop curriculum centers on the smart-home device lifecycle: obtaining, installing, using, and removing devices in a home. For each phase of the lifecycle, we present possible vulnerabilities along with preventative measures relevant to a general audience. Throughout the presentation, we integrate a hands-on activity for participants to put best practices into action. We ran our workshop at a science museum in June 2023 and used participant surveys to evaluate the effectiveness of our curriculum. Prior to the workshop, 38.8% of survey responses did not meet learning objectives, 22.4% partially met them, and 38.8% fully met them. After the workshop, only 9.2% of responses did not meet learning objectives, while 29.6% partially met them and 61.2% fully met them. Our experience shows that consumer-focused workshops can aid in bridging information gaps and are a promising form of outreach.
  4. Machine unlearning is a cutting-edge technology that embodies the privacy legal principle of the right to be forgotten within the realm of machine learning (ML). It aims to remove specific data or knowledge from trained models without retraining from scratch and has gained significant attention in the field of artificial intelligence in recent years. However, the development of machine unlearning research is associated with inherent vulnerabilities and threats, posing significant challenges for researchers and practitioners. In this article, we provide the first comprehensive survey of security and privacy issues associated with machine unlearning, offering a systematic classification across different levels and criteria. Specifically, we begin by investigating unlearning-based security attacks, where adversaries exploit vulnerabilities in the unlearning process to compromise the security of ML models. We then conduct a thorough examination of privacy risks associated with the adoption of machine unlearning. Additionally, we explore existing countermeasures and mitigation strategies designed to protect models from malicious unlearning-based attacks targeting both security and privacy. Further, we provide a detailed comparison between machine unlearning-based security and privacy attacks and traditional malicious attacks. Finally, we discuss promising future research directions for security and privacy issues posed by machine unlearning, offering insights into potential solutions and advancements in this evolving field.
  5.
    Universities have been forced to rely on remote educational technology to facilitate the rapid shift to online learning. In doing so, they acquire new risks of security vulnerabilities and privacy violations. To help universities navigate this landscape, we develop a model that describes the actors, incentives, and risks, informed by surveying 105 educators and 10 administrators. Next, we develop a methodology for administrators to assess security and privacy risks of these products. We then conduct a privacy and security analysis of 23 popular platforms using a combination of sociological analyses of privacy policies and 129 state laws, alongside a technical assessment of platform software. Based on our findings, we develop recommendations for universities to mitigate the risks to their stakeholders. 