

Title: Forensic Analysis of Immersive Virtual Reality Social Applications: A Primary Account
Our work presents a primary account of the forensics of immersive Virtual Reality (VR) systems and their social applications. The social VR applications studied in this work are Bigscreen, Altspace VR, Rec Room, and Facebook Spaces. We explored the two most widely adopted consumer VR systems: the HTC Vive and the Oculus Rift. Our tests examined the efficacy of reconstructing evidence from network traffic as well as from the systems themselves. The results showed that a significant amount of forensically relevant data, such as user names, user profile pictures, events, and system details, can be recovered. We anticipate that this work will stimulate future research in VR and Augmented Reality (AR) forensics, an area that is understudied and needs more attention from the community.
Award ID(s):
1748950
PAR ID:
10201307
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
2018 IEEE Security and Privacy Workshops (SPW)
Page Range / eLocation ID:
186 to 196
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. As social virtual reality (VR) continues to grow as a medium for digital communication, sustaining presence among communicators remains one of the main constructs that researchers and practitioners use to assess the quality of user experience. In the present paper, we explore language patterns as a behavioral link to presence. We accomplished this through an exploratory text analysis of over 4,800 minutes of conversation in social VR, consisting of over 130,000 spoken words from 126 participants. We observed that the use of self-references and collective references positively correlated with social presence and spatial presence. Furthermore, median interpersonal distance between communicators was positively associated with the use of impersonal pronouns, suggesting that participants who stood farther apart from their interlocutors tended to speak in more impersonal terms. Our work sheds light on the possible psychological mechanisms behind presence and the potential of using speech data to help build systems that enhance user engagement.
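The correlation analysis this abstract describes can be sketched in a few lines. A minimal, hypothetical example, assuming per-participant pronoun rates and presence ratings — the numbers below are invented for illustration and are not the study's data:

```python
# Hypothetical sketch: correlating a pronoun-usage rate with a presence rating.
# Feature names and all data values are illustrative, not from the study.
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Per-participant rate of self-references (per 100 words) and a social-presence rating.
self_ref_rate = [2.1, 3.4, 1.8, 4.0, 2.9, 3.7]
social_presence = [4.2, 5.1, 3.9, 5.8, 4.6, 5.3]

r = pearson(self_ref_rate, social_presence)
print(f"r = {r:.3f}")
```

A positive r on data like this is what a "positively correlated" finding looks like numerically; the actual study would also report a significance test.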

  2. In this paper, we demonstrate the Information Interactions in Virtual Reality (IIVR) system, designed and implemented to study how users interact with abstract information objects in immersive virtual environments in the context of information retrieval. Virtual reality displays are quickly growing as social and personal computing media, and understanding user interactions in these immersive environments is imperative. As a step towards effective information retrieval on such emerging platforms, our system is central to upcoming studies that will observe how users engage in information triaging tasks in Virtual Reality (VR). In these studies, we will observe the effects of (1) information layouts and (2) types of interactions in VR. We believe this early system motivates researchers in understanding and designing meaningful interactions for future VR information retrieval applications.
  3. Virtual reality (VR) simulations have been adopted to provide controllable environments for running augmented reality (AR) experiments in diverse scenarios. However, insufficient research has explored the impact of AR applications on users, especially their attention patterns, and whether VR simulations accurately replicate these effects. In this work, we propose to analyze user attention patterns via eye tracking during XR usage. To represent applications that provide both helpful guidance and irrelevant information, we built a Sudoku Helper app that includes visual hints and potential distractions during the puzzle-solving period. We conducted two user studies with 19 different users each in AR and VR, in which we collected eye tracking data, conducted gaze-based analysis, and trained machine learning (ML) models to predict user attentional states and attention control ability. Our results show that the AR app had a statistically significant impact on enhancing attention by increasing the fixated proportion of time, while the VR app reduced fixated time and made the users less focused. Results indicate that there is a discrepancy between VR simulations and the AR experience. Our ML models achieve 99.3% and 96.3% accuracy in predicting user attention control ability in AR and VR, respectively. A noticeable performance drop when transferring models trained on one medium to the other further highlights the gap between the AR experience and the VR simulation of it. 
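The "fixated proportion of time" metric in this abstract can be illustrated with a simple velocity-threshold (I-VT) classifier over gaze samples. The sampling rate, threshold, and gaze data below are assumptions for illustration, not values from the paper:

```python
# Illustrative I-VT sketch: estimate the fixated proportion of time from gaze
# angle samples. SAMPLE_HZ, VELOCITY_THRESH, and the samples are assumptions.
import math

SAMPLE_HZ = 90           # assumed eye-tracker sampling rate
VELOCITY_THRESH = 30.0   # deg/s; slower motion counts as fixation

def fixated_proportion(gaze_angles):
    """gaze_angles: list of (yaw_deg, pitch_deg) samples taken at SAMPLE_HZ."""
    fixated = 0
    for (y0, p0), (y1, p1) in zip(gaze_angles, gaze_angles[1:]):
        speed = math.hypot(y1 - y0, p1 - p0) * SAMPLE_HZ  # angular speed, deg/s
        if speed < VELOCITY_THRESH:
            fixated += 1
    return fixated / (len(gaze_angles) - 1)

# Mostly stable gaze with one rapid saccade in the middle.
samples = [(0.0, 0.0)] * 20 + [(5.0, 2.0)] + [(5.0, 2.0)] * 20
print(f"fixated proportion: {fixated_proportion(samples):.2f}")
```

Comparing this proportion between the AR and VR conditions is the kind of analysis the abstract's "increasing the fixated proportion of time" result rests on.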
  4. Due to the growing popularity of consumer virtual reality (VR) systems and applications, researchers have been investigating how tracking and interaction data from VR applications can be used for a wide variety of purposes, including user authentication, predicting cybersickness, and estimating cognitive processing capabilities. In many cases, researchers have to develop their own VR applications to collect such data. In some cases, prior researchers have provided open datasets from their own custom VR applications. In this paper, we present CLOVR, a tool for Capturing and Logging OpenVR data from any VR application built with the OpenVR API, including closed-source consumer VR games and experiences. CLOVR provides an easy-to-use interface for collecting interaction data from OpenVR-based applications. It supports capturing and logging VR device poses, VR actions, microphone audio, VR views, VR videos, and in-VR questionnaires. To demonstrate CLOVR's capabilities, we also present six datasets of a single user experiencing six different closed-source SteamVR applications.
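To make the kind of data such a logger captures concrete, here is a hypothetical sketch of a per-frame device-pose record (timestamp, device, position, orientation quaternion) written as CSV. The schema and field names are illustrative assumptions, not CLOVR's actual format, and the OpenVR calls that would produce the poses are omitted:

```python
# Hypothetical per-frame pose record for a VR tracking logger.
# Schema and field names are assumptions, not CLOVR's real format.
import csv
import io
import time

FIELDS = ["timestamp", "device", "px", "py", "pz", "qx", "qy", "qz", "qw"]

def log_pose(writer, device, position, quaternion, t=None):
    """Append one device pose (position + orientation quaternion) as a CSV row."""
    t = time.time() if t is None else t
    writer.writerow(dict(zip(FIELDS, [t, device, *position, *quaternion])))

buf = io.StringIO()
w = csv.DictWriter(buf, fieldnames=FIELDS)
w.writeheader()
log_pose(w, "hmd", (0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0), t=0.011)
log_pose(w, "controller_right", (0.2, 1.1, -0.3), (0.0, 0.7071, 0.0, 0.7071), t=0.011)
print(buf.getvalue())
```

A real logger would stream rows like these at the headset's frame rate, one per tracked device, alongside the action, audio, and video channels the abstract lists.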
  5. Emerging Virtual Reality (VR) displays with embedded eye trackers are becoming commodity hardware (e.g., the HTC Vive Pro Eye). Eye-tracking data can be utilized for several purposes, including gaze monitoring, privacy protection, and user authentication/identification. Identifying users is an integral part of many applications due to security and privacy concerns. In this paper, we explore methods and eye-tracking features that can be used to identify users. Prior VR researchers explored machine learning on motion-based data (such as body motion, head tracking, eye tracking, and hand tracking data) to identify users. Such systems usually require an explicit VR task and many features to train the machine learning model for user identification. We propose a system to identify users utilizing minimal eye-gaze-based features without designing any identification-specific tasks. We collected gaze data from an educational VR application and tested our system with two machine learning (ML) models, random forest (RF) and k-nearest neighbors (kNN), and two deep learning (DL) models, convolutional neural networks (CNN) and long short-term memory (LSTM). Our results show that ML and DL models could identify users with over 98% accuracy with only six simple eye-gaze features. We discuss our results, their implications for security and privacy, and the limitations of our work.
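The gaze-feature identification idea can be illustrated with a toy nearest-neighbor classifier over a handful of per-session features. The feature names, values, and user labels below are invented for illustration; they are not the paper's six features or its data:

```python
# Toy kNN user identification over invented per-session gaze features.
# Features and labels are illustrative assumptions, not the study's data.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, user_id); returns majority label of k nearest."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    top = [label for _, label in dists[:k]]
    return Counter(top).most_common(1)[0][0]

# (mean fixation duration s, saccade rate /s, mean pupil diameter mm) -- illustrative
train = [
    ((0.31, 2.1, 3.4), "userA"), ((0.29, 2.3, 3.5), "userA"), ((0.33, 2.0, 3.3), "userA"),
    ((0.18, 4.0, 4.1), "userB"), ((0.20, 3.8, 4.2), "userB"), ((0.17, 4.2, 4.0), "userB"),
]
print(knn_predict(train, (0.30, 2.2, 3.4)))  # -> userA
```

The appeal of the abstract's approach is exactly this simplicity: with stable per-user gaze statistics, even a small feature vector separates users well, which is also why the authors flag the privacy implications.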