Modern autonomous vehicles are increasingly infused with sensors, electronics, and software. One consequence is that they are becoming increasingly susceptible to cyber-attacks, yet awareness of cybersecurity challenges for automotive systems remains low. In this paper, we consider the problem of developing a virtual reality (VR) infrastructure that enables users who are not necessarily experts in automotive security to explore vulnerabilities arising from compromised ranging sensors. A key requirement for such platforms is to develop natural, intuitive scenarios that let the user experience security challenges and their impact. We discuss the challenges in developing such scenarios and present a solution that enables exploration of jamming and spoofing attacks. Our solution is integrated into a VR platform for automotive security exploration called IVE (Immersive Virtual Environment). It combines realistic driving with a first-person view, user interaction, and sound effects to provide the benefits of a real-life simulation without the consequences.
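To make the attack scenarios concrete, a compromised ranging sensor can be modeled as a wrapper that rewrites honest distance readings. The sketch below is a minimal illustration of jamming (saturating the sensor to its maximum range) and spoofing (injecting a crafted nearby echo); the class names and noise parameters are assumptions for illustration, not taken from the IVE implementation.

```python
# Minimal sketch of how ranging-sensor attacks might be modeled in a
# driving simulation. Names and parameters are illustrative, not IVE's.
import random


class RangingSensor:
    """Simulated ultrasonic/radar-style sensor returning distance in meters."""

    def __init__(self, max_range_m: float = 50.0):
        self.max_range_m = max_range_m

    def read(self, true_distance_m: float) -> float:
        """Return the true distance, clipped to the sensor's range."""
        return min(true_distance_m, self.max_range_m)


class CompromisedSensor(RangingSensor):
    """Wraps a sensor and injects jamming or spoofing behavior."""

    def __init__(self, attack: str = "none", spoof_distance_m: float = 2.0):
        super().__init__()
        self.attack = attack
        self.spoof_distance_m = spoof_distance_m

    def read(self, true_distance_m: float) -> float:
        if self.attack == "jam":
            # Jamming saturates the receiver: the car "sees" nothing ahead.
            return self.max_range_m
        if self.attack == "spoof":
            # Spoofing replays a crafted echo, faking a nearby obstacle.
            return self.spoof_distance_m + random.gauss(0.0, 0.05)
        return super().read(true_distance_m)


# Example: an obstacle 20 m away is perceived at ~2 m under spoofing.
sensor = CompromisedSensor(attack="spoof")
print(sensor.read(20.0))
```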
Immersive Virtual Reality Attacks and the Human Joystick
This is one of the first accounts of a security analysis of consumer immersive Virtual Reality (VR) systems. This work breaks new ground, coins new terms, and constructs proof-of-concept implementations of attacks on immersive VR. Our work used the two most widely adopted immersive VR systems, the HTC Vive and the Oculus Rift. More specifically, we were able to create attacks that can potentially disorient users, turn their Head Mounted Display (HMD) camera on without their knowledge, overlay images in their field of vision, and modify VR environmental factors that force them into hitting physical objects and walls. Finally, through a human-participant deception study, we demonstrate that VR systems can be exploited to control immersed users and move them to a location in physical space without their knowledge. We term this the Human Joystick Attack. We conclude our work with future research directions and ways to enhance the security of these systems.
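The core of a human joystick style attack is geometric: shift the virtual scene by a per-frame offset small enough to stay below the user's perceptual threshold, so the user physically walks to "re-center" themselves and thereby drifts toward an attacker-chosen point. The sketch below illustrates only that geometry; the step size and function names are assumptions, and the paper's proof of concept manipulates actual HMD runtimes rather than this abstraction.

```python
# Sketch of the geometry behind a human-joystick-style attack: translate
# the virtual world by a sub-perceptual amount each frame so the user
# walks toward an attacker-chosen physical target. Thresholds are assumed.
import math


def world_offset_step(user_xy, target_xy, step_m=0.002):
    """Return a small translation of the virtual world that nudges the
    user toward target_xy in physical space."""
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return (0.0, 0.0)
    # Shift the world opposite to the desired physical motion, so the
    # user steps "back to center", i.e., toward the attacker's target.
    return (-dx / dist * step_m, -dy / dist * step_m)


# Example: accumulate offsets over 60 frames (~1 s at 60 Hz).
user, target = (0.0, 0.0), (1.0, 0.0)
offset = [0.0, 0.0]
for _ in range(60):
    ox, oy = world_offset_step(user, target)
    offset[0] += ox
    offset[1] += oy
print(offset)  # ~(-0.12, 0.0): the world drifted 12 cm in one second
```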
- Award ID(s): 1748950
- PAR ID: 10113849
- Date Published:
- Journal Name: IEEE Transactions on Dependable and Secure Computing
- ISSN: 1545-5971
- Page Range / eLocation ID: 1 to 1
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
During active shooter events or emergencies, the ability of security personnel to respond appropriately is driven by pre-existing knowledge and skills, but also depends on their state of mind and familiarity with similar scenarios. Human behavior becomes unpredictable when it comes to decision-making in emergency situations, and the cost and risk of determining these behavioral characteristics in real emergencies are very high. This paper presents an immersive collaborative virtual reality (VR) environment for performing virtual building evacuation drills and active shooter training scenarios using Oculus Rift head-mounted displays. The collaborative immersive environment is implemented in Unity 3D and is based on the run, hide, and fight modes of emergency response. The immersive collaborative VR environment also offers a unique method of emergency training for campus safety. Participants can enter the collaborative VR environment, set up on the cloud, and take part in the active shooter response training, which offers considerable cost advantages over large-scale real-life exercises. A presence questionnaire in the user study was used to evaluate the effectiveness of our immersive training module. The results show that a majority of users agreed that their sense of presence was increased when using the immersive emergency response training environment.
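As a rough illustration of the run, hide, and fight modes mentioned above, the decision logic can be sketched as a minimal state machine. The states and transition rules here are assumptions for illustration, not the training module's actual implementation (which is built in Unity 3D):

```python
# Illustrative state machine for the "run, hide, fight" response modes.
# States and transition rules are assumed, not taken from the module.
from enum import Enum, auto


class ResponseMode(Enum):
    RUN = auto()
    HIDE = auto()
    FIGHT = auto()


def choose_mode(exit_reachable: bool, cover_nearby: bool) -> ResponseMode:
    """Pick a response mode from the trainee's immediate situation."""
    if exit_reachable:
        return ResponseMode.RUN    # evacuate if a safe route exists
    if cover_nearby:
        return ResponseMode.HIDE   # shelter when escape is blocked
    return ResponseMode.FIGHT      # last resort: confront the threat


print(choose_mode(exit_reachable=False, cover_nearby=True))  # ResponseMode.HIDE
```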
-
While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
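As a toy illustration of the decoding idea, windows of EMG features can be regressed onto per-finger forces with a small neural network. The synthetic data, feature dimensions, and hyperparameters below are placeholders; the paper's 3.3% mean error comes from real forearm EMG recordings, not from this sketch:

```python
# Toy EMG-to-force decoder: regress windows of EMG features onto
# per-finger forces with a small neural network. All data is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Fake dataset: 8 EMG channels x 4 window statistics = 32 features,
# regressed onto forces for 5 fingers (all values synthetic).
X = rng.normal(size=(2000, 32))
true_W = rng.normal(size=(32, 5))
y = np.tanh(X @ true_W) + 0.05 * rng.normal(size=(2000, 5))

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X[:1500], y[:1500])

# Report mean absolute error on held-out windows.
pred = model.predict(X[1500:])
print("held-out MAE:", np.abs(pred - y[1500:]).mean())
```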
-
Immersive environments enable users to engage in embodied interaction, enhancing the sensemaking processes involved in completing tasks such as immersive analytics. Previous comparative studies on immersive analytics using augmented and virtual realities have revealed that users employ different strategies for data interpretation and text-based analytics depending on the environment. Our study seeks to investigate how augmented and virtual reality influence sensemaking processes in quantitative immersive analytics. Our results, derived from a diverse group of participants, indicate that users demonstrate comparable performance in both environments. However, it was observed that users exhibit a higher tolerance for cognitive load in VR and travel further in AR. Based on our findings, we recommend providing users with the option to switch between AR and VR, thereby enabling them to select an environment that aligns with their preferences and task requirements.
-
Emerging Virtual Reality (VR) displays with embedded eye trackers are currently becoming commodity hardware (e.g., HTC Vive Pro Eye). Eye-tracking data can be utilized for several purposes, including gaze monitoring, privacy protection, and user authentication/identification. Identifying users is an integral part of many applications due to security and privacy concerns. In this paper, we explore methods and eye-tracking features that can be used to identify users. Prior VR researchers explored machine learning on motion-based data (such as body motion, head tracking, eye tracking, and hand tracking data) to identify users. Such systems usually require an explicit VR task and many features to train the machine learning model for user identification. We propose a system to identify users utilizing minimal eye-gaze-based features without designing any identification-specific tasks. We collected gaze data from an educational VR application and tested our system with two machine learning (ML) models, random forest (RF) and k-nearest-neighbors (kNN), and two deep learning (DL) models, convolutional neural networks (CNN) and long short-term memory (LSTM). Our results show that ML and DL models could identify users with over 98% accuracy with only six simple eye-gaze features. We discuss our results, their implications on security and privacy, and the limitations of our work.
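A minimal sketch of this identification pipeline appears below, using the two classical models named above (random forest and kNN) on synthetic data standing in for the six gaze features; the feature choices and resulting accuracy are illustrative, not the study's results:

```python
# Gaze-based user identification with random forest and kNN on synthetic
# data. Six per-window gaze features are assumed (e.g., mean/std of gaze
# yaw and pitch, fixation duration, saccade rate) for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_users, samples_per_user = 10, 200

# Synthetic feature windows: each user's gaze statistics cluster around
# a user-specific mean, standing in for real eye-tracking logs.
X = np.vstack([rng.normal(loc=u, scale=1.0, size=(samples_per_user, 6))
               for u in range(n_users)])
y = np.repeat(np.arange(n_users), samples_per_user)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

for clf in (RandomForestClassifier(n_estimators=200, random_state=0),
            KNeighborsClassifier(n_neighbors=5)):
    acc = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(type(clf).__name__, f"accuracy: {acc:.3f}")
```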