Title: Predictive Caching for AR/VR Experiences in a Household Scenario
Augmented/virtual reality (AR/VR) technologies can be deployed in a household environment for applications such as checking weather or traffic reports, watching a news summary, or attending classes. Since AR/VR applications are highly delay sensitive, delivering this kind of content at maximum quality can be very challenging. In this paper, we consider users who go through a series of AR/VR experience units that can be delivered at different quality levels. To maximize the quality of the experience while minimizing the cost of delivering it, we aim to predict users' behavior in the home and the experiences they are interested in at specific moments in time. We describe a deep-learning-based technique to predict users' requests from AR/VR devices and optimize the local caching of experience units. We evaluate the performance of the proposed technique on two real-world datasets and compare our results with other baselines. Our results show that predicting users' requests can improve the quality of experience and decrease the cost of delivery.
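The abstract does not detail the caching mechanics, so the following is a minimal, hypothetical sketch of the general idea only: a learned predictor assigns request probabilities to upcoming experience units, and the home cache is filled greedily with the (unit, quality level) pairs that offer the highest expected quality of experience per megabyte. Every name, number, and the greedy policy itself are illustrative assumptions, not the paper's actual deep learning model or optimization.

# Hypothetical sketch of prediction-driven local caching for AR/VR experience
# units; names and the greedy policy are illustrative, not taken from the paper.
from dataclasses import dataclass

@dataclass
class ExperienceUnit:
    unit_id: str
    sizes_mb: dict      # quality level -> size in MB, e.g. {"low": 20, "high": 80}
    utility: dict       # quality level -> quality-of-experience score

def greedy_prefetch(units, request_probs, cache_budget_mb):
    """Cache the (unit, quality) pairs with the best expected QoE per MB,
    given predicted request probabilities for each unit."""
    candidates = []
    for u in units:
        p = request_probs.get(u.unit_id, 0.0)
        for level, size in u.sizes_mb.items():
            expected_gain = p * u.utility[level]     # expected QoE if cached at this level
            candidates.append((expected_gain / size, u.unit_id, level, size))
    candidates.sort(reverse=True)                    # best expected QoE per MB first

    plan, used, chosen = [], 0.0, set()
    for _, unit_id, level, size in candidates:
        if unit_id in chosen or used + size > cache_budget_mb:
            continue                                 # already cached, or over budget
        plan.append((unit_id, level))
        chosen.add(unit_id)
        used += size
    return plan

# The probabilities below stand in for the output of the request predictor.
units = [
    ExperienceUnit("weather", {"low": 20, "high": 80},  {"low": 1.0, "high": 3.0}),
    ExperienceUnit("news",    {"low": 50, "high": 200}, {"low": 1.5, "high": 4.0}),
]
print(greedy_prefetch(units, {"weather": 0.8, "news": 0.3}, cache_budget_mb=150))

In a real system the request probabilities would come from the trained model and the prefetch plan would be recomputed as predictions change; the sketch only shows how predicted requests and a cache budget can be combined.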
Award ID(s):
1800961
PAR ID:
10159366
Author(s) / Creator(s):
Date Published:
Journal Name:
2020 International Conference on Computing, Networking and Communications (ICNC)
Page Range / eLocation ID:
591 to 595
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Virtual reality (VR) simulations have been adopted to provide controllable environments for running augmented reality (AR) experiments in diverse scenarios. However, insufficient research has explored the impact of AR applications on users, especially their attention patterns, and whether VR simulations accurately replicate these effects. In this work, we propose to analyze user attention patterns via eye tracking during XR usage. To represent applications that provide both helpful guidance and irrelevant information, we built a Sudoku Helper app that includes visual hints and potential distractions during the puzzle-solving period. We conducted two user studies with 19 different users each in AR and VR, in which we collected eye tracking data, conducted gaze-based analysis, and trained machine learning (ML) models to predict user attentional states and attention control ability. Our results show that the AR app had a statistically significant impact on enhancing attention by increasing the fixated proportion of time, while the VR app reduced fixated time and made the users less focused. Results indicate that there is a discrepancy between VR simulations and the AR experience. Our ML models achieve 99.3% and 96.3% accuracy in predicting user attention control ability in AR and VR, respectively. A noticeable performance drop when transferring models trained on one medium to the other further highlights the gap between the AR experience and the VR simulation of it. 
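As a rough companion to the pipeline described above, the sketch below shows one generic way to go from gaze samples to an attentional-state classifier: compute simple window-level features such as the fixated proportion of time, then fit an off-the-shelf model. The feature set, fixation threshold, labels, and synthetic data are assumptions for illustration, not the study's actual features or models.

# Illustrative gaze-feature + classifier sketch (not the study's pipeline).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gaze_features(gaze_speed_deg_s, fixation_threshold=30.0):
    """Toy features for one window of gaze angular speeds (deg/s);
    samples below the threshold are treated as fixations."""
    fixated = gaze_speed_deg_s < fixation_threshold
    return np.array([
        fixated.mean(),              # fixated proportion of time
        gaze_speed_deg_s.mean(),     # average gaze speed
        gaze_speed_deg_s.std(),      # gaze speed variability
    ])

# Synthetic stand-in data; real input would be eye-tracker logs from the AR/VR sessions.
rng = np.random.default_rng(0)
X = np.stack([gaze_features(rng.gamma(2.0, 15.0, size=500)) for _ in range(40)])
y = rng.integers(0, 2, size=40)      # toy labels: 0 = distracted, 1 = focused

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:5]))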
  2. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise the user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings pushing research forward towards more realistic physicality in future VR/AR.
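The interface itself combines hardware with a trained model, but the decoding step can be illustrated generically: extract per-channel features (for example RMS amplitude) from a short EMG window and regress finger-wise forces with a small neural network. The channel count, window length, features, and synthetic data below are assumptions for illustration, not the paper's actual sensor setup or model.

# Illustrative EMG-to-force regression sketch (not the paper's model).
import numpy as np
from sklearn.neural_network import MLPRegressor

def emg_window_features(emg_window):
    """emg_window: (n_samples, n_channels) raw EMG; return per-channel RMS."""
    return np.sqrt((emg_window ** 2).mean(axis=0))

# Synthetic training data: 8 forearm EMG channels -> 5 finger forces (arbitrary units).
rng = np.random.default_rng(1)
windows = rng.normal(size=(200, 256, 8))              # 200 windows of 256 samples each
X = np.stack([emg_window_features(w) for w in windows])
W_true = rng.normal(size=(8, 5))
Y = X @ W_true + 0.05 * rng.normal(size=(200, 5))     # toy finger-force targets

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X, Y)
print(model.predict(X[:1]))                           # estimated five-finger forces for one window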
  3. By allowing people to manipulate digital content placed in the real world, Augmented Reality (AR) provides immersive and enriched experiences in a variety of domains. Despite its increasing popularity, providing a seamless AR experience under bandwidth fluctuations is still a challenge, since delivering these experiences at photorealistic quality with minimal latency requires high bandwidth. Streaming approaches have already been proposed to solve this problem, but they require accurate prediction of the user's field of view so that only the regions of the scene the user is most likely to watch are streamed. To solve this prediction problem, we study in this paper the watching behavior of users exploring different types of AR scenes via mobile devices. To this end, we introduce the ACE Dataset, the first dataset collecting movement data of 50 users exploring 5 different AR scenes. We also propose a four-feature taxonomy for AR scene design, which allows categorizing different types of AR scenes in a methodical way and supports further research in this domain. Motivated by the ACE Dataset analysis results, we develop a novel user visual attention prediction algorithm that jointly utilizes users' historical movements and digital objects' positions in the AR scene. The evaluation on the ACE Dataset shows that the proposed approach outperforms baseline approaches under prediction horizons of variable lengths and can therefore benefit the AR ecosystem in terms of bandwidth reduction and improved quality of users' experience.
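The abstract does not give the algorithm itself, so the fragment below only illustrates the joint idea in the simplest possible form: extrapolate the user's recent viewing direction and bias the prediction toward the directions of nearby AR objects, which tend to attract attention. The linear-motion assumption, the blending weight, and all numbers are placeholders, not the paper's method.

# Illustrative joint movement-plus-object-position prediction sketch.
import numpy as np

def predict_view_direction(history, object_dirs, horizon_s, dt_s, obj_weight=0.3):
    """history: (T, 3) unit view-direction vectors sampled every dt_s seconds.
    object_dirs: (K, 3) unit vectors from the user toward AR objects."""
    velocity = (history[-1] - history[-2]) / dt_s      # recent change in view direction
    extrapolated = history[-1] + velocity * horizon_s  # where the motion is heading
    attention_pull = object_dirs.mean(axis=0)          # crude "objects attract gaze" term
    blended = (1 - obj_weight) * extrapolated + obj_weight * attention_pull
    return blended / np.linalg.norm(blended)           # normalize back to a unit direction

history = np.array([[0.00, 0.0, 1.000],
                    [0.05, 0.0, 0.999]])               # looking forward, turning slightly right
objects = np.array([[0.3,  0.1, 0.95],
                    [0.2, -0.1, 0.97]])
print(predict_view_direction(history, objects, horizon_s=0.5, dt_s=0.1))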
  4. Virtual Reality (VR), together with the network infrastructure, can provide an interactive and immersive experience for multiple users simultaneously and thus enables collaborative VR applications (e.g., a VR-based classroom). However, a satisfactory user experience requires not only high-resolution panoramic image rendering but also extremely low latency and seamless delivery. Moreover, competition for limited network resources (e.g., multiple users sharing the total available bandwidth) poses a significant challenge to the collaborative user experience, particularly over wireless networks with time-varying capacity. While existing works have tackled some of these challenges, a principled design considering all those factors is still missing. In this paper, we formulate a combinatorial optimization problem to maximize the Quality of Experience (QoE), defined as a linear combination of the quality, the average VR content delivery delay, and the variance of the quality over a finite time horizon. In particular, we incorporate the influence of imperfect motion prediction when considering the quality of the perceived content. However, the optimal solution to this problem cannot be implemented in real time since it relies on knowledge of the future. We therefore decompose the optimization problem into a series of per-time-slot combinatorial optimizations and develop a low-complexity algorithm that achieves at least 1/2 of the optimal value. Trace-based simulation results further reveal that our algorithm performs very close to the optimal offline solution. Furthermore, we implement the proposed algorithm in a practical system with commercial mobile devices and demonstrate its superior performance over state-of-the-art algorithms. We open-source our implementation at https://github.com/SNeC-Lab-PSU/ICDCS-CollaborativeVR.
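The abstract gives the shape of the QoE objective but not its notation; one plausible rendering of that linear combination of quality, average delivery delay, and quality variance over a finite horizon, written with assumed symbols (delivered quality q_t, delay d_t, weights alpha and beta, horizon T), is:

% Assumed notation for illustration only; symbols and weights are not from the paper.
\[
  \mathrm{QoE} \;=\; \frac{1}{T}\sum_{t=1}^{T} q_t
  \;-\; \alpha \, \frac{1}{T}\sum_{t=1}^{T} d_t
  \;-\; \beta \, \operatorname{Var}\!\left(q_1,\dots,q_T\right)
\]

Higher average quality raises QoE, while delay and quality fluctuation lower it; the per-time-slot decomposition described in the abstract can then operate on the per-slot contributions to such an objective.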
  5. Virtual Reality (VR) has become one of the prominent emerging technologies of the past decade for improving the quality of human experiences. It has exciting and popular applications in entertainment, sports, education, and even the digital documentation of notable or historical sites, allowing users to immerse themselves in an alternate reality. By combining the principles of software development and immersive VR, real-life VR experiences seek to transport users to an interactive environment where they can view, observe, and experience historical events and artifacts in a new way. VR development for cultural and historical sites involves several steps that require a solid understanding of adaptable and scalable design. This paper reviews the VR development process used in notable historic preservation VR projects; the same process can be used to create immersive VR experiences for other cultural sites.