Smartphones have recently become a popular platform for deploying computation-intensive virtual reality (VR) applications such as immersive video streaming (a.k.a. 360-degree video streaming). One specific challenge for smartphone-based head-mounted displays (HMDs) is reducing the potentially huge power consumption caused by immersive video. To address this challenge, we first conduct an empirical power measurement study on a typical smartphone immersive streaming system, which identifies the major sources of power consumption. We then develop QuRate, a quality-aware and user-centric frame rate adaptation mechanism that tackles the power consumption issue in immersive video streaming. QuRate optimizes immersive video power consumption by modeling the correlation between perceivable video quality and user behavior. Specifically, QuRate builds on the user's reduced level of concentration on video frames during view switching and dynamically adjusts the frame rate without impacting the perceivable video quality. We evaluate QuRate with a comprehensive set of experiments involving 5 smartphones, 21 users, and 6 immersive videos, using empirical user head-movement traces. Our experimental results demonstrate that QuRate is capable of extending smartphone battery life by up to 1.24X while maintaining the perceivable video quality during immersive video streaming. We also conduct an Institutional Review Board (IRB)-approved subjective user study to further validate that QuRate has minimal impact on perceivable video quality.
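The abstract above does not spell out QuRate's exact adaptation rule. Purely as a hypothetical illustration of the general idea of lowering the rendering frame rate while the viewer's head is moving quickly, the sketch below maps head angular velocity to a target frame rate; the threshold and the two rate levels are assumptions, not values taken from the paper.

```python
# Hypothetical sketch of head-motion-driven frame rate adaptation.
# Thresholds and rate levels are illustrative, not taken from QuRate.

def select_frame_rate(angular_velocity_deg_s: float,
                      full_rate: int = 60,
                      reduced_rate: int = 30,
                      switch_threshold: float = 90.0) -> int:
    """Return a target frame rate given the viewer's head angular velocity.

    While the head turns faster than `switch_threshold` (deg/s), the viewer is
    assumed to perceive less detail, so a reduced rate is used to save power.
    """
    if angular_velocity_deg_s >= switch_threshold:
        return reduced_rate
    return full_rate


if __name__ == "__main__":
    # Example head-movement trace: (timestamp in s, angular velocity in deg/s).
    trace = [(0.0, 5.0), (0.5, 120.0), (1.0, 200.0), (1.5, 40.0), (2.0, 2.0)]
    for t, vel in trace:
        print(f"t={t:.1f}s  velocity={vel:6.1f} deg/s  ->  {select_frame_rate(vel)} fps")
```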
Environment-driven mmWave Beamforming for Multi-user Immersive Applications
This position paper explores the challenges and opportunities for high-quality immersive volumetric video streaming for multiple users over millimeter-wave (mmWave) WLANs. While most previous work has focused on single-user streaming, there is a growing need for multi-user immersive applications such as virtual collaboration, classroom education, and teleconferencing. While mmWave wireless links can provide multi-gigabit-per-second data rates, they suffer from blockages and high beamforming overhead. This paper investigates an environment-driven approach to address these challenges. It presents a comprehensive research agenda that includes developing a collaborative 3D scene reconstruction process, material identification, ray tracing, blockage mitigation, and cross-layer multi-user video rate adaptation. Our preliminary results show the feasibility of the approach and identify the limitations of existing solutions. Finally, we discuss the open challenges of implementing a practical system based on the proposed research agenda.
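The research agenda above is described at a high level rather than as a concrete algorithm. As an illustrative sketch only, the snippet below shows the kind of environment-driven check a path selector could perform: test candidate propagation paths (direct, or via a reflector) against obstacles taken from a reconstructed scene and pick the shortest unblocked one. The 2D geometry, obstacle model, and reflector list are hypothetical simplifications, not the paper's method.

```python
# Illustrative, simplified 2D sketch of environment-driven path selection:
# prefer candidate paths that do not intersect obstacles reconstructed from
# the scene. Geometry, obstacle model, and reflectors are hypothetical.

import math
from typing import List, Tuple

Point = Tuple[float, float]

def segment_blocked(a: Point, b: Point, center: Point, radius: float) -> bool:
    """Return True if the segment a-b passes within `radius` of an obstacle center."""
    ax, ay = a; bx, by = b; cx, cy = center
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0:
        return math.hypot(cx - ax, cy - ay) <= radius
    # Project the obstacle center onto the segment and clamp to [0, 1].
    t = max(0.0, min(1.0, ((cx - ax) * dx + (cy - ay) * dy) / length_sq))
    px, py = ax + t * dx, ay + t * dy
    return math.hypot(cx - px, cy - py) <= radius

def pick_path(ap: Point, user: Point, reflectors: List[Point],
              obstacles: List[Tuple[Point, float]]) -> Tuple[str, float]:
    """Choose the shortest unblocked path: direct, or via a single reflector."""
    candidates = [("direct", [ap, user])]
    candidates += [(f"reflector@{r}", [ap, r, user]) for r in reflectors]
    best = None
    for name, waypoints in candidates:
        blocked = any(segment_blocked(p, q, c, rad)
                      for p, q in zip(waypoints, waypoints[1:])
                      for c, rad in obstacles)
        if blocked:
            continue
        length = sum(math.dist(p, q) for p, q in zip(waypoints, waypoints[1:]))
        if best is None or length < best[1]:
            best = (name, length)
    return best if best else ("none", float("inf"))

if __name__ == "__main__":
    ap, user = (0.0, 0.0), (10.0, 0.0)
    obstacles = [((5.0, 0.0), 1.0)]   # e.g., a person standing in the direct path
    reflectors = [(5.0, 4.0)]         # e.g., a reflective wall point from the scene model
    print(pick_path(ap, user, reflectors, obstacles))
```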
- PAR ID: 10467145
- Publisher / Repository: ACM
- Date Published:
- ISBN: 9798400703393
- Page Range / eLocation ID: 268 to 275
- Format(s): Medium: X
- Location: Madrid, Spain
- Sponsoring Org: National Science Foundation
More Like this
- This paper presents the results of motion-tracking-synchronized measurements of millimeter wave (mmWave) link bandwidth fluctuations while a user is engaged in immersive augmented/virtual reality applications. Our system, called MITRAS, supports extensive exploration of human-induced impacts on mmWave link bandwidth during immersive experiences. MITRAS adopts a packet-train measurement application to track link bandwidth fluctuations, while the user's movements are tracked with an Oculus Quest 2 headset. By investigating the impacts of human movements on link bandwidth fluctuations, we further propose a link state prediction model to shed light on higher-layer protocol design for immersive applications over mmWave links. (A generic packet-train bandwidth-estimation sketch appears after this list.)
- Bulterman, Dick; Kankanhalli, Mohan; Muehlhaueser, Max; Persia, Fabio; Sheu, Philip; Tsai, Jeffrey (Eds.) The emergence of 360-video streaming systems has brought about new possibilities for immersive video experiences while requiring significantly higher bandwidth than traditional 2D video streaming. Viewport prediction is used to address this problem, but interesting storylines outside the viewport are ignored. To address this limitation, we present SAVG360, a novel viewport guidance system that utilizes global content information available on the server side to enhance streaming with the best saliency-captured storyline of 360-videos. The saliency analysis is performed offline on the media server with a powerful GPU, and the saliency-aware guidance information is encoded and shared with clients through the Saliency-aware Guidance Descriptor. This enables the system to proactively guide users to switch between storylines of the video and allows users to follow or break guided storylines through a novel user interface. Additionally, we present a viewing mode prediction algorithm to enhance video delivery in SAVG360. Evaluation on user viewport traces in 360-videos demonstrates that SAVG360 outperforms existing tiled streaming solutions in terms of overall viewport prediction accuracy and the ability to stream high-quality 360 videos under bandwidth constraints. Furthermore, a user study highlights the advantages of our proactive guidance approach over merely predicting and streaming where users look. (A toy saliency-to-guidance sketch appears after this list.)
- The highly directional nature of millimeter wave (mmWave) beams poses several challenges in using that spectrum to meet the communication needs of immersive applications. In particular, mmWave beams are susceptible to misalignments and blockages caused by user movements. As a result, mmWave channels are vulnerable to large fluctuations in quality, which in turn cause disproportionate degradation in the end-to-end performance of Transmission Control Protocol (TCP) based applications. In this paper, we propose a reinforcement learning (RL) integrated transport-layer plugin, Millimeter wave based Immersive Agent (MIA), for immersive content delivery over mmWave links. MIA uses the RL model to predict mmWave link bandwidth based on real-time measurements, and then cooperates with TCP's congestion control scheme to adapt the sending rate in accordance with the bandwidth predictions. To evaluate the effectiveness of MIA, we conduct experiments using a mmWave-augmented immersive testbed and network simulations. The evaluation results show that MIA significantly improves end-to-end immersive performance in both throughput and latency. (A simplified prediction-driven rate-adaptation sketch appears after this list.)
- Low latency is a critical user Quality-of-Experience (QoE) metric for live video streaming, and it poses significant challenges for streaming over the Internet. In this paper, we explore the design space of low-latency live video streaming by developing dynamic models and optimal control strategies. We further develop practical live video streaming algorithms within the Model Predictive Control (MPC) framework, namely MPC-Live, to maximize user QoE by adapting the video bitrate while maintaining low end-to-end video latency in dynamic network environments. Through extensive experiments driven by real network traces, we demonstrate that our live video streaming algorithms can improve performance dramatically within a latency range of two to five seconds. (A simplified MPC bitrate-selection sketch appears after this list.)
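For the MITRAS entry above, the paper's measurement application is not reproduced here; the following is a generic packet-train bandwidth estimator, shown only to illustrate the technique the abstract names. The packet size, train length, and timestamps are made-up example values.

```python
# Generic packet-train bandwidth estimation (illustrative; not the MITRAS code).
from typing import List

def estimate_bandwidth_mbps(arrival_times_s: List[float], packet_size_bytes: int) -> float:
    """Estimate link bandwidth from receive timestamps of a back-to-back packet train.

    The dispersion of the train (last arrival minus first arrival) is assumed to be
    governed by the bottleneck link, so bandwidth ~= bits delivered / dispersion.
    """
    if len(arrival_times_s) < 2:
        raise ValueError("need at least two packets in the train")
    dispersion = arrival_times_s[-1] - arrival_times_s[0]
    if dispersion <= 0:
        raise ValueError("timestamps must be strictly increasing")
    bits = (len(arrival_times_s) - 1) * packet_size_bytes * 8
    return bits / dispersion / 1e6

if __name__ == "__main__":
    # 50 packets of 1400 bytes arriving over 1 ms -> roughly 548.8 Mbps.
    arrivals = [i * (0.001 / 49) for i in range(50)]
    print(f"{estimate_bandwidth_mbps(arrivals, 1400):.1f} Mbps")
```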
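For the SAVG360 entry, the actual Saliency-aware Guidance Descriptor format is not given in the abstract; as a toy stand-in, the snippet below derives a single guidance yaw per frame by picking the most salient column of a hypothetical equirectangular saliency map.

```python
# Toy saliency-to-guidance sketch (the real SAVG360 descriptor is not shown here).
from typing import List

def guidance_yaw_deg(saliency_map: List[List[float]]) -> float:
    """Return the yaw angle (degrees) of the most salient column in an
    equirectangular saliency map (rows: latitude, columns: longitude)."""
    width = len(saliency_map[0])
    column_scores = [sum(row[c] for row in saliency_map) for c in range(width)]
    best_col = max(range(width), key=column_scores.__getitem__)
    # Map the column index to a yaw in [-180, 180).
    return best_col / width * 360.0 - 180.0

if __name__ == "__main__":
    # 4x8 toy saliency map with the hotspot in column 6.
    toy = [[0.1] * 8 for _ in range(4)]
    for row in toy:
        row[6] = 0.9
    print(f"guide viewer toward yaw {guidance_yaw_deg(toy):.1f} deg")
```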
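For the MIA entry, the RL bandwidth predictor and the TCP integration are beyond a short snippet; the sketch below shows only a simplified rate-capping step that an agent of this kind might perform, steering the sending rate toward a safety margin below a (here mocked) bandwidth prediction. The margin and step limit are assumptions.

```python
# Simplified stand-in for prediction-informed rate adaptation (not the MIA plugin).

def adapt_sending_rate(current_rate_mbps: float,
                       predicted_bw_mbps: float,
                       safety_margin: float = 0.85,
                       max_step_mbps: float = 50.0) -> float:
    """Move the sending rate toward a margin below the predicted mmWave bandwidth,
    limiting how much the rate may change per control interval."""
    target = predicted_bw_mbps * safety_margin
    step = max(-max_step_mbps, min(max_step_mbps, target - current_rate_mbps))
    return max(1.0, current_rate_mbps + step)

if __name__ == "__main__":
    rate = 400.0
    # Mocked per-interval bandwidth predictions (Mbps); a blockage around intervals 3-4.
    predictions = [900.0, 850.0, 120.0, 100.0, 800.0]
    for bw in predictions:
        rate = adapt_sending_rate(rate, bw)
        print(f"predicted {bw:6.1f} Mbps -> send at {rate:6.1f} Mbps")
```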
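For the MPC-Live entry, the following sketch illustrates the general model predictive control idea the abstract refers to: enumerate bitrate sequences over a short horizon, simulate the latency each would induce under a bandwidth forecast, and commit to the first bitrate of the best-scoring sequence. The QoE weights, latency model, and bitrate ladder are simplified assumptions, not the paper's formulation.

```python
# Simplified MPC-style bitrate selection over a short horizon
# (illustrative only; QoE weights and latency model are assumptions).
from itertools import product
from typing import List

BITRATES_MBPS = [1.0, 2.5, 5.0, 8.0]
SEGMENT_S = 1.0          # segment duration
HORIZON = 3              # look-ahead segments
LATENCY_CAP_S = 5.0

def score(seq: List[float], bw_forecast: List[float], start_latency: float) -> float:
    """Return a QoE score for one candidate bitrate sequence."""
    latency, qoe, prev = start_latency, 0.0, seq[0]
    for bitrate, bw in zip(seq, bw_forecast):
        download_s = bitrate * SEGMENT_S / max(bw, 1e-6)
        # Latency grows when a segment downloads slower than real time.
        latency = max(0.0, latency + download_s - SEGMENT_S)
        qoe += bitrate                      # reward quality
        qoe -= 2.0 * abs(bitrate - prev)    # penalize quality switches
        qoe -= 4.0 * latency                # penalize accumulated latency
        if latency > LATENCY_CAP_S:
            qoe -= 100.0                    # hard penalty beyond the latency cap
        prev = bitrate
    return qoe

def choose_next_bitrate(bw_forecast: List[float], current_latency: float) -> float:
    """Exhaustively search the horizon and return the first bitrate of the best plan."""
    best_seq = max(product(BITRATES_MBPS, repeat=HORIZON),
                   key=lambda seq: score(list(seq), bw_forecast, current_latency))
    return best_seq[0]

if __name__ == "__main__":
    print(choose_next_bitrate(bw_forecast=[6.0, 3.0, 3.0], current_latency=1.5))
```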