Title: QuRate: Power-Efficient Mobile Immersive Video Streaming
Smartphones have recently become a popular platform for deploying computation-intensive virtual reality (VR) applications, such as immersive video streaming (a.k.a. 360-degree video streaming). One specific challenge for the smartphone-based head-mounted display (HMD) is to reduce the potentially huge power consumption caused by immersive video. To address this challenge, we first conduct an empirical power measurement study on a typical smartphone immersive streaming system, which identifies the major power consumption sources. Then, we develop QuRate, a quality-aware and user-centric frame rate adaptation mechanism that tackles the power consumption issue in immersive video streaming. QuRate reduces the power consumption of immersive video by modeling the correlation between perceivable video quality and user behavior. Specifically, QuRate exploits the user's reduced concentration on video frames during view switching and dynamically adjusts the frame rate without impacting perceivable video quality. We evaluate QuRate with a comprehensive set of experiments involving 5 smartphones, 21 users, and 6 immersive videos using empirical user head movement traces. Our experimental results demonstrate that QuRate is capable of extending smartphone battery life by up to 1.24X while maintaining perceivable video quality during immersive video streaming. We also conduct an Institutional Review Board (IRB)-approved subjective user study to further validate that QuRate has minimal impact on video quality.
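
To make the mechanism concrete, the following is a minimal sketch of a velocity-driven frame rate adaptation loop in the spirit of QuRate: render at full rate when the head is nearly still and drop to lower rates during fast view switching, when users are less able to perceive per-frame detail. The thresholds, rate levels, and the linear per-frame power figure are hypothetical placeholders, not the parameters or models used in the paper.

# Hypothetical sketch of velocity-driven frame rate adaptation (not the
# paper's actual parameters). Head speed comes from the HMD's motion sensors.

FULL_FPS = 60

# (max angular head speed in deg/s, frame rate to use) -- illustrative values
RATE_LADDER = [
    (30.0, 60),          # near-static viewing: keep full frame rate
    (120.0, 45),         # moderate view switching
    (float("inf"), 30),  # fast view switching: lowest rate
]

def select_frame_rate(head_speed_deg_s: float) -> int:
    """Pick a rendering frame rate for the current head angular speed."""
    for max_speed, fps in RATE_LADDER:
        if head_speed_deg_s <= max_speed:
            return fps
    return FULL_FPS  # defensive default, unreachable with the ladder above

def estimated_power_saving(fps: int, p_render_per_frame_mw: float = 8.0) -> float:
    """Crude illustration: rendering power assumed proportional to frame rate."""
    return (FULL_FPS - fps) * p_render_per_frame_mw

if __name__ == "__main__":
    # Replay a toy head-movement trace (deg/s) and report the chosen rates.
    trace = [5.0, 12.0, 80.0, 200.0, 150.0, 20.0]
    for speed in trace:
        fps = select_frame_rate(speed)
        print(f"head speed {speed:6.1f} deg/s -> {fps} FPS "
              f"(~{estimated_power_saving(fps):.0f} mW saved vs. {FULL_FPS} FPS)")
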
Award ID(s): 1755659
NSF-PAR ID: 10159009
Journal Name: ACM Multimedia Systems Conference 2020 (MMSys'20)
Sponsoring Org: National Science Foundation

More Like this
  1. Battery life is an increasingly urgent challenge for today's untethered VR and AR devices. However, the power efficiency of head-mounted displays is naturally at odds with growing computational requirements driven by better resolution, refresh rate, and dynamic range, all of which reduce the sustained usage time of untethered AR/VR devices. For instance, the Oculus Quest 2 can sustain only 2 to 3 hours of operation on a full charge. Prior display power reduction techniques mostly target smartphone displays. Directly applying smartphone display power reduction techniques, however, degrades the visual perception in AR/VR with noticeable artifacts. For instance, the "power-saving mode" on smartphones uniformly lowers the pixel luminance across the display and, as a result, presents an overall darkened visual perception to users if directly applied to VR content. Our key insight is that VR display power reduction must be cognizant of the gaze-contingent nature of high field-of-view VR displays. To that end, we present a gaze-contingent system that, without degrading luminance, minimizes the display power consumption while preserving high visual fidelity when users actively view immersive video sequences. This is enabled by constructing 1) a gaze-contingent color discrimination model through psychophysical studies, and 2) a display power model (with respect to pixel color) through real-device measurements. Critically, due to the careful design decisions made in constructing the two models, our algorithm is cast as a constrained optimization problem with a closed-form solution, which can be implemented as a real-time, image-space shader. We evaluate our system using a series of psychophysical studies and large-scale analyses on natural images. Experimental results show that our system reduces the display power by as much as 24% (14% on average) with little to no perceptual fidelity degradation. A simplified sketch of this per-pixel formulation appears after this list.
  2. With the advent of 5G, supporting high-quality game streaming applications on edge devices has become a reality. This is evidenced by a recent surge in cloud gaming applications on mobile devices. In contrast to video streaming applications, interactive games require much more compute power to support improved rendering (such as 4K streaming) under the stipulated frames-per-second (FPS) constraints. This in turn consumes more battery power in a power-constrained mobile device. Thus, state-of-the-art gaming applications suffer from lower video quality (QoS) and/or energy efficiency. While there has been a plethora of recent work on optimizing game streaming applications, to our knowledge, there is no study that systematically investigates the design choices across the end-to-end game streaming pipeline, spanning the cloud, network, and edge devices, to understand the individual contributions of the different stages of the pipeline to the overall QoS and energy efficiency. In this context, this paper presents a comprehensive performance and power analysis of the entire game streaming pipeline consisting of the server/cloud side, the network, and the edge. Through extensive measurements with a high-end workstation mimicking the cloud end, an open-source platform (Moonlight-GameStreaming) emulating the edge device/mobile platform, and two network settings (WiFi and 5G), we conduct a detailed measurement-based study of seven representative games with different characteristics. We characterize the performance in terms of frame latency, QoS, bitrate, and energy consumption for the different stages of the gaming pipeline. Our study shows that the rendering and encoding stages at the cloud end are the bottlenecks for supporting 4K streaming. While 5G is certainly more suitable for supporting enhanced video quality with 4K streaming, it is more expensive in terms of power consumption compared to WiFi. Further, fluctuations in 5G network quality can lead to huge frame drops, thus affecting QoS, which needs to be addressed by a coordinated design between the edge device and the server. Finally, the network interface and the decoder units in a mobile platform need more energy-efficient designs to support high-quality games at a lower cost. These observations should help in designing more cost-effective future cloud gaming platforms.
  3. Short videos have recently emerged as a popular form of short-duration User Generated Content (UGC) within modern social media. Short video content is generally less than a minute long and predominantly produced in vertical orientation on smartphones. While still fundamentally a form of streaming, short video delivery is distinctly characterized by a mechanism that pre-loads content ahead of the user's request. Background pre-loading aims to eliminate start-up time, which is now prioritized higher in Quality of Experience (QoE) objectives, given that the application design facilitates instant 'swiping' to the next video in a recommended sequence. In this work, we provide a comprehensive comparison of four popular short video services. In particular, we explore content characteristics and evaluate the video quality across resolutions for each service. We next characterize the pre-loading policy adopted by each service; a toy data-budgeted policy of this kind is sketched after this list. Finally, we conduct an experimental study to investigate data consumption and evaluate achieved QoE under different network scenarios and application configurations.
  4.
    Emerging virtual and augmented reality applications are envisioned to significantly enhance user experiences. An important issue related to user experience is thermal management in the smartphones widely adopted for virtual and augmented reality applications. Although smartphone overheating has been reported many times, a systematic measurement and analysis of smartphone thermal behavior is relatively scarce, especially for virtual and augmented reality applications. To address this issue, we build a temperature measurement and analysis framework for virtual and augmented reality applications using a robot, infrared cameras, and smartphones. Using the framework, we analyze a comprehensive set of data including the battery power consumption, the smartphone surface temperature, and the temperature of key hardware components, such as the battery, CPU, GPU, and WiFi module. When a 360° virtual reality video is streamed to a smartphone, the phone surface temperature reaches nearly 39°C. Moreover, the temperature of the phone surface and its main hardware components generally keeps increasing until the end of our 20-minute experiments despite the thermal control undertaken by smartphones, such as CPU/GPU frequency scaling. Our thermal analysis of a popular AR game yields even more serious results: the battery power consumption frequently exceeds the thermal design power by 20–80%, while the peak battery, CPU, GPU, and WiFi module temperatures exceed 45°C, 70°C, 70°C, and 65°C, respectively. A minimal software-side temperature-logging sketch in this spirit appears after this list.
  5. Emerging volumetric videos offer a fully immersive, six degrees of freedom (6DoF) viewing experience, at the cost of extremely high bandwidth demand. In this paper, we design, implement, and evaluate Vues, an edge-assisted transcoding system that delivers high-quality volumetric videos with low bandwidth requirements, low decoding overhead, and high quality of experience (QoE) on mobile devices. Through an IRB-approved user study, we build a first-of-its-kind QoE model to quantify the impact of various factors introduced by transcoding volumetric content into 2D videos. Motivated by the key observations from this user study, Vues employs a novel multiview approach with the overarching goal of boosting QoE. The Vues edge server adaptively transcodes a volumetric video frame into multiple 2D views with the help of a few lightweight machine learning models, strategically balancing the extra bandwidth consumption of additional views against the improved QoE indicated by our QoE model. For display, the client selects the view among the delivered candidates that maximizes QoE; a toy version of this selection step is sketched after this list. Comprehensive evaluations using a prototype implementation indicate that Vues dramatically outperforms existing approaches. On average, it improves the QoE by 35% (up to 85%) compared to single-view transcoding schemes, and reduces bandwidth consumption by 95% compared to the state-of-the-art that directly streams volumetric videos.
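
For the gaze-contingent display power work summarized in item 1, the sketch below illustrates only the shape of the per-pixel optimization: minimize a linear power model over colors allowed to drift from the original by an eccentricity-dependent tolerance. The per-channel power weights, the tolerance curve, and the use of a plain L2 ball are assumptions for illustration; the actual system uses a psychophysically measured, gaze-contingent color discrimination model (an ellipsoid) and a measured display power model.

import numpy as np

# Toy per-pixel optimization: minimize a linear OLED-style power model
# P(c) = w . c over colors c' within an eccentricity-dependent tolerance ball
# around the original color c. For an L2 ball the minimizer is closed-form:
# c' = c - eps * w / ||w||. Weights and tolerance values are made up.

W_POWER = np.array([0.20, 0.15, 0.35])  # hypothetical per-channel power weights (R, G, B)

def color_tolerance(eccentricity_deg: float) -> float:
    """Hypothetical: allowable color shift grows with distance from the gaze point."""
    return min(0.08, 0.005 * eccentricity_deg)

def power_optimized_color(c: np.ndarray, eccentricity_deg: float) -> np.ndarray:
    """Closed-form minimizer of w.c' subject to ||c' - c|| <= eps, clipped to [0, 1]."""
    eps = color_tolerance(eccentricity_deg)
    c_new = c - eps * W_POWER / np.linalg.norm(W_POWER)
    return np.clip(c_new, 0.0, 1.0)

if __name__ == "__main__":
    pixel = np.array([0.6, 0.5, 0.8])      # original linear RGB value
    for ecc in (2.0, 10.0, 30.0):          # degrees away from the gaze center
        out = power_optimized_color(pixel, ecc)
        saved = W_POWER @ (pixel - out)
        print(f"ecc {ecc:4.1f} deg: {out.round(3)}  (power saved ~{saved:.4f} a.u.)")
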
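For the short video services compared in item 3, the following is a hypothetical sketch of a data-budgeted pre-loading policy: fetch a short prefix of the next few recommended videos so a swipe can start instantly, while capping the total pre-loaded bytes to limit wasted data. The prefix length, queue depth, and budget are invented for illustration and do not correspond to any measured app behavior.

# Hypothetical data-budgeted pre-loading policy (illustrative values only).

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    bitrate_kbps: int            # bitrate of the selected representation

def plan_preload(queue: list[Video],
                 prefix_seconds: float = 4.0,
                 max_videos: int = 3,
                 data_budget_bytes: int = 6_000_000) -> list[tuple[str, int]]:
    """Return (video_id, bytes_to_prefetch) for the head of the recommendation queue."""
    plan, used = [], 0
    for video in queue[:max_videos]:
        prefix_bytes = int(video.bitrate_kbps * 1000 / 8 * prefix_seconds)
        if used + prefix_bytes > data_budget_bytes:
            break                # stop early rather than exceed the data budget
        plan.append((video.video_id, prefix_bytes))
        used += prefix_bytes
    return plan

if __name__ == "__main__":
    queue = [Video("v1", 2500), Video("v2", 1800), Video("v3", 4000), Video("v4", 1200)]
    for vid, nbytes in plan_preload(queue):
        print(f"pre-load {nbytes / 1e6:.2f} MB of {vid}")
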
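For the thermal measurement framework in item 4, the sketch below covers only the software-side half of such a setup: periodically polling the Linux/Android thermal zones (battery, CPU, GPU, etc.) while a VR/AR app runs and logging the readings to a CSV. Thermal zone names and paths vary widely across phones, and surface temperature still requires an external infrared camera, so treat this strictly as an illustrative starting point.

# Minimal on-device thermal logger: polls /sys/class/thermal once per second.

import csv, glob, time

def read_thermal_zones() -> dict[str, float]:
    """Return {zone_type: temperature_in_C} from /sys/class/thermal."""
    readings = {}
    for zone in glob.glob("/sys/class/thermal/thermal_zone*"):
        try:
            with open(f"{zone}/type") as f:
                name = f.read().strip()
            with open(f"{zone}/temp") as f:
                millic = int(f.read().strip())
            readings[name] = millic / 1000.0   # most drivers report milli-degrees C
        except (OSError, ValueError):
            continue                           # some zones are not readable
    return readings

if __name__ == "__main__":
    with open("thermal_log.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", "zone", "temp_c"])
        for _ in range(1200):                  # ~20 minutes at a 1 s interval
            now = time.time()
            for zone, temp in read_thermal_zones().items():
                writer.writerow([f"{now:.1f}", zone, f"{temp:.1f}"])
            time.sleep(1.0)
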
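For Vues in item 5, this toy sketch shows the client-side step of picking, among the delivered 2D candidate views, the one with the highest predicted QoE for the viewer's current pose. The predicted_qoe function below is a stand-in; the paper derives its QoE model from an IRB-approved user study and drives the server-side view generation with it as well.

# Toy client-side view selection with a stand-in QoE predictor.

from dataclasses import dataclass
import math

@dataclass
class CandidateView:
    view_id: int
    camera_yaw_deg: float        # viewpoint the edge rendered this 2D view from
    bitrate_kbps: int

def predicted_qoe(view: CandidateView, user_yaw_deg: float) -> float:
    """Stand-in QoE model: favor views aligned with the user's pose and higher bitrate."""
    misalignment = abs((view.camera_yaw_deg - user_yaw_deg + 180) % 360 - 180)
    alignment_score = math.exp(-misalignment / 20.0)      # decays with viewpoint mismatch
    quality_score = math.log1p(view.bitrate_kbps) / 10.0  # diminishing returns on bitrate
    return alignment_score + quality_score

def select_view(candidates: list[CandidateView], user_yaw_deg: float) -> CandidateView:
    """Pick the delivered candidate with the highest predicted QoE."""
    return max(candidates, key=lambda v: predicted_qoe(v, user_yaw_deg))

if __name__ == "__main__":
    views = [CandidateView(0, 0.0, 8000), CandidateView(1, 30.0, 6000), CandidateView(2, -30.0, 6000)]
    best = select_view(views, user_yaw_deg=25.0)
    print(f"display view {best.view_id} rendered at yaw {best.camera_yaw_deg} deg")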