
Title: A Smartphone Thermal Temperature Analysis for Virtual and Augmented Reality
Emerging virtual and augmented reality applications are envisioned to significantly enhance user experiences. An important issue related to user experience is thermal management in smartphones, which are widely adopted for virtual and augmented reality applications. Although smartphone overheating has been reported many times, systematic measurement and analysis of smartphone thermal behavior remains scarce, especially for virtual and augmented reality applications. To address this issue, we build a temperature measurement and analysis framework for virtual and augmented reality applications using a robot, infrared cameras, and smartphones. Using the framework, we analyze a comprehensive set of data, including battery power consumption, smartphone surface temperature, and the temperature of key hardware components such as the battery, CPU, GPU, and WiFi module. When a 360° virtual reality video is streamed to a smartphone, the phone surface temperature reaches nearly 39 °C. Moreover, the temperature of the phone surface and its main hardware components generally keeps increasing until the end of our 20-minute experiments, despite thermal control undertaken by smartphones such as CPU/GPU frequency scaling. Our thermal analysis results for a popular AR game are even more serious: the battery power consumption frequently exceeds the thermal design power by 20–80%, while the peak battery, CPU, GPU, and WiFi module temperatures exceed 45 °C, 70 °C, 70 °C, and 65 °C, respectively.
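On Linux-based phones, per-component temperatures like those analyzed above are commonly exposed through sysfs thermal zones. The sketch below shows that approach; it is a minimal illustration assuming a rooted device with readable `/sys/class/thermal` entries, not the paper's actual measurement framework (which uses infrared cameras and a robot).

```python
# Sketch: sampling on-device component temperatures via Linux sysfs thermal
# zones (a common approach on rooted Android phones). Zone names, paths, and
# the milli-degree convention vary by device and are assumptions here.
import glob
import time


def read_thermal_zones(root="/sys/class/thermal"):
    """Return {zone_type: temperature_in_celsius} for all readable zones."""
    readings = {}
    for zone in glob.glob(f"{root}/thermal_zone*"):
        try:
            with open(f"{zone}/type") as f:
                name = f.read().strip()
            with open(f"{zone}/temp") as f:
                millideg = int(f.read().strip())  # typically milli-degrees C
            readings[name] = millideg / 1000.0
        except (OSError, ValueError):
            continue  # zone not readable on this device
    return readings


def sample(duration_s=1200, interval_s=5):
    """Log all zone temperatures every interval_s seconds (20-minute run)."""
    log = []
    for _ in range(duration_s // interval_s):
        log.append((time.time(), read_thermal_zones()))
        time.sleep(interval_s)
    return log
```

Zones named `battery`, `cpu`, or `gpu` (names differ per vendor) would correspond to the components tracked in the study.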
Journal Name: IEEE International Conference on Artificial Intelligence & Virtual Reality
Sponsoring Org: National Science Foundation
More Like this
  1. Smartphones have recently become a popular platform for deploying computation-intensive virtual reality (VR) applications, such as immersive video streaming (a.k.a. 360-degree video streaming). One specific challenge involving the smartphone-based head-mounted display (HMD) is to reduce the potentially huge power consumption caused by the immersive video. To address this challenge, we first conduct an empirical power measurement study on a typical smartphone immersive streaming system, which identifies the major power consumption sources. Then, we develop QuRate, a quality-aware and user-centric frame rate adaptation mechanism to tackle the power consumption issue in immersive video streaming. QuRate optimizes the immersive video power consumption by modeling the correlation between the perceivable video quality and the user behavior. Specifically, QuRate builds on top of the user's reduced level of concentration on the video frames during view switching and dynamically adjusts the frame rate without impacting the perceivable video quality. We evaluate QuRate with a comprehensive set of experiments involving 5 smartphones, 21 users, and 6 immersive videos using empirical user head movement traces. Our experimental results demonstrate that QuRate is capable of extending the smartphone battery life by up to 1.24X while maintaining the perceivable video quality during immersive video streaming. Also, we conduct an Institutional Review Board (IRB)-approved subjective user study to further validate that QuRate's impact on video quality is minimal.
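The core idea, lowering the frame rate while the viewer is switching views, since perceived quality is less sensitive then, can be sketched as a simple rate-selection rule. The threshold and frame-rate values below are illustrative assumptions, not QuRate's actual parameters or model.

```python
# Minimal sketch of a quality-aware frame-rate adaptation rule in the spirit
# of QuRate: reduce the frame rate during fast view switching, restore it
# when the head is stable. All numbers are illustrative assumptions.

FULL_FPS = 60           # assumed nominal frame rate
REDUCED_FPS = 30        # assumed rate during fast view switching
VELOCITY_THRESH = 90.0  # deg/s; assumed head-velocity cutoff


def choose_frame_rate(head_velocity_deg_s):
    """Pick a frame rate from the instantaneous head angular velocity."""
    if head_velocity_deg_s > VELOCITY_THRESH:
        return REDUCED_FPS  # viewer is switching views: save power
    return FULL_FPS         # stable gaze: keep full quality


def estimated_saving(velocity_trace):
    """Fraction of frames saved over a head-velocity trace (one sample/s)."""
    rendered = sum(choose_frame_rate(v) for v in velocity_trace)
    baseline = FULL_FPS * len(velocity_trace)
    return 1.0 - rendered / baseline
```

Fewer rendered frames translates directly into less GPU and display work, which is where the battery-life extension comes from.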
  2. Clean water free of bacteria is a precious resource in areas where no centralized water facilities are available. Conventional chlorine disinfection is limited by chemical transportation, storage, and the production of carcinogenic by-products. Here, a smartphone-powered disinfection system is developed for point-of-use (POU) bacterial inactivation. The integrated system uses the smartphone battery as a power source and customized on-the-go (OTG) hardware connected to the phone to realize the desired electrical output. Through a downloadable mobile application, the electrical output, either constant current (20–1000 µA) or voltage (0.7–2.1 V), can be configured easily through a user-friendly graphical interface on the screen. The disinfection device, a coaxial-electrode copper ionization cell (CECIC), inactivates bacteria with low levels of electrochemically generated copper at low energy consumption. The strategy of constant-current control is applied in this study to solve the problem of uncontrollable copper release under the previous constant-voltage control. With the current control, a high inactivation efficiency of E. coli (~6 logs) is achieved with a low level of effluent Cu (~200 µg L⁻¹) in water samples within a range of salt concentrations (0.2–1 mmol L⁻¹). The smartphone-based power workstation provides a versatile and accurate electrical output with a simple graphical user interface. The disinfection device is robust, highly efficient, and does not require complex equipment. As smartphones are pervasive in modern life, the smartphone-powered CECIC system could provide an alternative decentralized water disinfection approach for settings such as rural areas and outdoor activities.
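The reason constant-current control makes copper release predictable follows from Faraday's law of electrolysis: the dissolved copper mass is proportional to the total charge passed, so fixing the current fixes the dose per unit time. The sketch below works through that relationship; the specific current and duration are illustrative assumptions, not values from the study.

```python
# Why constant current bounds copper release (Faraday's law): mass released
# is proportional to charge = current x time. Assumes Cu dissolves as Cu2+
# with 100% current efficiency, an idealization for illustration only.

F = 96485.0   # Faraday constant, C/mol
M_CU = 63.55  # molar mass of copper, g/mol
Z = 2         # electrons per ion, assuming Cu2+


def copper_released_ug(current_ua, seconds):
    """Upper-bound copper mass (micrograms) released at a constant current."""
    charge_c = (current_ua * 1e-6) * seconds  # microamps -> coulombs
    grams = charge_c * M_CU / (Z * F)         # Faraday's law
    return grams * 1e6
```

At a constant 100 µA, for example, the release over a minute stays in the low-microgram range, which is why the effluent copper can be held near the ~200 µg L⁻¹ level reported.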

  3. With the advent of 5G, supporting high-quality game streaming applications on edge devices has become a reality. This is evidenced by a recent surge in cloud gaming applications on mobile devices. In contrast to video streaming applications, interactive games require much more compute power to support improved rendering (such as 4K streaming) within the stipulated frames-per-second (FPS) constraints. This in turn consumes more battery power in a power-constrained mobile device. Thus, state-of-the-art gaming applications suffer from lower video quality (QoS) and/or energy efficiency. While there has been a plethora of recent work on optimizing game streaming applications, to our knowledge there is no study that systematically investigates the design choices across the end-to-end game streaming pipeline spanning the cloud, network, and edge devices to understand the individual contributions of the different stages of the pipeline to the overall QoS and energy efficiency. In this context, this paper presents a comprehensive performance and power analysis of the entire game streaming pipeline consisting of the server/cloud side, network, and edge. Through extensive measurements with a high-end workstation mimicking the cloud end, an open-source platform (Moonlight-GameStreaming) emulating the edge device/mobile platform, and two network settings (WiFi and 5G), we conduct a detailed measurement-based study with seven representative games with different characteristics. We characterize the performance in terms of frame latency, QoS, bitrate, and energy consumption for different stages of the gaming pipeline. Our study shows that the rendering stage and the encoding stage at the cloud end are the bottlenecks for supporting 4K streaming. While 5G is certainly more suitable for supporting enhanced video quality with 4K streaming, it is more expensive in terms of power consumption compared to WiFi. Further, fluctuations in 5G network quality can lead to huge frame drops, thus affecting QoS, which needs to be addressed by a coordinated design between the edge device and the server. Finally, the network interface and the decoder units in a mobile platform need a more energy-efficient design to support high-quality games at a lower cost. These observations should help in designing more cost-effective future cloud gaming platforms.
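Frame-level QoS metrics of the kind characterized here can be computed directly from frame-arrival timestamps. The sketch below estimates achieved FPS and counts missed frame slots in oversized inter-frame gaps; the target rate and the "gap > 2x frame time" drop rule are illustrative assumptions, not the paper's exact methodology.

```python
# Sketch: per-frame QoS from arrival timestamps. A gap much larger than the
# target frame time means one or more frames were dropped or delayed.
# Thresholds here are assumptions for illustration.

def frame_stats(timestamps, target_fps=60):
    """Return (achieved_fps, dropped_frames) from arrival timestamps (s)."""
    if len(timestamps) < 2:
        return 0.0, 0
    span = timestamps[-1] - timestamps[0]
    achieved = (len(timestamps) - 1) / span
    frame_time = 1.0 / target_fps
    dropped = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > 2 * frame_time:
            # count how many whole frame slots were missed in this gap
            dropped += int(gap / frame_time) - 1
    return achieved, dropped
```

Applied per pipeline stage (render, encode, network, decode), metrics like these make it possible to attribute QoS loss, e.g., 5G fluctuation-induced frame drops, to a specific stage.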
  4. This paper presents ssLOTR (self-supervised learning on the rings), a system that shows the feasibility of designing self-supervised learning based techniques for 3D finger motion tracking using a custom-designed wearable inertial measurement unit (IMU) sensor with a minimal overhead of labeled training data. Ubiquitous finger motion tracking enables a number of applications in augmented and virtual reality, sign language recognition, rehabilitation healthcare, sports analytics, etc. However, unlike vision, there are no large-scale training datasets for developing robust machine learning (ML) models on wearable devices. ssLOTR designs ML models based on data augmentation and self-supervised learning to first extract efficient representations from raw IMU data without the need for any training labels. The extracted representations are further trained with small-scale labeled training data. In comparison to fully supervised learning, we show that only 15% of labeled training data is sufficient with self-supervised learning to achieve similar accuracy. Our sensor device is designed using a two-layer printed circuit board (PCB) to minimize the footprint and uses a combination of polylactic acid (PLA) and thermoplastic polyurethane (TPU) as housing materials for sturdiness and flexibility. It incorporates a system-on-chip (SoC) microcontroller with integrated WiFi/Bluetooth Low Energy (BLE) modules for real-time wireless communication, portability, and ubiquity. In contrast to gloves, our device is worn like rings on the fingers and therefore does not impede dexterous finger motion. Extensive evaluation with 12 users shows a 3D joint angle tracking accuracy of 9.07° (joint position accuracy of 6.55 mm) with robustness to natural variation in sensor positions, wrist motion, etc., with low overhead in latency and power consumption on embedded platforms.
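Self-supervised pretraining on raw IMU streams typically relies on label-free signal augmentations so the model learns representations invariant to nuisance variation. The sketch below shows two standard augmentations of that kind, jitter and scaling; the specific transforms and magnitudes are generic assumptions, not ssLOTR's actual augmentation set.

```python
# Sketch of label-free IMU signal augmentations commonly used for
# self-supervised pretraining. Magnitudes are illustrative assumptions.
import random


def jitter(sample, sigma=0.05, rng=None):
    """Add small Gaussian noise to each reading (simulates sensor noise)."""
    rng = rng or random.Random(0)
    return [x + rng.gauss(0.0, sigma) for x in sample]


def scale(sample, low=0.9, high=1.1, rng=None):
    """Multiply the whole window by one random factor (simulates gain drift)."""
    rng = rng or random.Random(0)
    factor = rng.uniform(low, high)
    return [x * factor for x in sample]
```

A pretraining objective would then push representations of two augmented views of the same window together, letting the small labeled set (15% in the paper) handle only the final joint-angle regression.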
  5. Mobile sequencing technologies, including Oxford Nanopore's MinION, Mk1C, and SmidgION, are bringing genomics to the palm of a hand, opening unprecedented new opportunities in clinical and ecological research and translational applications. While sequencers now need only a USB outlet and provide on-board preprocessing (e.g., base calling), the main data analysis phases are tied to an available broadband Internet connection and cloud computing. Yet the ubiquity of tablets and smartphones, along with their increase in computational power, makes them a perfect candidate for enabling mobile/edge bioinformatics analytics. Also, in on-site experimental settings, tablets and smartphones are preferable to standard computers due to their resilience to humidity or spills and ease of sterilization. We here present an experimental study on power dissipation, aiming at reducing the battery consumption that currently impedes the execution of intensive bioinformatics analytics pipelines. In particular, we investigated the effects of assorted data structures (including hash tables, vectors, balanced trees, and tries) employed in some of the most common tasks of a bioinformatics pipeline: k-mer representation and counting. By employing a thermal camera, we show how different k-mer-handling data structures impact the power dissipation on a smartphone, finding that a cache-oblivious data structure reduces power dissipation (up to 26% better than others). In conclusion, the choice of data structures in mobile bioinformatics must consider not only computing efficiency (e.g., succinct data structures to reduce RAM usage), but also the power consumption of mobile devices that heavily rely on batteries in order to function.
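The benchmark task itself, k-mer counting, is compact enough to state in a few lines. The sketch below gives the reference behavior using Python's built-in hash map; the compared structures in the study (vectors, balanced trees, tries, cache-oblivious layouts) would implement the same interface with different memory-access patterns, which is what drives the power differences.

```python
# k-mer counting with a hash map (collections.Counter): the baseline
# behavior whose data-structure variants the study benchmarks for power.
# k and the input sequence below are chosen purely for illustration.
from collections import Counter


def count_kmers(seq, k):
    """Return a Counter mapping each length-k substring to its frequency."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
```

For example, `count_kmers("ACGTACGT", 4)` yields five windows, with `"ACGT"` occurring twice. On mobile hardware, a 2-bit-per-base encoding and a cache-friendly layout shrink both RAM traffic and, per the study's finding, power dissipation.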