Title: LDRP: Device-Centric Latency Diagnostic and Reduction for Cellular Networks Without Root
We design and implement LDRP, a device-based, standard-compliant solution for latency diagnosis and reduction in mobile networks without root privilege. LDRP takes a data-driven approach and works with a variety of latency-sensitive applications. After identifying the elements of LTE uplink latency, we design LDRP to infer the critical parameters used in data transmission and use them for diagnosis. In addition, LDRP designates small dummy messages, which precede uplink data transmissions, thus eliminating latency elements due to power saving, scheduling, etc. It imposes proper timing control among dummy messages and data packets to handle various conflicts. We achieve latency diagnosis and reduction without requiring root privilege and ensure that the latency is no worse than under the legacy LTE design. The design of LDRP is also applicable to 5G. The evaluation shows that LDRP infers latency with at most 4% error and reduces the median LTE uplink latency by a factor of up to 7.4× (from 42 ms to 5 ms) for four apps over four mobile carriers.
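As a rough illustration of the dummy-message idea (this is not the LDRP implementation; the lead time, payload size, and destination below are assumptions), an application can send a tiny packet slightly before its real uplink data so that the radio has already left its power-saving state and obtained an uplink grant by the time the data is sent:

```python
import socket
import time

# Hypothetical parameters: how far ahead of the real data the dummy message
# is sent, and its size. LDRP infers such timing from the network; here they
# are fixed placeholders for illustration only.
DUMMY_LEAD_TIME_S = 0.05   # send the dummy ~50 ms before the data
DUMMY_PAYLOAD = b"\x00"    # 1-byte dummy message

def send_with_dummy(sock, dest, data, lead=DUMMY_LEAD_TIME_S):
    """Send a tiny dummy packet ahead of the real uplink data so the radio
    is already connected and scheduled when `data` goes out."""
    sock.sendto(DUMMY_PAYLOAD, dest)   # wakes the radio and triggers scheduling
    time.sleep(lead)                   # allow the power-state/grant transition
    sock.sendto(data, dest)            # real data avoids those latency elements

if __name__ == "__main__":
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # hypothetical destination, for illustration only
    send_with_dummy(s, ("example.com", 9000), b"latency-sensitive payload")
```

The actual system additionally coordinates the timing between dummy messages and data packets to handle the conflicts mentioned above.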
Award ID(s):
1910150
NSF-PAR ID:
10488002
Author(s) / Creator(s):
; ; ; ; ;
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE Transactions on Mobile Computing
ISSN:
1536-1233
Page Range / eLocation ID:
1 to 17
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The highly anticipated 5G mmWave technology promises to enable many uplink-oriented, latency-critical applications (LCAs) such as Augmented Reality and Connected Autonomous Vehicles. Nonetheless, recent measurement studies have largely focused on its downlink performance. In this work, we perform a systematic study of the uplink performance of commercial 5G mmWave networks across 3 major US cities and 2 mobile operators. Our study makes three contributions. (1) It reveals that 5G mmWave uplink performance is geographically diverse and substantially better than LTE in terms of bandwidth and latency, but often erratic and suboptimal, which can degrade LCA performance. (2) Our analysis of control messages and PHY-level KPIs shows that the root causes of the suboptimal performance are fundamental to 5G mmWave and cannot be easily fixed via simple tuning of network configurations. (3) We identify various design and deployment optimizations that 5G operators can explore to bring 5G mmWave performance to the level needed to ultimately support the LCAs.
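For context, a generic user-space probe (not the methodology of this study; the echo endpoint and probe parameters are assumptions) can already expose how erratic cellular latency is by recording round-trip times of small timestamped UDP packets:

```python
import socket
import statistics
import struct
import time

ECHO_SERVER = ("echo.example.com", 9000)   # hypothetical UDP echo endpoint

def probe_rtts(count=100, interval=0.1, timeout=1.0):
    """Send small timestamped UDP probes and collect round-trip times (ms)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    rtts = []
    for seq in range(count):
        sent = time.perf_counter()
        sock.sendto(struct.pack("!Id", seq, time.time()), ECHO_SERVER)
        try:
            sock.recv(2048)
            rtts.append((time.perf_counter() - sent) * 1000.0)
        except socket.timeout:
            pass                        # lost probe; skip it
        time.sleep(interval)
    return rtts

if __name__ == "__main__":
    samples = probe_rtts()
    if samples:
        print(f"median RTT: {statistics.median(samples):.1f} ms, "
              f"max RTT: {max(samples):.1f} ms")
```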
  2. Augmented Reality (AR) has been widely hailed as a representative of ultra-high bandwidth and ultra-low latency apps that will be enabled by 5G networks. While single-user AR can perform AR tasks locally on the mobile device, multi-user AR apps, which allow multiple users to interact within the same physical space, critically rely on the cellular network to support user interactions. However, a recent study showed that multi-user AR apps can experience very high end-to-end latency when running over LTE, rendering user interaction practically infeasible. In this paper, we study whether 5G mmWave, which promises significant bandwidth and latency improvements over LTE, can support multi-user AR by conducting an in-depth measurement study of the same popular multi-user AR app over both LTE and 5G mmWave. Our measurement and analysis show that: (1) The E2E AR latency over LTE is significantly lower compared to the values reported in the previous study. However, it still remains too high for practical user interaction. (2) 5G mmWave brings no benefits to multi-user AR apps. (3) While 5G mmWave reduces the latency of the uplink visual data transmission, there are other components of the AR app that are independent of the network technology and account for a significant fraction of the E2E latency. (4) The app drains 66% more network energy, which translates to 28% higher total energy over 5G mmWave compared to over LTE. 
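A quick back-of-the-envelope check (purely illustrative, not from the paper) shows that the two reported energy numbers are mutually consistent if the network accounts for roughly 40% of the app's total energy over LTE:

```python
# If network energy grows by 66% while total energy grows by 28%, and x is the
# network's share of total energy over LTE, then 1.66*x + (1 - x) = 1.28.
# Solving: 0.66*x = 0.28, i.e. x = 0.28 / 0.66.
network_share_lte = 0.28 / 0.66
print(f"implied network share of total energy over LTE: {network_share_lte:.0%}")  # ~42%
```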
  3. With the proliferation of Dynamic Spectrum Access (DSA), Internet of Things (IoT), and Mobile Edge Computing (MEC) technologies, various methods have been proposed to deduce key network and user information in cellular systems, such as available cell bandwidths, as well as user locations and mobility. Not only is such information about cellular networks of vital significance to other systems co-located spectrum-wise and/or geographically, but applications within cellular systems can also benefit remarkably from inferring it, as exemplified by the efforts of video streaming to predict cell bandwidth. Hence, we are motivated to develop a new tool that uncovers, with off-the-shelf products, as much information as possible that used to be closed to outsiders or user devices. Given the widespread deployment of LTE and its continuous evolution to 5G, we design and implement U-CIMAN, a client-side system to accurately UnCover as much Information in Mobile Access Networks as allowed by LTE encryption. Among the many potential applications of U-CIMAN, we highlight one use case of accurately measuring the spectrum tenancy of a commercial LTE cell. Besides measuring spectrum tenancy in units of resource blocks, U-CIMAN discovers user mobility and traffic types associated with spectrum usage through decoded control messages and user data bytes. We conduct a detailed 4-month spectrum measurement on a commercial LTE cell; our observations include the predictive power of the Modulation and Coding Scheme on spectrum tenancy and channel off-times bounded under 10 seconds, to name a few.
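As a minimal sketch of the spectrum-tenancy idea (the grant fields and input format below are assumptions, not U-CIMAN's actual decoding pipeline), tenancy can be tallied as the fraction of resource blocks allocated by decoded scheduling grants over an observation window:

```python
from dataclasses import dataclass

@dataclass
class Grant:
    """A decoded scheduling grant (hypothetical fields): which subframe it
    applies to and how many resource blocks (RBs) it allocates."""
    subframe_index: int
    num_rbs: int

def spectrum_tenancy(grants, total_subframes, rbs_per_subframe=100):
    """Fraction of resource blocks occupied over the observation window.
    rbs_per_subframe=100 corresponds to a 20 MHz LTE carrier."""
    used = sum(g.num_rbs for g in grants)
    available = total_subframes * rbs_per_subframe
    return used / available if available else 0.0

# Toy usage: three grants observed over 10 subframes on a 20 MHz cell.
grants = [Grant(0, 50), Grant(3, 100), Grant(7, 25)]
print(f"tenancy: {spectrum_tenancy(grants, total_subframes=10):.1%}")  # 17.5%
```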
  4. The dramatic growth in demand for mobile data service has prompted mobile network operators (MNOs) to explore new spectrum resources in unlicensed bands. MNOs have recently been allowed to extend an LTE-based service called LTE-LAA over the 5 GHz U-NII bands currently occupied by Wi-Fi. To support applications with diverse QoS requirements, both LTE and Wi-Fi technologies introduce multiple priority classes with different channel contention parameters for accessing unlicensed bands. How these different priority classes affect the interplay between coexisting LTE and Wi-Fi technologies is still relatively under-explored. In this paper, we develop a simple and efficient framework that helps MNOs assess fair coexistence between MNOs and Wi-Fi operators with prioritized channel access under the multi-channel setting. We derive an approximate closed-form solution for each MNO to pre-evaluate the probability of successful transmission (PST), average contention delay, and average throughput when adopting different priority classes to serve different types of traffic. MNOs and Wi-Fi operators can fit our model using measurements collected offline and/or online, and use it to further optimize their systems’ throughput and latency. Our results reveal that PSTs computed with our approximate closed-form model approach those collected from system-level simulations with around 95% accuracy under scenarios of dense network deployment and high traffic intensity.
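The paper's closed-form model is not reproduced here; as a purely illustrative stand-in, the sketch below computes a per-class success probability under the textbook assumption that a transmission succeeds only if no other contender transmits in the same slot (the attempt probabilities and node counts are made-up inputs):

```python
def pst(attempt_prob, counts, cls):
    """Probability of successful transmission for one node of priority class
    `cls`, assuming independent per-slot attempts: the node transmits and
    every other node (in its own class and the others) stays silent."""
    p = attempt_prob[cls]
    for c, n in counts.items():
        others = n - 1 if c == cls else n
        p *= (1 - attempt_prob[c]) ** others
    return p

# Hypothetical inputs: 10 nodes of a high-priority class attempting with
# probability 0.05 per slot, and 20 best-effort nodes attempting with 0.02.
attempt = {"high": 0.05, "best_effort": 0.02}
nodes = {"high": 10, "best_effort": 20}
print(f"PST(high) ≈ {pst(attempt, nodes, 'high'):.3f}")
```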
  5. 5G aims to offer not only significantly higher throughput than previous generations of cellular networks, but also promises millisecond (ms) and sub-millisecond (ultra-)low latency support at the 5G physical (PHY) layer for future applications. While prior measurement studies have confirmed that commercial 5G deployments can achieve up to several Gigabits per second (Gbps) throughput (especially with the mmWave 5G radio), are they able to deliver on the (sub-)millisecond latency promise? With this question in mind, we conducted, to our knowledge, the first in-depth measurement study of commercial 5G mmWave PHY latency using detailed physical channel events and messages. Through carefully designed experiments and data analytics, we dissect various factors that influence 5G PHY latency of both downlink and uplink data transmissions, and explore their impacts on end-to-end delay. We find that, while in the best cases the 5G (mmWave) PHY layer is capable of delivering ms/sub-ms latency (with a minimum of 0.09 ms for downlink and 0.76 ms for uplink), such cases occur rarely. A variety of factors such as channel conditions, re-transmissions, physical-layer control and scheduling mechanisms, mobility, and application (edge) server placement can all contribute to increased 5G PHY latency (and thus end-to-end (E2E) delay). Our study provides insights to 5G vendors and carriers, as well as application developers/content providers, on how to better optimize or mitigate these factors for improved 5G latency performance.
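A simplified sketch of the kind of dissection described above (the event names and schema are assumptions, not the study's toolchain): pair each scheduling request with the subsequent grant and uplink transmission to split uplink PHY latency into a waiting-for-grant component and a grant-to-transmission component.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PhyEvent:
    """A timestamped physical-channel event (hypothetical schema)."""
    t_ms: float     # timestamp in milliseconds
    kind: str       # "SR" (scheduling request), "UL_GRANT", or "PUSCH_TX"

def uplink_phy_latency(events: List[PhyEvent]):
    """Split each SR -> grant -> PUSCH cycle into its two delay components."""
    cycles = []
    sr_t = grant_t = None
    for ev in sorted(events, key=lambda e: e.t_ms):
        if ev.kind == "SR":
            sr_t, grant_t = ev.t_ms, None
        elif ev.kind == "UL_GRANT" and sr_t is not None:
            grant_t = ev.t_ms
        elif ev.kind == "PUSCH_TX" and grant_t is not None:
            cycles.append({"wait_for_grant_ms": grant_t - sr_t,
                           "grant_to_tx_ms": ev.t_ms - grant_t})
            sr_t = grant_t = None
    return cycles

# Toy trace: SR at t=0, grant 8 ms later, transmission 4 ms after the grant.
trace = [PhyEvent(0.0, "SR"), PhyEvent(8.0, "UL_GRANT"), PhyEvent(12.0, "PUSCH_TX")]
print(uplink_phy_latency(trace))  # [{'wait_for_grant_ms': 8.0, 'grant_to_tx_ms': 4.0}]
```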