-
The Open Radio Access Network (RAN) paradigm is transforming cellular networks into a system of disaggregated, virtualized, and software-based components. These components self-optimize the network through programmable, closed-loop control that leverages Artificial Intelligence (AI) and Machine Learning (ML) routines. In this context, Deep Reinforcement Learning (DRL) has shown great potential in addressing complex resource allocation problems. However, DRL-based solutions are inherently hard to explain, which hinders their deployment and use in practice. In this paper, we propose EXPLORA, a framework that provides explainability of DRL-based control solutions for the Open RAN ecosystem. EXPLORA synthesizes network-oriented explanations from an attributed graph that links the actions taken by a DRL agent (i.e., the nodes of the graph) to the input state space (i.e., the attributes of each node). This novel approach allows EXPLORA to explain models by providing information on the wireless context in which the DRL agent operates. EXPLORA is also designed to be lightweight enough for real-time operation. We prototype EXPLORA and test it experimentally on an O-RAN-compliant near-real-time RIC deployed on the Colosseum wireless network emulator. We evaluate EXPLORA for agents trained for different purposes and showcase how it generates clear network-oriented explanations. We also show how these explanations can be used to perform informative and targeted intent-based action steering, achieving median transmission bitrate improvements of 4% and tail improvements of 10%.
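The attributed-graph idea can be pictured with a small sketch. The code below is illustrative only, not EXPLORA's implementation; the action name and KPI fields such as `buffer_occupancy` are hypothetical.

```python
# Illustrative sketch, not EXPLORA's implementation: an attributed graph whose
# nodes are actions taken by a DRL agent and whose attributes summarize the
# input state (wireless context) observed when each action was chosen.
from collections import defaultdict
from statistics import mean

class ActionStateGraph:
    def __init__(self):
        # action node -> list of state observations (attribute dictionaries)
        self.attributes = defaultdict(list)

    def record(self, action, state):
        """Attach the observed state (hypothetical per-slice KPIs) to the action node."""
        self.attributes[action].append(state)

    def explain(self, action):
        """Summarize the wireless context in which `action` was typically taken."""
        states = self.attributes[action]
        return {k: mean(s[k] for s in states) for k in states[0]}

# Hypothetical usage: the agent chose a round-robin scheduling policy when
# buffer occupancy was high and the transmission bitrate was moderate.
g = ActionStateGraph()
g.record("round_robin", {"buffer_occupancy": 0.8, "tx_bitrate_mbps": 12.0})
g.record("round_robin", {"buffer_occupancy": 0.7, "tx_bitrate_mbps": 10.5})
print(g.explain("round_robin"))
```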
-
Tomorrow's massive-scale IoT sensor networks are poised to drive uplink traffic demand, especially in areas of dense deployment. To meet this demand, however, network designers leverage tools that often require accurate estimates of Channel State Information (CSI), which incurs a high overhead and thus reduces network throughput. Furthermore, this overhead generally scales with the number of clients, and so is of special concern in such massive IoT sensor networks. While prior work has used transmissions over one frequency band to predict the channel of another frequency band on the same link, this paper takes the next step in the effort to reduce CSI overhead: predicting the CSI of a nearby but distinct link. We propose Cross-Link Channel Prediction (CLCP), a technique that leverages multi-view representation learning to predict the channel response of a large number of users, thereby reducing channel estimation overhead further than previously possible. CLCP's design is highly practical, exploiting existing transmissions rather than dedicated channel sounding or extra pilot signals. We have implemented CLCP for two different Wi-Fi versions, namely 802.11n and 802.11ax, the latter being the leading candidate for future IoT networks. We evaluate CLCP in two large-scale indoor scenarios involving both line-of-sight and non-line-of-sight transmissions with up to 144 different 802.11ax users and four channel bandwidths, from 20 MHz up to 160 MHz. Our results show that CLCP provides a 2× throughput gain over the baseline and a 30% throughput gain over existing prediction algorithms.
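As a rough illustration of the multi-view idea (a sketch under assumed dimensions, not CLCP's implementation), the snippet below encodes the CSI observed on two already-transmitting links into a shared latent space and decodes the CSI of a third, unobserved link.

```python
# Illustrative sketch only, not the CLCP implementation: a toy multi-view model
# that fuses CSI from two observed links ("views") and predicts the CSI of a
# nearby but distinct link. Layer sizes and the 64-subcarrier CSI length are
# arbitrary assumptions, and the data here is random.
import torch
import torch.nn as nn

N_SUB = 64  # assumed per-link CSI length (magnitudes only, for simplicity)

class CrossLinkPredictor(nn.Module):
    def __init__(self, n_sub=N_SUB, latent=32):
        super().__init__()
        # one encoder per observed view (link)
        self.enc_a = nn.Sequential(nn.Linear(n_sub, 64), nn.ReLU(), nn.Linear(64, latent))
        self.enc_b = nn.Sequential(nn.Linear(n_sub, 64), nn.ReLU(), nn.Linear(64, latent))
        # decoder maps the fused latent representation to the target link's CSI
        self.dec = nn.Sequential(nn.Linear(2 * latent, 64), nn.ReLU(), nn.Linear(64, n_sub))

    def forward(self, csi_a, csi_b):
        z = torch.cat([self.enc_a(csi_a), self.enc_b(csi_b)], dim=-1)
        return self.dec(z)

# Toy training step on random tensors standing in for CSI gathered from
# existing transmissions rather than dedicated sounding or extra pilots.
model = CrossLinkPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
csi_a, csi_b, csi_target = (torch.randn(8, N_SUB) for _ in range(3))
loss = nn.functional.mse_loss(model(csi_a, csi_b), csi_target)
loss.backward()
opt.step()
```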
-
The well-known susceptibility of millimeter wave links to human blockage and client mobility has recently motivated researchers to propose approaches that leverage both 802.11ad radios (operating in the 60 GHz band) and legacy 802.11ac radios (operating in the 5 GHz band) in dual-band commercial off-the-shelf devices to simultaneously provide Gbps throughput and reliability. One such approach is via Multipath TCP (MPTCP), a transport layer protocol that is transparent to applications and requires no changes to the underlying wireless drivers. However, MPTCP (as well as other bundling approaches) has only been evaluated to date in 60 GHz WLANs with laptop clients. In this work, we port, for the first time, the MPTCP source code to a dual-band smartphone equipped with an 802.11ad and an 802.11ac radio. We discuss the challenges we face and the system-level optimizations required to enable the phone to support Gbps data rates and yield optimal MPTCP throughput (i.e., the sum of the individual throughputs of the two radios) under ideal conditions. We also evaluate, for the first time, the power consumption of MPTCP in a dual-band 802.11ad/ac smartphone and provide recommendations towards the design of an energy-aware MPTCP scheduler. We make our source code publicly available to enable other researchers to experiment with MPTCP in smartphones equipped with millimeter wave radios.
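The trade-off an energy-aware MPTCP scheduler would have to weigh can be sketched with a toy heuristic (not a design from the paper): keep the lower-power 5 GHz subflow on for reliability and add the 60 GHz subflow only when the demanded rate exceeds what 802.11ac alone can carry. The rate and power figures below are assumptions, not measurements.

```python
# Toy energy-aware subflow selection for a dual-band 802.11ad/ac device.
# Illustrative only; throughput and power numbers are assumed.
from dataclasses import dataclass

@dataclass
class Subflow:
    name: str
    throughput_mbps: float  # estimated achievable rate on this radio
    power_mw: float         # estimated active power draw

def select_subflows(demand_mbps, ac: Subflow, ad: Subflow):
    """Return the subflows to schedule, preferring the lower-power radio alone."""
    if demand_mbps <= ac.throughput_mbps:
        return [ac]      # 5 GHz alone suffices; keep the 60 GHz radio idle
    return [ac, ad]      # bundle both; ideal MPTCP throughput is the sum
                         # of the two subflows' individual throughputs

ac = Subflow("802.11ac (5 GHz)", throughput_mbps=600, power_mw=800)
ad = Subflow("802.11ad (60 GHz)", throughput_mbps=1800, power_mw=1500)
chosen = select_subflows(demand_mbps=1500, ac=ac, ad=ad)
print([s.name for s in chosen], "aggregate:", sum(s.throughput_mbps for s in chosen), "Mbps")
```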