Title: UAV Assisted Cellular Networks With Renewable Energy Charging Infrastructure: A Reinforcement Learning Approach
Deploying unmanned aerial vehicle (UAV) mounted base stations with a renewable energy charging infrastructure for temporary events (e.g., sporadic hotspots, light reconnaissance missions, or disaster-struck areas where the regular power grid is unavailable) provides a responsive and cost-effective solution for cellular networks. Nevertheless, the energy constraint imposed by renewable energy sources (e.g., solar panels) creates new challenges for recharging coordination. The amount of energy available at a charging station (CS) at any given time varies with the time of day, location, sunlight availability, and the size and quality of the solar panels used. Uncoordinated UAVs make redundant recharging attempts, resulting in severe quality of service (QoS) degradation. The system's stability and lifetime depend on the coordination between the UAVs and the available CSs. In this paper, we develop a time-step-based reinforcement learning algorithm for UAV recharging scheduling and coordination using Q-learning. The agent acts as a central controller of the UAVs in the system and uses ϵ-greedy action selection. The goal of the algorithm is to maximize the average achieved throughput, reduce the number of recharging occurrences, and increase the life-span of the network. Extensive simulations based on experimentally validated UAV and charging energy models reveal that our approach outperforms the benchmark strategies with a 381% longer system duration and 47% fewer recharging occurrences, while achieving 66% of the average throughput of a power-grid-based infrastructure in which the CSs have no energy limitations.
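The tabular Q-learning scheme the abstract describes — a central controller that picks recharging actions with ϵ-greedy exploration and updates state-action values over discrete time steps — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the state and action encodings, reward, and hyperparameter values are hypothetical assumptions.

```python
import random
from collections import defaultdict

class RechargeScheduler:
    """Hypothetical sketch of a central controller that schedules UAV
    recharging via tabular Q-learning with epsilon-greedy exploration.
    States, actions, and rewards here are illustrative, not the paper's."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)   # Q(state, action), defaults to 0.0
        self.actions = actions        # e.g. ("serve", "go_to_cs_0", "go_to_cs_1")
        self.alpha = alpha            # learning rate
        self.gamma = gamma            # discount factor
        self.epsilon = epsilon        # exploration probability

    def select_action(self, state):
        # Epsilon-greedy: explore with probability epsilon, else exploit
        # the action with the highest current Q-value.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        # Standard Q-learning temporal-difference update:
        # Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])
```

In a per-time-step loop, the controller would observe the system state (e.g., UAV battery levels and CS energy stocks), call `select_action`, apply the chosen action, and feed the observed reward (e.g., achieved throughput minus a recharging penalty) back through `update`.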
Award ID(s):
1757207
PAR ID:
10315807
Author(s) / Creator(s):
Date Published:
Journal Name:
MILCOM 2021 - 2021 IEEE Military Communications Conference (MILCOM)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The transition to Electric Vehicles (EVs) for reducing urban greenhouse gas emissions is hindered by the lack of public charging infrastructure, particularly fast-charging stations. Given that electric vehicle fast charging stations (EVFCS) can burden the electricity grid, it is crucial for EVFCS to adopt sustainable energy supply methods while accommodating the growing demands of EVs. Despite recent research efforts to optimize the placement of renewable-powered EV charging stations, current planning methods face challenges when applied at a complex city scale and when integrating renewable energy resources. This study thus introduces a robust decision-making model for optimal EVFCS placement planning integrated with solar power supply in a large and complex urban environment (e.g., Chicago), utilizing an advantage actor-critic (A2C) deep reinforcement learning (DRL) approach. The model balances traffic demand with energy supply, strategically placing charging stations in areas with high traffic density and solar potential. As a result, the model is used to optimally place 1,000 charging stations with a random starting search approach, achieving total reward values of 74.30% and estimating the capacities of potential EVFCS. This study can inform the identification of suitable locations to advance microgrid-based charging infrastructure systems in large urban environments.
  2. The emerging unmanned aerial vehicle (UAV), such as a quadcopter, offers a reliable, controllable, and flexible way of ferrying information from energy-harvesting-powered IoT devices in remote areas to the IoT edge servers. Nonetheless, the employment of UAVs faces a major challenge: the limited flight range due to the necessity of recharging, especially when the charging stations are situated at considerable distances from the monitoring area, resulting in inefficient energy usage. To mitigate these challenges, we proposed placing multiple charging stations in the field, each equipped with a powerful energy harvester and acting as a cluster head to collect data from the sensor nodes under its jurisdiction. In this way, the UAV can remain in the field continuously and collect the data while charging. However, the intermittent and unpredictable nature of energy harvesting can render the information stored at cluster heads stale or even obsolete. To tackle this issue, in this work, we proposed a Deep Reinforcement Learning (DRL) based path-planning approach for UAVs. The DRL agent gathers global information from the UAV to update its input environmental states and outputs the location of the next stop, optimizing the overall age of information of the whole network. The experiments show that the proposed DDQN can reliably reduce the age of information (AoI) by 3.7% compared with baseline techniques.
  3. This paper proposes a solar energy harvesting based modular battery balance system for electric vehicles. The proposed system is designed to charge the battery module with the minimum SOC/voltage using solar power during charging and discharging. With the solar power input, the usable energy of the battery can be improved while driving. For vehicle charging, the charging energy drawn from the grid and the total charging time can be reduced as well. Simulation analysis shows that for a 50 Ah rated battery pack, the overall pure electric drive mileage can be improved by 22.9%, while consumed grid energy and total charging time can be reduced by 9.6% and 9.3%, respectively. In addition, battery life can be improved by around 10-11%. The prototype design and test of a 48 V battery pack vehicle consisting of four 12 V battery modules are carried out. The experimental results validate that the system has good modular balance performance for the 100 Ah battery modules with 5-7 A charging current from solar power, and that the overall usable battery energy has been increased.
  4. Unmanned aerial vehicles (UAVs) can supplement existing ground-based heterogeneous cellular networks (HetNets) by replacing or supporting damaged infrastructure, providing real-time video support at the site of an emergency, offloading traffic in congested areas, extending coverage, and filling coverage gaps. In this paper, we introduce distributed algorithms that leverage UAV mobility, enhanced inter-cell interference coordination (ICIC), and cell range expansion (CRE) techniques defined in 3GPP Release-10 and 3GPP Release-11. Through Monte-Carlo simulations, we compare the system-wide 5th percentile spectral efficiency (5pSE) while optimizing the performance using a brute-force algorithm, a heuristic-based sequential algorithm, and a deep Q-learning algorithm. The autonomous UAVs jointly optimize their location, ICIC parameters, and CRE to maximize 5pSE gains and minimize the outage probability. Our results show that the ICIC technique relying on a simple heuristic outperforms the ICIC technique based on deep Q-learning. Taking advantage of the multiple optimization parameters for interference coordination, the heuristic-based ICIC technique can achieve 5pSE values that are reasonably close to those achieved with exhaustive brute-force search techniques, at a significantly lower computational complexity.
  5. Stoustrup J., Annaswamy A. (Ed.)
    Loads are expected to help the power grid of the future in balancing the highs and lows caused by intermittent renewables such as solar and wind. With appropriate intelligence, loads will be able to manipulate demand around a nominal baseline so that the increase and decrease of demand appears like the charging and discharging of a battery, thereby creating a virtual energy storage (VES) device. An important question for the control systems community is: how should these flexible loads be controlled so that the apparently conflicting goals of maintaining consumers' quality of service (QoS) and providing reliable grid support are both achieved? We advocate a frequency-domain approach to handling both of these issues, along the lines of a recent paper. In this article, we discuss some of the challenges and opportunities in designing appropriate control algorithms and coordination architectures for obtaining reliable VES from flexible loads.