Title: Machine Learning Vision and Nonlinear Control Approach for Autonomous Ship Landing of Vertical Flight Aircraft
The paper discusses a machine learning vision and nonlinear control approach for autonomous ship landing of vertical flight aircraft without utilizing GPS signals. The central idea involves automating the Navy helicopter ship landing procedure, in which the pilot uses the ship as the visual reference for long-range tracking but refers to a standardized visual cue installed on most Navy ships, called the "horizon bar", for the final approach and landing phases. This idea is implemented using a uniquely designed nonlinear controller integrated with machine vision. The vision system utilizes machine learning based object detection for long-range ship tracking, and classical computer vision for detecting the visual cue and estimating the aircraft's relative position and orientation during the final approach and landing phases. The nonlinear controller operates on the information estimated by the vision system and has demonstrated robust tracking performance even in the presence of uncertainties. The developed autonomous ship landing system was implemented on a quad-rotor vertical take-off and landing (VTOL) capable unmanned aerial vehicle (UAV) equipped with an onboard camera and was demonstrated on a moving deck that imitates realistic ship deck motions using a Stewart platform and a visual cue equivalent to the horizon bar. Extensive simulations and flight tests were conducted to demonstrate vertical landing safety, tracking capability, and landing accuracy while the deck is in motion.
Award ID(s):
1946890
NSF-PAR ID:
10318628
Author(s) / Creator(s):
Date Published:
Journal Name:
77th Annual National Forum of the Vertical Flight Society
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
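The final-approach step described in the abstract above (estimating relative position and orientation from a detected visual cue of known dimensions) can be sketched with a standard perspective-n-point solve. The snippet below is a minimal illustration under assumed bar dimensions, corner ordering, and camera intrinsics; it is not the paper's implementation.

```python
import cv2
import numpy as np

# Assumed horizon-bar geometry (metres) and corner ordering; the bar dimensions
# and the camera intrinsics below are placeholders, not values from the paper.
BAR_W, BAR_H = 1.2, 0.1
OBJECT_POINTS = np.array([
    [-BAR_W / 2, -BAR_H / 2, 0.0],   # top-left corner, bar frame
    [ BAR_W / 2, -BAR_H / 2, 0.0],   # top-right
    [ BAR_W / 2,  BAR_H / 2, 0.0],   # bottom-right
    [-BAR_W / 2,  BAR_H / 2, 0.0],   # bottom-left
])
CAMERA_MATRIX = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0,   0.0,   1.0]])
DIST_COEFFS = np.zeros(5)  # assume an undistorted (rectified) image

def relative_pose(corners_px: np.ndarray):
    """Relative position and approximate heading from 4 detected bar corners.

    corners_px: (4, 2) pixel coordinates ordered like OBJECT_POINTS.
    Returns (tvec, yaw): bar position in the camera frame [m] and the
    rotation about the camera's vertical axis [rad].
    """
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, corners_px.astype(np.float64),
                                  CAMERA_MATRIX, DIST_COEFFS,
                                  flags=cv2.SOLVEPNP_IPPE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                  # bar frame -> camera frame
    yaw = float(np.arctan2(R[0, 2], R[2, 2]))   # approximation, valid when the bar is near level
    return tvec.reshape(3), yaw
```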
More Like this
  1. The paper discusses an intelligent vision-based control solution for autonomous tracking and landing of Vertical Take-Off and Landing (VTOL) capable Unmanned Aerial Vehicles (UAVs) on ships without utilizing GPS signals. The central idea involves automating the Navy helicopter ship landing procedure, in which the pilot uses the ship as the visual reference for long-range tracking but refers to a standardized visual cue installed on most Navy ships, called the "horizon bar", for the final approach and landing phases. This idea is implemented using a uniquely designed nonlinear controller integrated with machine vision. The vision system utilizes machine learning based object detection for long-range ship tracking and classical computer vision for estimating the aircraft's relative position and orientation from the horizon bar during the final approach and landing phases. The nonlinear controller operates on the information estimated by the vision system and has demonstrated robust tracking performance even in the presence of uncertainties. The developed autonomous ship landing system was implemented on a quad-rotor UAV equipped with an onboard camera, and approach and landing were successfully demonstrated on a moving deck that imitates realistic ship deck motions. Extensive simulations and flight tests were conducted to demonstrate vertical landing safety, tracking capability, and landing accuracy. The video of the real-world experiments and demonstrations is available at this URL.
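The long-range phase in item 1 amounts, at a high level, to steering toward the centre of a detected ship bounding box. The sketch below is an assumed illustration of that step only: it converts a detector's bounding-box centre into a bearing error and a simple proportional yaw command. The field of view, gains, and sign convention are placeholders, not the paper's design.

```python
import numpy as np

# Assumed camera geometry and gains, for illustration only.
IMAGE_WIDTH_PX = 640
HORIZONTAL_FOV = np.deg2rad(90.0)
K_YAW = 0.8          # proportional gain on bearing error [1/s]
V_APPROACH = 3.0     # constant closure speed during long-range tracking [m/s]

def long_range_command(bbox_xyxy):
    """Turn a detected ship bounding box into (forward_speed, yaw_rate).

    bbox_xyxy: (x_min, y_min, x_max, y_max) in pixels from any learned detector.
    The yaw rate steers the nose toward the box centre (positive = nose right,
    an assumed sign convention); forward speed is held constant.
    """
    x_min, _, x_max, _ = bbox_xyxy
    u_centre = 0.5 * (x_min + x_max)
    # Map the pixel offset from the image centre to an approximate bearing angle.
    bearing = (u_centre - IMAGE_WIDTH_PX / 2) / IMAGE_WIDTH_PX * HORIZONTAL_FOV
    return V_APPROACH, K_YAW * bearing

# A detection right of the image centre yields a positive (rightward) yaw command.
print(long_range_command((400, 100, 500, 160)))
```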
  2. The paper discusses a deep reinforcement learning (RL) control strategy for fully autonomous vision-based approach and landing of vertical take-off and landing (VTOL) capable unmanned aerial vehicles (UAVs) on ships in the presence of disturbances such as wind gusts. The automation closely follows the Navy helicopter ship landing procedure; it therefore detects the horizon bar, installed on most Navy ships as a visual aid for pilots, by applying uniquely developed computer vision techniques. The vision system uses the detected corners of the horizon bar and its known dimensions to estimate the relative position and heading angle of the aircraft. A deep RL-based controller was coupled with the vision system to ensure a safe and robust approach and landing in the proximity of the ship, where the airflow is highly turbulent. The vision and RL-based control system was implemented on a quadrotor UAV, and flight tests were conducted in which the UAV approached and landed on a sub-scale ship platform undergoing 6-degree-of-freedom deck motions in the presence of wind gusts. Simulations and flight tests confirmed the superior disturbance rejection capability of the RL controller when subjected to sudden 5 m/s wind gusts in different directions. Specifically, it was observed during flight tests that the deep RL controller demonstrated a 50% reduction in lateral drift from the flight path and 3 times faster disturbance rejection compared with a nonlinear proportional-integral-derivative controller.
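Item 2 couples the vision-derived relative state with a learned controller. The sketch below shows one plausible shape for such a deep RL policy network in PyTorch; the state and action definitions, layer sizes, and action limits are assumptions for illustration and not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class LandingPolicy(nn.Module):
    """Small feed-forward policy: relative state -> bounded velocity command.

    Assumed state: relative position (3), relative heading (1), own velocity (3).
    Assumed action: body-frame velocity command (vx, vy, vz) plus yaw rate.
    """

    def __init__(self, state_dim: int = 7, action_dim: int = 4, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, action_dim), nn.Tanh(),   # actions bounded in [-1, 1]
        )
        # Scale normalized actions to assumed physical limits (2 m/s, 0.5 rad/s).
        self.register_buffer("action_scale", torch.tensor([2.0, 2.0, 2.0, 0.5]))

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state) * self.action_scale


policy = LandingPolicy()
state = torch.zeros(1, 7)     # e.g. hovering directly above the deck centre
command = policy(state)       # (1, 4): vx, vy, vz, yaw_rate
```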
  3. The dataset is derived from HELiX Uncrewed Aircraft System flights that were conducted in the Central Arctic Ocean over sea ice during the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) expedition. The data include Coordinated Universal Time (UTC), downwelling and upwelling shortwave radiation measurements, and position and attitude from the Uncrewed Aircraft System (UAS). Temperature, relative humidity, and pressure from two different sensors are also provided. A quality control flag is associated with each scientific measurement. A flight flag is also included to indicate the different phases of the flight: on the ground, take-off/landing, and in flight. All the data have been synchronized and interpolated at 10 hertz (Hz). The purpose of this dataset is to provide information on albedo over different features of the sea ice (snow, melt pond, ocean). Three flight patterns were flown with the HELiX during the campaign: a grid pattern at constant altitude (15 meters or 7 meters above ground level), hovering flights (2-5 minutes hovering over identified sea ice features at a low altitude of ~3 meters above ground level), and profiles up to 400 meters above ground level. Displaying latitude, longitude, and altitude will help users identify the flight pattern. Albedo measurements have been validated with surface-based measurements; details can be found in de Boer, G., R. Calmer, G. Jozef, J. Cassano, J. Hamilton, D. Lawrence, S. Borenstein, A. Doddi, C. Cox, J. Schmale, A. Preußer, and B. Argrow (2021): Observing the Central Arctic Atmosphere and Surface with University of Colorado Uncrewed Aircraft Systems, Nature Scientific Data, in prep.
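As a hedged sketch of how the HELiX dataset in item 3 might be used, the snippet below loads a hypothetical CSV export, keeps only in-flight samples that pass quality control, and computes albedo as the ratio of upwelling to downwelling shortwave radiation. The file name, column names, and flag values are placeholders; the dataset documentation defines the actual fields.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("helix_flight.csv")

# Keep samples that pass quality control and were collected in flight
# (the qc_flag and flight_flag values used here are assumptions).
good = df[(df["qc_flag"] == 0) & (df["flight_flag"] == 2)].copy()

# Albedo: upwelling / downwelling shortwave irradiance (both in W m^-2).
good["albedo"] = good["sw_up"] / good["sw_down"]

# Summarize albedo along the flight track (data are synchronized at 10 Hz).
print(good[["latitude", "longitude", "altitude", "albedo"]].describe())
```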
  4. Abstract

    We use a novel backstepping method to solve a stabilization problem for a nonlinear system with delayed sampled outputs that are not accurately measured. We provide an application to a system arising in vision‐based landing of airliners that includes coupling between the lateral and longitudinal dynamics, for which we provide performance guarantees in the presence of the delay, nonlinearity, and sampling. Our major contributions are (a) designs of lateral and longitudinal controls for our nonlinear model of an aircraft landing on an unequipped runway, (b) mathematical proofs that our controls ensure that the aircraft being modeled achieves desired alignment with the runway during its align phase, under sampling and delays that arise from image processing of visual information, and (c) comparative simulations exhibiting considerable improvement in control performance compared with previous methods that did not take the coupling of the dynamics or imprecise delayed sampled measurements into account.

     
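The setting in item 4 is feedback computed from sampled, delayed output measurements. The toy loop below is not the paper's backstepping design; under assumed double-integrator lateral dynamics, PD gains, sampling period, and delay, it only illustrates how a zero-order-hold command computed from a delayed sample enters the loop.

```python
DT = 0.01          # integration step [s]
SAMPLE_T = 0.2     # output sampling period [s] (assumed)
DELAY = 0.1        # image-processing delay before a sample is usable [s] (assumed)
KP, KD = 1.0, 1.5  # simple PD gains, stand-ins for the paper's control law

def simulate(t_end: float = 20.0) -> float:
    """Lateral offset y'' = u driven by delayed, sampled, zero-order-hold feedback."""
    y, v = 5.0, 0.0            # initial lateral offset [m] and rate [m/s]
    samples = []               # stored (time, y, v) measurements
    u = 0.0
    steps_per_sample = int(round(SAMPLE_T / DT))
    for k in range(int(round(t_end / DT))):
        t = k * DT
        if k % steps_per_sample == 0:
            samples.append((t, y, v))      # a new (noise-free) measurement
        usable = [s for s in samples if s[0] <= t - DELAY]
        if usable:
            _, y_m, v_m = usable[-1]       # newest sample that has cleared the delay
            u = -KP * y_m - KD * v_m       # held constant until the next update
        v += u * DT                        # integrate the continuous dynamics
        y += v * DT
    return y

print(f"final lateral offset: {simulate():.3f} m")
```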
  5. The Atlantic Tradewind Ocean-Atmosphere Mesoscale Interaction Campaign (ATOMIC) took place from 7 January to 11 July 2020 in the tropical North Atlantic between the eastern edge of Barbados and 51° W, the longitude of the Northwest Tropical Atlantic Station (NTAS) mooring. Measurements were made to gather information on shallow atmospheric convection, the effects of aerosols and clouds on the ocean surface energy budget, and mesoscale oceanic processes. Multiple platforms were deployed during ATOMIC including the NOAA RV Ronald H. Brown (RHB) (7 January to 13 February) and WP-3D Orion (P-3) aircraft (17 January to 10 February), the University of Colorado's Robust Autonomous Aerial Vehicle-Endurant Nimble (RAAVEN) uncrewed aerial system (UAS) (24 January to 15 February), NOAA- and NASA-sponsored Saildrones (12 January to 11 July), and Surface Velocity Program Salinity (SVPS) surface ocean drifters (23 January to 29 April). The RV Ronald H. Brown conducted in situ and remote sensing measurements of oceanic and atmospheric properties with an emphasis on mesoscale oceanic–atmospheric coupling and aerosol–cloud interactions. In addition, the ship served as a launching pad for Wave Gliders, Surface Wave Instrument Floats with Tracking (SWIFTs), and radiosondes. Details of measurements made from the RV Ronald H. Brown, ship-deployed assets, and other platforms closely coordinated with the ship during ATOMIC are provided here. These platforms include Saildrone 1064 and the RAAVEN UAS as well as the Barbados Cloud Observatory (BCO) and Barbados Atmospheric Chemistry Observatory (BACO). Inter-platform comparisons are presented to assess consistency in the data sets. Data sets from the RV Ronald H. Brown and deployed assets have been quality controlled and are publicly available at NOAA's National Centers for Environmental Information (NCEI) data archive (https://www.ncei.noaa.gov/archive/accession/ATOMIC-2020, last access: 2 April 2021). Point-of-contact information and links to individual data sets with digital object identifiers (DOIs) are provided herein.