Title: A sensorless drone-based system for mapping indoor 3D airflow gradients: demo abstract
With the global spread of the COVID-19 pandemic, indoor ventilation has become increasingly important in preventing the spread of airborne viruses. Sensors exist to measure wind speed and airflow gradients, but they must be carried by a human or mounted on an autonomous vehicle, robot, or drone that moves around the space to build an airflow map of the environment. In this demonstration, we present DAE, a novel drone-based system that can automatically navigate a space and estimate airflow without any additional sensors attached to the drone. DAE directly utilizes the flight controller data that all drones use to self-stabilize in the air: it estimates airflow gradients in a room from how the flight controller adjusts the drone's motors to compensate for external perturbations and air currents.
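The core idea — inferring airflow from the motor compensation a flight controller already performs to hover in place — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the calibration gain, motor layout, and linear wind model are all assumptions.

```python
import numpy as np

# Hypothetical calibration gain mapping differential motor effort to wind
# speed (m/s per normalized PWM unit); a real system would fit this
# empirically against a reference anemometer.
K_WIND = 8.0

# Quadcopter motor layout (X configuration): unit vectors from the frame
# center toward each motor, in the body frame (x forward, y left).
MOTOR_DIRS = np.array([
    [ 1,  1],   # front-left
    [ 1, -1],   # front-right
    [-1,  1],   # rear-left
    [-1, -1],   # rear-right
]) / np.sqrt(2)

def estimate_wind(pwm, pwm_hover):
    """Estimate a horizontal wind vector (m/s, body frame) from how far
    each motor's command deviates from its no-wind hover baseline."""
    effort = np.asarray(pwm) - np.asarray(pwm_hover)   # per-motor deviation
    # Project the per-motor effort onto the frame geometry to recover the
    # direction of the disturbance the controller is fighting.
    return K_WIND * (effort @ MOTOR_DIRS) / len(effort)

# Front motors deviating upward, rear motors downward: in this toy model
# the estimate points along the drone's forward (+x) axis.
wind = estimate_wind([0.58, 0.58, 0.52, 0.52], [0.55] * 4)
print(wind)
```

The sketch assumes a purely linear, static mapping; in practice the controller's attitude loop, battery sag, and propeller wear would all need to be modeled or calibrated out.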
Award ID(s):
1943396 1837022
PAR ID:
10362791
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 20th Annual International Conference on Mobile Systems, Applications and Services (MobiSys '22)
Page Range / eLocation ID:
634 to 635
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Mapping 3D airflow fields is important for many HVAC, industrial, medical, and home applications. However, current approaches are expensive and time-consuming. We present Anemoi, a sub-$100 drone-based system for autonomously mapping 3D airflow fields in indoor environments. Anemoi leverages the effects of airflow on motor control signals to estimate the magnitude and direction of wind at any given point in space. We introduce an exploration algorithm for selecting optimal waypoints that minimize overall airflow estimation uncertainty. We demonstrate through microbenchmarks and real deployments that Anemoi is able to estimate wind speed and direction with errors up to 0.41 m/s and 25.1° lower than the existing state of the art and map 3D airflow fields with an average RMS error of 0.73 m/s. 
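The exploration step in (1) — choosing the next waypoint so as to reduce overall airflow-estimation uncertainty — can be sketched with a simple greedy rule. The grid, kernel, and uncertainty model below are illustrative assumptions, not Anemoi's actual algorithm:

```python
import numpy as np

def next_waypoint(candidates, visited, length_scale=1.0):
    """Greedy pick: fly to the candidate whose airflow estimate is currently
    least certain, where certainty decays with distance to already-visited
    measurement points (squared-exponential kernel)."""
    best, best_unc = None, -1.0
    for c in candidates:
        if visited:
            d2 = min(np.sum((c - v) ** 2) for v in visited)
            # Uncertainty is low near visited points, high far from them.
            unc = 1.0 - np.exp(-d2 / (2 * length_scale ** 2))
        else:
            unc = 1.0
        if unc > best_unc:
            best, best_unc = c, unc
    return best

# 3x3 grid of candidate points at 1 m altitude; one corner already visited.
grid = [np.array([x, y, 1.0]) for x in range(3) for y in range(3)]
visited = [np.array([0.0, 0.0, 1.0])]
wp = next_waypoint(grid, visited)
print(wp)  # the grid point farthest from the visited corner: [2. 2. 1.]
```

A real planner would also trade uncertainty reduction against flight time and battery, rather than uncertainty alone.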
  2. The paper discusses a deep reinforcement learning (RL) control strategy for fully autonomous vision-based approach and landing of vertical take-off and landing (VTOL) capable unmanned aerial vehicles (UAVs) on ships in the presence of disturbances such as wind gusts. The automation closely follows the Navy helicopter ship landing procedure and therefore detects a horizon bar, installed on most Navy ships as a visual aid for pilots, by applying uniquely developed computer vision techniques. The vision system utilizes the detected corners of the horizon bar and its known dimensions to estimate the relative position and heading angle of the aircraft. A deep RL-based controller was coupled with the vision system to ensure a safe and robust approach and landing in the proximity of the ship, where the airflow is highly turbulent. The vision and RL-based control system was implemented on a quadrotor UAV, and flight tests were conducted in which the UAV approached and landed on a sub-scale ship platform undergoing 6-degree-of-freedom deck motions in the presence of wind gusts. Simulations and flight tests confirmed the superior disturbance rejection capability of the RL controller when subjected to sudden 5 m/s wind gusts in different directions. Specifically, flight tests showed that the deep RL controller achieved a 50% reduction in lateral drift from the flight path and 3 times faster disturbance rejection compared to a nonlinear proportional-integral-derivative controller.
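The pose-estimation idea in (2) — recovering relative position from detected corners of a bar of known dimensions — can be sketched with a pinhole camera model. The bar width, focal length, and image geometry below are assumed values for illustration, not the paper's vision pipeline:

```python
import numpy as np

def relative_pose_from_bar(corners_px, bar_width_m, focal_px, cx):
    """Estimate range (m) and bearing (deg) to a horizontal bar of known
    width from its two detected endpoint pixels (simple pinhole model)."""
    (u1, v1), (u2, v2) = corners_px
    width_px = np.hypot(u2 - u1, v2 - v1)
    depth = focal_px * bar_width_m / width_px          # similar triangles
    u_mid = (u1 + u2) / 2.0
    # Bearing of the bar's midpoint relative to the optical axis.
    bearing = np.degrees(np.arctan2(u_mid - cx, focal_px))
    return depth, bearing

# A 2 m wide bar seen 200 px wide, centered in a 640x480 image (f = 500 px):
depth, bearing = relative_pose_from_bar([(220, 240), (420, 240)],
                                        bar_width_m=2.0, focal_px=500.0, cx=320.0)
print(round(depth, 2), round(bearing, 2))  # 5.0 m away, 0.0 deg bearing
```

A full solution would use all four bar corners and a perspective-n-point solve to recover heading and height as well; the two-point version above only illustrates the range/bearing geometry.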
  3. Unmanned aerial vehicles (UAVs) rely on optical sensors such as cameras and lidar for autonomous operation. However, such optical sensors are error-prone in bad lighting, inclement weather conditions including fog and smoke, and around textureless or transparent surfaces. In this paper, we ask: is it possible to fly UAVs without relying on optical sensors, i.e., can UAVs fly without seeing? We present BatMobility, a lightweight mmWave radar-only perception system for UAVs that eliminates the need for optical sensors. BatMobility enables two core functionalities for UAVs – radio flow estimation (a novel FMCW radar-based alternative for optical flow based on surface-parallel doppler shift) and radar-based collision avoidance. We build BatMobility using commodity sensors and deploy it as a real-time system on a small off-the-shelf quadcopter running an unmodified flight controller. Our evaluation shows that BatMobility achieves comparable or better performance than commercial-grade optical sensors across a wide range of scenarios. 
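The radio flow estimation in (3) rests on the Doppler relation between relative motion and frequency shift. A minimal sketch of the generic FMCW radar Doppler math (not BatMobility's actual pipeline; the 77 GHz carrier is an assumed example):

```python
C = 3e8  # speed of light, m/s

def radial_velocity(doppler_hz, carrier_hz=77e9):
    """Radial velocity implied by a Doppler shift at a given radar carrier
    frequency: v = f_d * lambda / 2 (factor 2 for the two-way radar path)."""
    wavelength = C / carrier_hz
    return doppler_hz * wavelength / 2.0

# A 1 kHz Doppler shift at a 77 GHz carrier corresponds to ~1.95 m/s.
print(round(radial_velocity(1000.0), 2))
```

Surface-parallel flow estimation, as the abstract describes it, then aggregates such per-bin Doppler measurements across the ground surface to recover lateral motion, analogous to optical flow.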
  4. In this article, we first investigate the quality of aerial air pollution measurements and characterize the main error sources of drone-mounted gas sensors. To that end, we build ASTRO+, an aerial-ground pollution monitoring platform, and use it to collect a comprehensive dataset of both aerial and reference air pollution measurements. We show that the dynamic airflow caused by drones affects temperature and humidity levels of the ambient air, which in turn affect the measurement quality of gas sensors. In the second part of this article, we leverage the effects of weather conditions on measurement quality to design an unmanned aerial vehicle mission planning algorithm that adapts the trajectory of the drones while taking into account the quality of aerial measurements. We evaluate our mission planning approach on a Volatile Organic Compound pollution dataset and show a substantial performance improvement that is maintained even when pollution dynamics are high.
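The quality-aware planning in (4) can be sketched as a waypoint score that discounts expected information gain by predicted measurement quality. The quality model, nominal operating point, and constants below are illustrative assumptions, not ASTRO+'s algorithm:

```python
def waypoint_score(info_gain, rel_humidity, temp_c):
    """Toy quality-weighted score: discount a waypoint's information gain by
    a factor that degrades as the ambient conditions at that waypoint move
    away from the gas sensor's (assumed) nominal operating point."""
    quality = 1.0
    quality *= max(0.0, 1.0 - abs(rel_humidity - 50.0) / 100.0)  # % RH
    quality *= max(0.0, 1.0 - abs(temp_c - 25.0) / 50.0)         # deg C
    return info_gain * quality

# Prefer the waypoint where conditions are closer to the sensor's nominal
# operating point, even at equal information gain.
a = waypoint_score(1.0, rel_humidity=55.0, temp_c=26.0)
b = waypoint_score(1.0, rel_humidity=90.0, temp_c=40.0)
print(a > b)  # True
```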
  5. Traditional configurations for mounting Temperature–Humidity (TH) sensors on multirotor Unmanned Aerial Systems (UASs) often suffer from insufficient radiation shielding, exposure to mixed and turbulent air from the propellers, and inconsistent aspiration while situated in the wake of the UAS. Descent profiles using traditional methods are unreliable (compared to ascent profiles) because the UAS descends into air it has already turbulently mixed. Consequently, atmospheric boundary layer profiles that rely on such configurations are bias-prone and unreliable in certain flight patterns (such as descent). This article describes and evaluates a novel sensor housing designed to shield airborne sensors from artificial heat sources and artificial wet-bulbing while pulling air from outside the rotor wash influence. The housing is mounted above the propellers to exploit the rotor-induced pressure deficits, which passively induce a high-speed laminar airflow that aspirates the sensor consistently. Our design is modular, accommodates a variety of other sensors, and is compatible with a wide range of commercially available multirotors. Extensive flight tests conducted at altitudes up to 500 m Above Ground Level (AGL) show that the housing facilitates reliable measurements of boundary layer phenomena and is insensitive to its orientation relative to the ambient wind, even at high vertical/horizontal UAS speeds (up to 5 m/s). A low standard deviation of errors shows good agreement between ascent and descent profiles and demonstrates that our design is reliable for various UAS missions.