We present the design, implementation, and evaluation of MiFly, a self-localization system for autonomous drones that works across indoor and outdoor environments, including low-visibility, dark, and GPS-denied settings. MiFly performs 6DoF self-localization by leveraging a single millimeter-wave (mmWave) anchor in its vicinity, even if that anchor is visually occluded. MiFly’s core contribution is its joint design of a mmWave anchor and localization algorithm. The low-power anchor features a novel dual-polarization, dual-modulation architecture that enables single-shot 3D localization. The mmWave radars mounted on the drone perform 3D localization relative to the anchor and fuse this data with the drone’s onboard inertial measurement unit (IMU) to estimate its 6DoF trajectory. We implemented and evaluated MiFly on a DJI drone. We collected over 6,600 localization estimates across different trajectory patterns and demonstrate a median localization error of 7 cm and a 90th-percentile error below 15 cm, even in low-light conditions and when the anchor is fully occluded (visually) from the drone. Demo video: youtu.be/LfXfZ26tEok
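The abstract describes fusing anchor-relative 3D radar fixes with IMU data to recover the drone's trajectory. As a rough, hypothetical illustration of that style of radar/IMU fusion (not MiFly's actual estimator; the class name, filter weight, and frame conventions are assumptions), a minimal complementary-filter sketch:

```python
# Hypothetical sketch, not MiFly's actual algorithm: fuse low-rate
# anchor-relative 3D radar fixes with high-rate IMU dead reckoning
# using a simple complementary filter. Names and gains are assumed.
import numpy as np

class RadarImuFuser:
    def __init__(self, alpha=0.9):
        self.alpha = alpha        # trust placed in the IMU prediction
        self.p = np.zeros(3)      # position estimate (m), anchor frame
        self.v = np.zeros(3)      # velocity estimate (m/s)

    def predict(self, accel_world, dt):
        """Dead-reckon with gravity-compensated, world-frame IMU acceleration."""
        self.v += np.asarray(accel_world) * dt
        self.p += self.v * dt

    def correct(self, radar_fix):
        """Blend in an absolute 3D position measured relative to the anchor."""
        self.p = self.alpha * self.p + (1.0 - self.alpha) * np.asarray(radar_fix)
        return self.p
```

A full 6DoF estimate would also fuse orientation; this sketch covers only the translational states.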
                            Development of a Free-Flight Wind Test Facility Featuring a GNSS Simulator to Achieve Immersive Drone Testing
                        
                    
    
Weather, winds, thermals, and turbulence pose an ever-present challenge to small UAS. These challenges become magnified in rough terrain and especially within urban canyons. As the industry moves towards Beyond Visual Line of Sight (BVLOS) and fully autonomous operations, resilience to weather perturbations will be key. As the human decision-maker is removed from the in-situ environment, producing robust control systems will be paramount to the preservation of any airspace system. Safety requirements and regulations demand quantifiable performance metrics to guarantee a safe aerial environment with ever-increasing traffic. In this regard, the effect of wind and weather disturbances on a UAS, and its ability to reject these disturbances, presents some unique concerns. Currently, drone manufacturers and operators rely on outdoor testing during windy days (or in windy locations) and onboard logging to evaluate and improve the flight worthiness, reliability, and perturbation rejection capability of their vehicles. Waiting for the desired weather or travelling to a windier location is cost- and time-inefficient. Moreover, the conditions found on outdoor test sites are difficult to quantify and repeatability is non-existent. To address this situation, a novel testing methodology is proposed, combining artificial wind generation from a multi-fan array wind generator (windshaper), coherent GNSS signal generation, and accurate tracking of the test subject with motion capture cameras. In this environment, the drone being tested can fly freely, follow missions, and experience wind perturbations whilst staying within a modest indoor volume. By coordinating the windshaper, the motion tracking feedback, and the position emulated by the GNSS signal generator with the drone’s mission profile, it was demonstrated that outdoor flight conditions can be reliably recreated in a controlled and repeatable environment. Specifically, thanks to real-time updates of the position simulated by the GNSS signal generator, it was possible to demonstrate that the drone’s perception of the situation is similar to that of a corresponding mission executed outdoors. In this work, the drone was subjected to three distinct flight cases: (1) hover in 2 m/s wind, (2) forward flight at 2 m/s without wind, and (3) forward flight at 2 m/s with a 2 m/s headwind. In each case, it was demonstrated that, using indoor GNSS signal simulation and wind generation, the drone displays the characteristics of a 20 m forward move while actually remaining within ±1 m of its position in the test volume. Further development of this methodology opens the door for fully integrated hardware-in-the-loop simulation of drone flight operations.
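As a loose illustration of the coordination step described above, the sketch below combines the drone's small indoor displacement (from motion capture) with the virtual mission offset to form the position fed to the GNSS simulator. The reference coordinates, flat-earth conversion, and function names are assumptions, not the facility's actual interface.

```python
# Illustrative only: combine motion-capture position and virtual mission
# offset into an emulated GNSS fix. Reference point, flat-earth scaling,
# and interfaces are assumptions, not the test facility's real API.
import math

ORIGIN_LAT, ORIGIN_LON = 46.52, 6.57      # arbitrary reference location
M_PER_DEG_LAT = 111_320.0                 # approx. meters per degree of latitude

def emulated_fix(mocap_xy_m, virtual_offset_m):
    """Return (lat, lon) to feed the GNSS signal generator.

    mocap_xy_m       : drone's (east, north) displacement inside the test volume
    virtual_offset_m : (east, north) displacement of the simulated outdoor mission
    """
    east = mocap_xy_m[0] + virtual_offset_m[0]
    north = mocap_xy_m[1] + virtual_offset_m[1]
    lat = ORIGIN_LAT + north / M_PER_DEG_LAT
    lon = ORIGIN_LON + east / (M_PER_DEG_LAT * math.cos(math.radians(ORIGIN_LAT)))
    return lat, lon
```

Advancing the virtual offset by 20 m over a test while the motion-capture displacement stays within ±1 m reproduces the behavior reported above: the drone perceives a 20 m forward move while remaining essentially stationary in the test volume.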
- Award ID(s): 2132799
- PAR ID: 10393933
- Date Published:
- Journal Name: AIAA SCITECH 2022 Forum
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- 
Recently, sensors deployed on unpiloted aerial systems (UAS) have provided snow depth estimates with high spatial resolution over watershed scales. While light detection and ranging (LiDAR) produces precise snow depth estimates for areas without vegetation cover, precision has generally been poorer in forested areas. At a constant flight speed, the poorest precision within forests is observed beneath tree canopies that retain foliage into or through winter. The precision of LiDAR-derived elevation products is improved by increasing the sample size of ground returns, but doing so reduces the spatial coverage of a mission due to limitations of battery power. We address the influence of flight speed on ground return density for baseline and snow-covered conditions and the subsequent effect on the precision of snow depth estimates across a mixed landscape, while evaluating trade-offs between precision and bias. Prior to and following a snow event in December 2020, UAS flights were conducted at four different flight speeds over a region consisting of three contrasting land types: (1) open field, (2) deciduous forest, and (3) conifer forest. For all cover types, we observed significant improvements in precision as flight speeds were reduced to 2 m/s, as well as increases in the area over which a 2 cm snow depth precision was achieved. On the other hand, snow depth estimate differences were minimized at baseline flight speeds of 2 m/s and 4 m/s and snow-on flight speeds of 6 m/s over open fields, and between 2 and 4 m/s over forest areas. Here, with consideration of precision and estimate bias within each cover type, we make recommendations for ideal flight speeds based on survey ground conditions and vegetation cover (a minimal depth-differencing sketch appears after this list).
- 
We present a model-based approach to estimate the vertical profile of horizontal wind velocity components using motion perturbations of a multirotor unmanned aircraft system (UAS) in both hovering and steady ascending flight. The state estimation framework employed for wind estimation was adapted to a set of closed-loop rigid body models identified for an off-the-shelf quadrotor. The quadrotor models used for wind estimation were characterized for hovering and steady ascending flight conditions ranging between 0 and 2 m/s. The closed-loop models were obtained using system identification algorithms to determine model structures and estimate model parameters. The wind measurement method was validated experimentally above the Virginia Tech Kentland Experimental Aircraft Systems Laboratory by comparing quadrotor estimates against independent measurements from a sonic anemometer and two SoDAR instruments. The comparison showed quadrotor wind estimates in close agreement with the independent wind velocity measurements. However, horizontal wind velocity profiles were difficult to validate using time-synchronized SoDAR measurements. Analysis of the noise intensity and signal-to-noise ratio of the SoDARs showed that close-proximity quadrotor operations can corrupt SoDAR wind measurements, an effect that has not previously been reported.
- 
With the increase in commercially available small unmanned aircraft systems (UAS), new observations in extreme environments are becoming more obtainable. One such application is the fire environment, in which measuring both fire and atmospheric properties is challenging. The Fire and Smoke Model Evaluation Experiment offered the unique opportunity of a large controlled wildfire, which allowed measurements that cannot generally be taken during an active wildfire. Fire–atmosphere interactions have typically been measured from stationary instrumented towers and by remote sensing systems such as lidar. Advances in UAS and compact meteorological instrumentation have allowed for small moving weather stations that can move with the fire front while sampling. This study highlights the use of a DJI Matrice 200 equipped with a TriSonica Mini sonic anemometer weather station to sample the fire environment in an experimental and controlled setting. The weather station was mounted on a carbon fiber pole extending off the side of the platform. The system was tested against an RM Young 81000 sonic anemometer mounted at 6 and 2 m above ground level to assess any bias in the UAS platform. Preliminary data show that this system can be useful for taking vertical profiles of atmospheric variables, in addition to being used in place of meteorological tower measurements when suitable.
- 
Mapping 3D airflow fields is important for many HVAC, industrial, medical, and home applications. However, current approaches are expensive and time-consuming. We present Anemoi, a sub-$100 drone-based system for autonomously mapping 3D airflow fields in indoor environments. Anemoi leverages the effects of airflow on motor control signals to estimate the magnitude and direction of wind at any given point in space. We introduce an exploration algorithm for selecting optimal waypoints that minimize overall airflow estimation uncertainty. We demonstrate through microbenchmarks and real deployments that Anemoi is able to estimate wind speed and direction with errors up to 0.41 m/s and 25.1° lower than the existing state of the art and map 3D airflow fields with an average RMS error of 0.73 m/s.
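For the Anemoi entry just above, a hypothetical sketch of uncertainty-driven waypoint selection (the paper's actual exploration algorithm is not reproduced here; the scoring rule and travel-cost weight are assumptions):

```python
# Hypothetical greedy exploration step, not Anemoi's published algorithm:
# visit the candidate point whose airflow estimate is most uncertain,
# discounted by travel distance. The travel_weight trade-off is assumed.
import numpy as np

def next_waypoint(candidates_xyz, uncertainty, current_xyz, travel_weight=0.1):
    """Pick the candidate that best trades high uncertainty against travel cost."""
    candidates_xyz = np.asarray(candidates_xyz, dtype=float)
    dists = np.linalg.norm(candidates_xyz - np.asarray(current_xyz), axis=1)
    score = np.asarray(uncertainty) - travel_weight * dists
    return candidates_xyz[int(np.argmax(score))]
```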
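Referring back to the snow-depth survey entry earlier in this list, the underlying depth product is a simple difference of co-registered snow-on and snow-off lidar ground surfaces, which can then be checked against a per-cell precision target. A minimal sketch, with array inputs and the 2 cm default used only as illustrative assumptions:

```python
# Minimal sketch of lidar snow-depth differencing and a per-cell precision
# check. Grids are assumed co-registered; values here are illustrative.
import numpy as np

def snow_depth_map(snow_on_dem, snow_off_dem):
    """Per-cell snow depth (m): snow-on ground surface minus snow-off baseline."""
    return np.asarray(snow_on_dem) - np.asarray(snow_off_dem)

def area_meeting_precision(depth_std_map, threshold_m=0.02):
    """Fraction of grid cells whose depth uncertainty meets the target (e.g. 2 cm)."""
    return float(np.mean(np.asarray(depth_std_map) <= threshold_m))
```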