LiDAR data were acquired over the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH, during the growing season. Data were acquired using a Livox Avia lidar sensor on a Green Valley International LiAir V70 payload. The Livox Avia is a triple-echo 905 nm lidar sensor with a non-repetitive circular scanning pattern that can retrieve ~700,000 returns per second. The sensor payload was flown on board a DJI M300 at an altitude of ~65 m above ground level in a double grid pattern with ~32 m flight line spacing, yielding a return density of >500 points per square meter across the sampling area. Returns were georeferenced to WGS84 UTM Zone 19N coordinates, with heights above the ellipsoid, using Green Valley International's LiGeoreference software with automatic boresight calibration. Outliers were removed, then flight line point clouds were merged. Returns were classified as ground and non-ground using Green Valley International's LiDAR360 software and output as LAS (v1.4) data sets. LAS files were subsequently tiled for publication.
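As a point of orientation for reusing the tiles, the sketch below reads one tile with the open-source laspy library and splits returns on the standard ASPRS ground class (code 2), then sanity-checks point density against the reported figure. The file name is a hypothetical placeholder, and this is not part of the published processing chain, which used LiGeoreference and LiDAR360.

```python
# Minimal sketch (not part of the published workflow): read one tiled
# LAS v1.4 file and separate ground from non-ground returns using the
# standard ASPRS classification codes. File name is a placeholder.
import laspy
import numpy as np

las = laspy.read("thompson_farm_tile_001.las")

# ASPRS classification code 2 marks ground returns.
is_ground = np.asarray(las.classification) == 2
ground_z = np.asarray(las.z)[is_ground]
canopy_z = np.asarray(las.z)[~is_ground]

# Rough return-density check over the tile's bounding box, for
# comparison with the reported >500 points per square meter.
hdr = las.header
area_m2 = (hdr.maxs[0] - hdr.mins[0]) * (hdr.maxs[1] - hdr.mins[1])
print(f"{hdr.point_count} returns, ~{hdr.point_count / area_m2:.0f} pts/m^2")
print(f"ground: {is_ground.sum()}, non-ground: {(~is_ground).sum()}")
```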
Hyperspectral Surface Directional Reflectance Flight Line Mosaics of Thompson Farm Flux Tower Footprint Acquired by Unpiloted Aerial System, 2021
Orthorectified flight line hyperspectral cubes, retiled for publication. Collectively, the tiled hyperspectral cubes cover the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH. Data were acquired using a Headwall Photonics, Inc. Nano VNIR hyperspectral line-scanning imager with 273 bands spanning 400-1000 nm. The sensor was flown on board a DJI M600 hexacopter at an altitude of ~80 m above the forest canopy, yielding ~6 cm GSD. Flight lines were converted from raw sensor observations to upwelling radiance using a vendor-supplied radiometric calibration file for the sensor, then converted to reflectance using a calibration tarp with known reflectance. Finally, cubes were orthorectified against a 1 m DSM in Headwall's SpectralView software, mosaicked into individual flight line cubes, and subsequently tiled for publication.
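The radiance-to-reflectance step was performed in Headwall's SpectralView, but the underlying single-reference conversion against a tarp of known reflectance is simple enough to sketch. Everything below, including array names, is illustrative and assumes a zero-offset linear model per band.

```python
# Illustrative single-tarp radiance-to-reflectance conversion; the
# dataset's actual conversion was done in SpectralView.
import numpy as np

def radiance_to_reflectance(radiance_cube, tarp_spectra, tarp_reflectance):
    """Convert a radiance cube to reflectance using one calibration tarp.

    radiance_cube:    (rows, cols, bands) at-sensor radiance
    tarp_spectra:     (n_pixels, bands) radiance sampled over the tarp
    tarp_reflectance: (bands,) lab-measured tarp reflectance
    """
    # Mean radiance observed over the tarp in each band.
    tarp_radiance = tarp_spectra.mean(axis=0)
    # Per-band gain mapping radiance to reflectance (zero offset assumed).
    gain = tarp_reflectance / tarp_radiance
    return radiance_cube * gain  # broadcasts over the trailing band axis
```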
- Award ID(s): 1638688
- PAR ID: 10400726
- Publisher / Repository: Harvard Dataverse
- Date Published:
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
LiDAR data were acquired over the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH, during leaf-off conditions in November 2022. Data were acquired using a Livox Avia lidar sensor on a Green Valley International LiAir V70 payload. The Livox Avia is a triple-echo 905 nm lidar sensor with a non-repetitive circular scanning pattern that can retrieve ~700,000 returns per second. The sensor payload was flown on board a DJI M300 at an altitude of ~65 m above ground level in a double grid pattern with ~32 m flight line spacing, yielding a return density of >500 points per square meter across the sampling area. Returns were georeferenced to WGS84 UTM Zone 19N coordinates, with heights above the ellipsoid, using Green Valley International's LiGeoreference software with automatic boresight calibration. Outliers were removed, then flight line point clouds were merged. Returns were classified as ground and non-ground using Green Valley International's LiDAR360 software and output as LAS (v1.4) data sets. LAS files were subsequently tiled for publication.
Abstract. Calculating solar-sensor zenith and azimuth angles for hyperspectral images collected by UAVs is important for bidirectional reflectance distribution function (BRDF) correction and radiative transfer modeling-based applications in remote sensing, and these corrections are especially relevant for high-throughput phenotyping and precision agriculture tasks. This study demonstrates an automated Python framework that can calculate the solar-sensor zenith and azimuth angles for a push-broom hyperspectral camera mounted on a UAV. First, the hyperspectral images were radiometrically and geometrically corrected. Second, high-precision Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) data for the flight path were extracted and the corresponding UAV position for each pixel was identified. Finally, the angles were calculated using spherical trigonometry and linear algebra. The results show that the solar zenith angle (SZA) and solar azimuth angle (SAA) calculated by our method provided higher-precision angular values than other available tools. The viewing zenith angle (VZA) was lower near the flight path and higher near the edges of the images. The viewing azimuth angle (VAA) showed higher values to the left and lower values to the right of the flight line. The methods described in this study are easily reproducible for other study areas and applications.
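The study's own framework is not reproduced here, but the geometry it describes can be sketched: viewing angles follow from the vector between a ground pixel and the UAV position in a metric CRS, while solar angles are delegated to the pvlib library as an assumption (the paper computes them directly). Coordinates and timestamps below are placeholders.

```python
# Hedged sketch of per-pixel solar and viewing angle geometry.
import numpy as np
import pandas as pd
import pvlib

def viewing_angles(sensor_xyz, pixel_xyz):
    """Viewing zenith/azimuth (degrees) from a ground pixel toward the
    sensor, with x = easting, y = northing, z = height (same metric CRS)."""
    v = np.asarray(sensor_xyz, float) - np.asarray(pixel_xyz, float)
    vza = np.degrees(np.arctan2(np.hypot(v[0], v[1]), v[2]))  # 0 = nadir view
    vaa = np.degrees(np.arctan2(v[0], v[1])) % 360.0          # clockwise from north
    return vza, vaa

# Solar zenith/azimuth for a placeholder time and location via pvlib.
times = pd.DatetimeIndex(["2021-07-15 15:30"], tz="UTC")
sun = pvlib.solarposition.get_solarposition(times, latitude=43.11, longitude=-70.95)
sza, saa = sun["apparent_zenith"].iloc[0], sun["azimuth"].iloc[0]

vza, vaa = viewing_angles(sensor_xyz=(351200.0, 4774500.0, 80.0),
                          pixel_xyz=(351210.0, 4774480.0, 0.0))
print(f"SZA {sza:.1f}, SAA {saa:.1f}, VZA {vza:.1f}, VAA {vaa:.1f}")
```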
Abstract. This study uses a small unmanned aircraft system equipped with a multispectral sensor to assess various vegetation indices (VIs) for their potential to monitor iron deficiency chlorosis (IDC) in a grain sorghum (Sorghum bicolor L.) crop. IDC is a nutritional disorder that stunts a plant's growth and causes its leaves to yellow due to an iron deficit. The objective of this project is to find the best VI for detecting and monitoring IDC. A series of flights was completed over the course of the growing season and processed using Structure-from-Motion photogrammetry to create orthorectified, multispectral reflectance maps in the red, green, red-edge, and near-infrared wavelengths. Ground data collection methods were used to analyze stress, chlorophyll levels, and grain yield, correlating them to the multispectral imagery for ground control and precise crop examination. The reflectance maps and soil-removed reflectance maps were used to calculate 25 VIs, whose separability was then calculated using a two-class distance measure to determine which contained the largest separation between pixels representing IDC and healthy vegetation. The field-acquired data were used to conclude which VIs achieved the best results for the dataset as a whole and at each level of IDC (low, moderate, and severe). The MERIS terrestrial chlorophyll index, normalized difference red-edge, and normalized green (NG) indices achieved the greatest separation between plants with IDC and healthy vegetation, with NG reaching the highest separability for both soil-included and soil-removed VIs.
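As an illustration of the index-and-separability workflow, the sketch below computes the normalized difference red-edge (NDRE) index from reflectance bands and scores the healthy/IDC split with the M-statistic, a common two-class distance. The study's exact distance measure is not reproduced here, so treat that choice, and the placeholder pixel values, as assumptions.

```python
# Illustrative VI computation and two-class separability scoring.
import numpy as np

def ndre(nir, red_edge):
    """Normalized difference red-edge index from reflectance arrays."""
    return (nir - red_edge) / (nir + red_edge)

def m_statistic(class_a, class_b):
    """Two-class distance |mu_a - mu_b| / (sigma_a + sigma_b); values
    above ~1 are usually read as good separability."""
    return abs(class_a.mean() - class_b.mean()) / (class_a.std() + class_b.std())

# Placeholder pixel populations standing in for a soil-removed VI map.
rng = np.random.default_rng(0)
healthy = ndre(rng.uniform(0.5, 0.7, 1000), rng.uniform(0.25, 0.35, 1000))
chlorotic = ndre(rng.uniform(0.3, 0.5, 1000), rng.uniform(0.25, 0.35, 1000))
print(f"M = {m_statistic(healthy, chlorotic):.2f}")
```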
Abstract. Estimates of plant traits derived from hyperspectral reflectance data have the potential to efficiently substitute for traits that are time- or labor-intensive to score manually. Typical workflows for estimating plant traits from hyperspectral reflectance data employ supervised classification models that can require substantial ground-truth datasets for training. We explore the potential of an unsupervised approach, autoencoders, to extract meaningful traits from plant hyperspectral reflectance data using measurements of the reflectance of 2151 individual wavelengths of light from the leaves of maize (Zea mays) plants harvested from 1658 field plots in a replicated field trial. A subset of autoencoder-derived variables exhibited significant repeatability, indicating that a substantial proportion of the total variance in these variables was explained by differences between maize genotypes, while other autoencoder variables appear to capture variation resulting from changes in leaf reflectance between different batches of data collection. Several of the repeatable latent variables were significantly correlated with other traits scored from the same maize field experiment, including one autoencoder-derived latent variable (LV8) that predicted plant chlorophyll content modestly better than a supervised model trained on the same data. In at least one case, genome-wide association study hits for variation in autoencoder-derived variables were proximal to genes with known or plausible links to leaf phenotypes expected to alter hyperspectral reflectance. In aggregate, these results suggest that an unsupervised, autoencoder-based approach can identify meaningful and genetically controlled variation in high-dimensional, high-throughput phenotyping data and link the identified variables back to known plant traits of interest.
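A minimal PyTorch sketch of the kind of model this abstract describes: a fully connected autoencoder compressing 2151-band leaf spectra to a small set of latent variables. The layer sizes, latent dimension, and training details are assumptions, not the study's architecture.

```python
# Hedged sketch of an autoencoder over 2151-band reflectance spectra;
# architecture and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class SpectraAutoencoder(nn.Module):
    """Compress 2151-band reflectance spectra to n_latent variables."""
    def __init__(self, n_bands=2151, n_latent=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_bands, 256), nn.ReLU(),
                                     nn.Linear(256, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 256), nn.ReLU(),
                                     nn.Linear(256, n_bands))

    def forward(self, x):
        z = self.encoder(x)            # latent variables (cf. "LV8")
        return self.decoder(z), z

model = SpectraAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.rand(64, 2151)               # placeholder batch of spectra
reconstruction, latent = model(x)
loss = nn.functional.mse_loss(reconstruction, x)
loss.backward()
optimizer.step()
```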