Title: Hyperspectral Surface Directional Reflectance Flight Line Mosaics of Thompson Farm Flux Tower Footprint Acquired by Unpiloted Aerial System, 2021
Orthorectified flight line hyperspectral cubes retiled for publication. Collectively, the tiled hyperspectral cubes cover the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH. Data were acquired using a Headwall Photonics, Inc. Nano VNIR hyperspectral line-scanning imager with 273 bands spanning 400–1000 nm. The sensor was flown on board a DJI M600 hexacopter at an altitude of ~80 m above the forest canopy, yielding ~6 cm GSD. Flight lines were converted from raw sensor observations to upwelling radiance using a vendor-supplied radiometric calibration file for the sensor, then converted to reflectance using a calibration tarp with known reflectance. Finally, cubes were orthorectified using a 1 m DSM in Headwall's SpectralView software, mosaicked into individual flight line cubes, then subsequently tiled for publication.
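As an illustration of the radiance-to-reflectance step, the sketch below applies a single-reference empirical line correction in Python using the tarp's known spectrum. This is a simplified stand-in for the vendor workflow (the dataset itself was processed in Headwall's SpectralView); the file names, tarp pixel bounds, and zero dark offset are all assumptions.

```python
import numpy as np

# Hypothetical inputs: an upwelling radiance cube (rows, cols, 273 bands)
# and the lab-measured reflectance spectrum of the calibration tarp.
radiance = np.load("flightline_radiance.npy")        # assumed file name
tarp_reflectance = np.load("tarp_reflectance.npy")   # assumed file name

# Mean radiance over the pixels covering the tarp; the row/column bounds
# here are placeholders that would be digitized from the imagery.
tarp_radiance = radiance[100:120, 200:220, :].reshape(-1, 273).mean(axis=0)

# Single-reference empirical line: a per-band gain maps radiance to
# surface reflectance (assumes a dark offset of zero in every band).
gain = tarp_reflectance / tarp_radiance
reflectance = radiance * gain[np.newaxis, np.newaxis, :]
```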
Award ID(s):
1638688
NSF-PAR ID:
10400726
Publisher / Repository:
Harvard Dataverse
Sponsoring Org:
National Science Foundation
More Like This
  1. LiDAR data were acquired over the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH during leaf-off conditions in November 2022. Data were acquired using a LiVox Avia lidar sensor on a Green Valley International LiAirV70 payload. The LiVox Avia is a triple-echo 905 nm lidar sensor with a non-repetitive circular scanning pattern that can retrieve ~700,000 returns per second. The sensor payload was flown on board a DJI M300 at an altitude of ~65 m above ground level in a double grid pattern with ~32 m flight line spacing, yielding a return density across the sampling area of >500 points per square meter. Returns were georeferenced to WGS84 UTM Zone 19N coordinates with heights above ellipsoid using Green Valley International's LiGeoreference software with automatic boresight calibration. Outliers were removed, then flight line point clouds were merged. Returns were classified as ground and non-ground returns using Green Valley International's Lidar360 software and output as LAS (v1.4) data sets. LAS files were subsequently tiled for publication.
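For readers working with the published tiles, a minimal Python sketch using laspy (which reads LAS 1.4) shows how to load a tile, separate ground from non-ground returns by ASPRS classification code, and sanity-check the stated >500 points per square meter density. The tile file name is hypothetical.

```python
import numpy as np
import laspy  # laspy 2.x reads LAS 1.4

las = laspy.read("tf_leafoff_tile_001.las")  # hypothetical tile name
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

# ASPRS standard codes: class 2 = ground; everything else is non-ground here.
ground = np.asarray(las.classification) == 2

# Rough density check against the stated >500 points per square meter,
# using the tile's bounding-box area.
area = (x.max() - x.min()) * (y.max() - y.min())
print(f"{x.size} returns, ~{x.size / area:.0f} pts/m^2, "
      f"{ground.mean():.1%} classified as ground")
```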
  2. LiDAR data were acquired over the footprint of the flux tower and established long-term study plots at Thompson Farm Observatory, Durham, NH during the growing season. Data were acquired using a LiVox Avia lidar sensor on a Green Valley International LiAirV70 payload. The LiVox Avia is a triple-echo 905 nm lidar sensor with a non-repetitive circular scanning pattern that can retrieve ~700,000 returns per second. The sensor payload was flown on board a DJI M300 at an altitude of ~65 m above ground level in a double grid pattern with ~32 m flight line spacing, yielding a return density across the sampling area of >500 points per square meter. Returns were georeferenced to WGS84 UTM Zone 19N coordinates with heights above ellipsoid using Green Valley International's LiGeoreference software with automatic boresight calibration. Outliers were removed, then flight line point clouds were merged. Returns were classified as ground and non-ground returns using Green Valley International's Lidar360 software and output as LAS (v1.4) data sets. LAS files were subsequently tiled for publication.
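Building on the same classified tiles, the sketch below derives a quick height-above-ground surface: the minimum ground elevation per 1 m cell approximates a DTM, and the maximum return height above it gives a simple canopy height model. The grid resolution and tile name are assumptions, and a production workflow would interpolate the DTM rather than leave gaps.

```python
import numpy as np
import laspy

las = laspy.read("tf_growingseason_tile_001.las")  # hypothetical tile name
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)
ground = np.asarray(las.classification) == 2  # ASPRS class 2 = ground

# Minimum ground elevation on a 1 m grid approximates a DTM;
# np.fmin ignores the NaN fill value, so empty cells stay NaN.
res = 1.0
ix = ((x - x.min()) / res).astype(int)
iy = ((y - y.min()) / res).astype(int)
dtm = np.full((ix.max() + 1, iy.max() + 1), np.nan)
np.fmin.at(dtm, (ix[ground], iy[ground]), z[ground])

# Height above ground per return; the per-cell maximum is a simple CHM.
chm = np.full_like(dtm, np.nan)
np.fmax.at(chm, (ix, iy), z - dtm[ix, iy])
```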
  3. Abstract. Calculating solar-sensor zenith and azimuth angles for hyperspectral images collected by UAVs is important for conducting bidirectional reflectance distribution function (BRDF) correction or radiative transfer modeling-based applications in remote sensing. These applications are all the more necessary for high-throughput phenotyping and precision agriculture tasks. This study demonstrates an automated Python framework that can calculate the solar-sensor zenith and azimuth angles for a push-broom hyperspectral camera mounted on a UAV. First, the hyperspectral images were radiometrically and geometrically corrected. Second, high-precision Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) data for the flight path were extracted and the corresponding UAV position for each pixel was identified. Finally, the angles were calculated using spherical trigonometry and linear algebra. The results show that the solar zenith angle (SZA) and solar azimuth angle (SAA) calculated by our method provided higher-precision angular values compared to other available tools. The viewing zenith angle (VZA) was lower near the flight path and higher near the edges of the images. The viewing azimuth angle (VAA) pattern showed higher values to the left and lower values to the right side of the flight line. The methods described in this study are easily reproducible for other study areas and applications.
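The abstract's angle definitions can be illustrated with a short sketch: solar angles from pvlib (swapped in here for the paper's own spherical-trigonometry code) and viewing angles from the UAV-to-pixel vector in a shared UTM frame. The timestamp, site coordinates, and positions below are illustrative, not values from the paper.

```python
import numpy as np
import pandas as pd
import pvlib  # stands in for the paper's own solar-position calculation

# Solar zenith/azimuth for an assumed acquisition time and site location.
times = pd.DatetimeIndex(["2021-07-15 15:30"], tz="UTC")
sol = pvlib.solarposition.get_solarposition(times, latitude=43.11,
                                            longitude=-70.95)
sza = sol["apparent_zenith"].iloc[0]
saa = sol["azimuth"].iloc[0]

# Viewing geometry from illustrative UAV and pixel positions (UTM x, y, z).
uav = np.array([350100.0, 4770200.0, 160.0])
pix = np.array([350120.0, 4770180.0, 80.0])
d = uav - pix
vza = np.degrees(np.arccos(d[2] / np.linalg.norm(d)))   # zenith of view ray
vaa = np.degrees(np.arctan2(d[0], d[1])) % 360          # clockwise from north
```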
  4. The monitoring of agronomic parameters like biomass, water stress, and plant health can benefit from synergistic use of all available remotely sensed information. Multispectral imagery has been used for this purpose for decades, largely with vegetation indices (VIs). Many multispectral VIs exist, typically relying on a single feature—the spectral red edge—for information. Where hyperspectral imagery is available, spectral mixture models can use the full VSWIR spectrum to yield further insight, simultaneously estimating area fractions of multiple materials within mixed pixels. Here we investigate the relationships between VIs and mixture models by comparing hyperspectral endmember fractions to six common multispectral VIs in California’s diverse crops and soils. In so doing, we isolate spectral effects from sensor- and acquisition-specific variability associated with atmosphere, illumination, and view geometry. Specifically, we compare: (1) fractional area of photosynthetic vegetation (Fv) from 64,000,000 3–5 m resolution AVIRIS-ng reflectance spectra; and (2) six popular VIs (NDVI, NIRv, EVI, EVI2, SR, DVI) computed from simulated Planet SuperDove reflectance spectra derived from the AVIRIS-ng spectra. Hyperspectral Fv and multispectral VIs are compared using both parametric (Pearson correlation, ρ) and nonparametric (Mutual Information, MI) metrics. Four VIs (NIRv, DVI, EVI, EVI2) showed strong linear relationships with Fv (ρ > 0.94; MI > 1.2). NIRv and DVI showed strong interrelation (ρ > 0.99, MI > 2.4), but deviated from a 1:1 correspondence with Fv. EVI and EVI2 were strongly interrelated (ρ > 0.99, MI > 2.3) and more closely approximated a 1:1 relationship with Fv. In contrast, NDVI and SR showed a weaker, nonlinear, heteroskedastic relation to Fv (ρ < 0.84, MI = 0.69). NDVI exhibited both especially severe sensitivity to unvegetated background (–0.05 < NDVI < +0.6) and saturation (0.2 < Fv < 0.8 for NDVI = 0.7). The self-consistent atmospheric correction, radiometry, and sun-sensor geometry allows this simulation approach to be further applied to indices, sensors, and landscapes worldwide.
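A compact sketch of the comparison itself: compute the index formulas named above from red/NIR reflectance and score each against Fv with Pearson correlation and mutual information. The synthetic data here merely stands in for the AVIRIS-ng-derived inputs; only the index formulas and the two metrics follow the study.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_regression

# Synthetic stand-in for Fv fractions and simulated SuperDove reflectance.
rng = np.random.default_rng(0)
fv = rng.uniform(0.0, 1.0, 5000)
nir = 0.10 + 0.40 * fv + rng.normal(0, 0.02, fv.size)
red = 0.25 - 0.20 * fv + rng.normal(0, 0.02, fv.size)

ndvi = (nir - red) / (nir + red)
nirv = ndvi * nir                                   # NIRv = NDVI * NIR
dvi = nir - red
evi2 = 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)  # two-band EVI

for name, vi in [("NDVI", ndvi), ("NIRv", nirv), ("DVI", dvi), ("EVI2", evi2)]:
    rho, _ = pearsonr(fv, vi)
    mi = mutual_info_regression(vi.reshape(-1, 1), fv)[0]
    print(f"{name}: rho = {rho:.3f}, MI = {mi:.2f}")
```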

  5. The agricultural industry suffers from a significant amount of food waste, some of which originates from an inability to apply site-specific management at the farm level. Snap bean, a broad-acre crop that covers hundreds of thousands of acres across the USA, is not exempt from this need for informed, within-field, and spatially explicit management approaches. This study aimed to assess the utility of machine learning algorithms for growth stage and pod maturity classification of snap bean (cv. Huntington), as well as for detecting and discriminating the spectral and biophysical features that lead to accurate classification results. Four major growth stages and six main sieve-size pod maturity levels were evaluated for growth stage and pod maturity classification, respectively. A point-based in situ spectroradiometer covering the visible-near-infrared and shortwave-infrared domains (VNIR-SWIR; 400–2500 nm) was used, and the radiance values were converted to reflectance to normalize for any illumination change between samples. After preprocessing the raw data, we approached pod maturity assessment with multi-class classification and growth stage determination with binary and multi-class classification methods. Results from the growth stage assessment via the binary method exhibited accuracies ranging from 90–98%, with the best mathematical enhancement method being the continuum-removal approach. The growth stage multi-class classification method used raw reflectance data and identified a pair of wavelengths, 493 nm and 640 nm, in two basic transforms (ratio and normalized difference), yielding high accuracies (~79%). Pod maturity assessment detected narrow-band wavelengths in the VIS and SWIR regions, separating not-ready-to-harvest from ready-to-harvest scenarios with classification measures at the ~78% level using continuum-removed spectra. Our work is a best-case scenario, i.e., we consider it a stepping-stone toward understanding snap bean harvest maturity assessment via hyperspectral sensing at a scalable level (i.e., airborne systems). Future work involves transferring these concepts to unmanned aerial system (UAS) field experiments and validating whether a simple multispectral camera, mounted on a UAS, could incorporate <10 spectral bands to meet the needs of both growth stage and pod maturity classification in snap bean production.
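The two band transforms the study identified are simple to reproduce; the sketch below evaluates the ratio and normalized-difference transforms of the 493 nm and 640 nm channels on a toy vegetation-like spectrum (the spectrum itself is fabricated for illustration, not study data).

```python
import numpy as np

# Toy VNIR-SWIR spectrum (400-2500 nm at 1 nm spacing) with a broadly
# vegetation-like shape; real inputs would be spectroradiometer samples.
wavelengths = np.arange(400, 2501)
reflectance = np.interp(wavelengths,
                        [400, 550, 680, 760, 1300, 2500],
                        [0.05, 0.12, 0.04, 0.45, 0.40, 0.10])

def band(wl_nm):
    """Reflectance at the channel nearest the requested wavelength."""
    return reflectance[np.argmin(np.abs(wavelengths - wl_nm))]

r493, r640 = band(493), band(640)
ratio = r493 / r640                        # simple ratio transform
norm_diff = (r493 - r640) / (r493 + r640)  # normalized difference transform
print(f"ratio = {ratio:.3f}, normalized difference = {norm_diff:.3f}")
```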