
Title: Detection of soil-borne wheat mosaic virus using hyperspectral imaging: from lab to field scans and from hyperspectral to multispectral data

Hyperspectral imaging allows for rapid, non-destructive and objective assessments of crop health. Narrowband hyperspectral data were used to select wavelength regions that can be exploited to identify wheat infected with soil-borne wheat mosaic virus. First, leaf samples were scanned in the lab to investigate spectral differences between healthy and diseased leaves, including non-symptomatic and symptomatic areas within a diseased leaf. The potential of 84 commonly used vegetation indices to detect infection was explored. A machine-learning approach was used to create a classification model that automatically separates pixels into symptomatic, non-symptomatic and healthy classes. The success rate of the model was 69.7% using the full spectrum. Encouragingly, when using a subset of only four broad bands, sampled to simulate a data set from a much simpler and less costly multispectral camera, accuracy increased to 71.3%. Next, the classification models were validated on field data. Infection in the field was successfully identified using classifiers trained on the entire spectrum of the hyperspectral data acquired in a lab setting, with a best accuracy of 64.9%. Using a subset of wavelengths, simulating multispectral data, the accuracy dropped by only 3 percentage points to 61.9%. This research shows the potential of using lab scans to train classifiers that can be successfully applied in the field, even when simultaneously reducing the hyperspectral data to multispectral data.
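The abstract does not name the specific classifier or the four band positions, so the sketch below is illustrative only: it simulates a four-band multispectral camera by averaging narrow bands inside broad windows (band edges and spectra are made up), then separates the three pixel classes with a minimal nearest-centroid classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic narrowband spectra: 200 bands between 400 and 1000 nm.
wavelengths = np.linspace(400, 1000, 200)
base = np.exp(-((wavelengths - 550) / 120) ** 2)  # toy reflectance shape
classes = {"healthy": 1.00, "non_symptomatic": 0.92, "symptomatic": 0.80}

X, y = [], []
for label, scale_factor in classes.items():
    X.append(scale_factor * base + 0.02 * rng.standard_normal((100, base.size)))
    y += [label] * 100
X, y = np.vstack(X), np.array(y)

# Simulate a 4-band multispectral camera by averaging the narrow bands
# inside four broad windows (band edges are illustrative, not the paper's).
edges = [(450, 520), (520, 600), (630, 690), (760, 900)]
X_ms = np.column_stack([
    X[:, (wavelengths >= lo) & (wavelengths < hi)].mean(axis=1)
    for lo, hi in edges
])

# Minimal nearest-centroid classifier on the four broad bands.
labels = sorted(classes)
centroids = np.vstack([X_ms[y == c].mean(axis=0) for c in labels])
pred = [labels[np.argmin(((v - centroids) ** 2).sum(axis=1))] for v in X_ms]
accuracy = np.mean(pred == y)
print(f"training accuracy on simulated 4-band data: {accuracy:.2f}")
```

On this easy synthetic data the broad-band averages retain enough class separation to classify nearly all pixels, which mirrors the paper's finding that a small set of broad bands can match full-spectrum performance.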

Award ID(s):
1832109, 1832170
Publication Date:
Journal Name:
Precision Agriculture
Springer Science + Business Media
Sponsoring Org:
National Science Foundation
More Like this
  1. Finding trees that are resistant to pathogens is key in preparing for current and future disease threats such as the invasive white pine blister rust. In this study, we analyzed the potential of using hyperspectral imaging to find and diagnose the degree of infection of the non-native white pine blister rust in southwestern white pine seedlings from different seed-source families. A support vector machine was able to automatically detect infection with a classification accuracy of 87% (κ = 0.75) over 16 image collection dates. Hyperspectral imaging missed only 4% of infected seedlings that were impacted in terms of vigor according to experts' assessments. Classification accuracy per family was highly correlated with mortality rate within a family. Moreover, classifying seedlings into a 'growth vigor' grouping used to identify the degree of impact of the disease was possible with 79.7% (κ = 0.69) accuracy. We ranked hyperspectral features for their importance in both classification tasks, considering 84 vegetation indices, simple ratios, normalized difference indices, and first derivatives. The most informative features were identified using a new search algorithm that combines both the p-value of a 2-sample t-test and the Bhattacharyya distance. The normalized photochemical reflectance index (PRIn) ranked first for infection detection. This index also had the highest classification accuracy (83.6%). Indices such as PRIn use only a small subset of the reflectance bands, which could inform future development of less expensive and more data-parsimonious multispectral cameras.
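The abstract does not specify how the search algorithm combines the two statistics, so the following is one plausible combination (an assumption, on synthetic features): score each feature by the univariate Gaussian Bhattacharyya distance between classes plus the negative log of a Welch t-test p-value, then rank.

```python
import numpy as np
from scipy import stats

def bhattacharyya_gauss(m1, s1, m2, s2):
    """Bhattacharyya distance between two univariate Gaussians."""
    v1, v2 = s1 ** 2, s2 ** 2
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))

rng = np.random.default_rng(1)
n, n_features = 80, 6
# Synthetic feature table: only feature 0 separates infected from healthy.
healthy = rng.normal(0.0, 1.0, (n, n_features))
infected = rng.normal(0.0, 1.0, (n, n_features))
infected[:, 0] += 2.0

scores = []
for j in range(n_features):
    h, i = healthy[:, j], infected[:, j]
    p = stats.ttest_ind(h, i, equal_var=False).pvalue  # Welch 2-sample t-test
    d = bhattacharyya_gauss(h.mean(), h.std(ddof=1), i.mean(), i.std(ddof=1))
    # Combine: a large distance and a small p-value both raise the score.
    scores.append(d - np.log10(max(p, 1e-300)))

ranking = np.argsort(scores)[::-1]
print("best feature:", ranking[0])
```

Both statistics reward separation between class distributions, so the informative feature dominates the ranking; the paper's actual weighting of the two terms may differ.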
  2. The ability to detect diseased trees before symptoms emerge is key in forest health management because it allows for more timely and targeted intervention. The objective of this study was to develop an in-field approach for early and rapid detection of beech leaf disease (BLD), an emerging disease of American beech trees, based on supervised classification models of leaf near-infrared (NIR) spectral profiles. To validate the effectiveness of the method, we also utilized a qPCR-based protocol for quantification of the newly identified foliar nematode that is the putative causal agent of BLD, Litylenchus crenatae ssp. mccannii (LCM). NIR spectra were collected in May, July, and September of 2021 and analyzed using support vector machine and random forest algorithms. For the May and July datasets, the models accurately predicted pre-symptomatic leaves (highest testing accuracy = 100%), but also accurately discriminated the spectra based on geographic location (highest testing accuracy = 90%). Therefore, we could not conclude that spectral differences were due to pathogen presence alone. However, the September dataset removed location as a factor and the models accurately discriminated pre-symptomatic from naïve samples (highest testing accuracy = 95.9%). Five spectral bands (2,220, 2,400, 2,346, 1,750, and 1,424 nm), selected using variable selection models, were shared across all models, indicating consistency with respect to phytochemical induction by LCM infection of pre-symptomatic leaves. Our results demonstrate that this technique holds high promise as an in-field diagnostic tool for BLD.
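The study's qPCR protocol is not detailed in the abstract; as background only, the standard-curve arithmetic behind qPCR-based quantification can be sketched as follows (all Ct values and copy numbers below are invented for illustration, not data from the study).

```python
import numpy as np

# Hypothetical qPCR standard curve: Ct values measured for known
# 10-fold dilutions of the target (values are illustrative only).
log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)
ct = np.array([16.1, 19.5, 22.8, 26.2, 29.6])

# Fit Ct = m * log10(copies) + b; a slope near -3.32 means ~100% efficiency.
m, b = np.polyfit(log10_copies, ct, 1)
efficiency = 10 ** (-1 / m) - 1

def quantify(ct_sample):
    """Target copy number inferred from a sample's Ct via the standard curve."""
    return 10 ** ((ct_sample - b) / m)

print(f"slope={m:.2f}, efficiency={efficiency:.0%}")
print(f"sample at Ct 24.5 ≈ {quantify(24.5):.0f} copies")
```

The slope of the fitted line encodes amplification efficiency, and any unknown sample's Ct maps back to an absolute copy number through the inverse of the fit.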
  3. Messinger, David W.; Velez-Reyes, Miguel (Eds.)
    Recently, multispectral and hyperspectral data fusion models based on deep learning have been proposed to generate images with a high spatial and spectral resolution. The general objective is to obtain images that improve spatial resolution while preserving high spectral content. In this work, two deep learning data fusion techniques are characterized in terms of classification accuracy. These methods fuse a high spatial resolution multispectral image with a lower spatial resolution hyperspectral image to generate a high spatial-spectral hyperspectral image. The first model is based on a multi-scale long short-term memory (LSTM) network. The LSTM approach performs the fusion using a multiple step process that transitions from low to high spatial resolution using an intermediate step capable of reducing spatial information loss while preserving spectral content. The second fusion model is based on a convolutional neural network (CNN) data fusion approach. We present fused images using four multi-source datasets with different spatial and spectral resolutions. Both models provide fused images with increased spatial resolution from 8m to 1m. The obtained fused images using the two models are evaluated in terms of classification accuracy on several classifiers: Minimum Distance, Support Vector Machines, Class-Dependent Sparse Representation and CNN classification. The classification results show better performance in both overall and average accuracy for the images generated with the multi-scale LSTM fusion over the CNN fusion.
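The LSTM and CNN fusion models themselves are beyond an abstract-sized sketch, but the underlying task, raising the spatial resolution of a hyperspectral cube using the spatial detail of a co-registered high-resolution multispectral image, can be illustrated with a naive, non-deep-learning baseline (synthetic arrays and an assumed 4x scale factor; this is not the paper's method).

```python
import numpy as np

# Naive fusion baseline: upsample the low-resolution hyperspectral cube,
# then inject the high-frequency spatial detail of the multispectral
# intensity image into every band.
rng = np.random.default_rng(2)
hs = rng.random((4, 4, 30))        # low-res HS: 4x4 pixels, 30 bands
ms = rng.random((16, 16, 4))       # high-res MS: 16x16 pixels, 4 bands
scale = 4

# Nearest-neighbor upsampling of every band via block replication.
hs_up = np.kron(hs, np.ones((scale, scale, 1)))

# Detail = high-res intensity minus its block-mean (low-pass) version.
intensity = ms.mean(axis=2)
low = intensity.reshape(4, scale, 4, scale).mean(axis=(1, 3))
detail = intensity - np.kron(low, np.ones((scale, scale)))

fused = hs_up + detail[:, :, None]  # same spatial detail added to each band
print(fused.shape)
```

Because the injected detail has zero mean within each low-resolution block, block-averaging the fused cube recovers the original hyperspectral values, i.e., the spectral content is preserved while spatial resolution increases, which is the same consistency the deep fusion models aim for.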
  4. Alaska has witnessed a significant increase in wildfire events in recent decades that have been linked to drier and warmer summers. Forest fuel maps play a vital role in wildfire management and risk assessment. Freely available multispectral datasets are widely used for land use and land cover mapping, but they have limited utility for fuel mapping due to their coarse spectral resolution. Hyperspectral datasets have a high spectral resolution, ideal for detailed fuel mapping, but they are limited and expensive to acquire. This study simulates hyperspectral data from Sentinel-2 multispectral data using the spectral response function of the Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) sensor, and normalized ground spectra of gravel, birch, and spruce. We used the Uniform Pattern Decomposition Method (UPDM) for spectral unmixing, which is a sensor-independent method, where each pixel is expressed as the linear sum of standard reference spectra. The simulated hyperspectral data have spectral characteristics of AVIRIS-NG and the reflectance properties of Sentinel-2 data. We validated the simulated spectra by visually and statistically comparing them with real AVIRIS-NG data. We observed a high correlation between the spectra of tree classes collected from AVIRIS-NG and simulated hyperspectral data. Upon performing species level classification, we achieved a classification accuracy of 89% for the simulated hyperspectral data, which is better than the accuracy of Sentinel-2 data (77.8%). We generated a fuel map from the simulated hyperspectral image using the Random Forest classifier. Our study demonstrated that low-cost and high-quality hyperspectral data can be generated from Sentinel-2 data using UPDM for improved land cover and vegetation mapping in the boreal forest.
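UPDM itself is more involved, but its core premise, each pixel expressed as a linear sum of standard reference spectra, corresponds to linear spectral unmixing. The sketch below demonstrates that idea with non-negative least squares on toy gravel/birch/spruce spectra (the spectral shapes and mixing fractions are invented for illustration, not the study's measured ground spectra).

```python
import numpy as np
from scipy.optimize import nnls

# Toy reference spectra for three endmembers (gravel, birch, spruce).
wl = np.linspace(400, 2400, 100)
gravel = 0.30 + 0.05 * np.sin(wl / 300)
birch = 0.10 + 0.40 * np.exp(-((wl - 850) / 200) ** 2)
spruce = 0.08 + 0.25 * np.exp(-((wl - 800) / 150) ** 2)
E = np.column_stack([gravel, birch, spruce])  # bands x endmembers

# A mixed pixel: 20% gravel, 50% birch, 30% spruce, plus small noise.
true_frac = np.array([0.2, 0.5, 0.3])
rng = np.random.default_rng(3)
pixel = E @ true_frac + 0.002 * rng.standard_normal(wl.size)

# Decompose the pixel as a non-negative linear sum of the references.
frac, residual = nnls(E, pixel)
print("recovered fractions:", np.round(frac, 2))
```

Solving this per pixel yields abundance maps for each endmember; UPDM additionally standardizes the decomposition so the coefficients are comparable across sensors.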
  5. The agricultural industry suffers from a significant amount of food waste, some of which originates from an inability to apply site-specific management at the farm-level. Snap bean, a broad-acre crop that covers hundreds of thousands of acres across the USA, is not exempt from this need for informed, within-field, and spatially-explicit management approaches. This study aimed to assess the utility of machine learning algorithms for growth stage and pod maturity classification of snap bean (cv. Huntington), as well as detecting and discriminating spectral and biophysical features that lead to accurate classification results. Four major growth stages and six main sieve size pod maturity levels were evaluated for growth stage and pod maturity classification, respectively. A point-based in situ spectroradiometer in the visible-near-infrared and shortwave-infrared domains (VNIR-SWIR; 400–2500 nm) was used and the radiance values were converted to reflectance to normalize for any illumination change between samples. After preprocessing the raw data, we approached pod maturity assessment with multi-class classification and growth stage determination with binary and multi-class classification methods. Results from the growth stage assessment via the binary method exhibited accuracies ranging from 90–98%, with the best mathematical enhancement method being the continuum-removal approach. The growth stage multi-class classification method used raw reflectance data and identified a pair of wavelengths, 493 nm and 640 nm, in two basic transforms (ratio and normalized difference), yielding high accuracies (~79%). Pod maturity assessment detected narrow-band wavelengths in the VIS and SWIR region, separating between not ready-to-harvest and ready-to-harvest scenarios with classification measures at the ~78% level by using continuum-removed spectra.
Our work is a best-case scenario, i.e., we consider it a stepping-stone to understanding snap bean harvest maturity assessment via hyperspectral sensing at a scalable level (i.e., airborne systems). Future work involves transferring the concepts to unmanned aerial system (UAS) field experiments and validating whether a simple multispectral camera, mounted on a UAS, could incorporate < 10 spectral bands to meet the need of both growth stage and pod maturity classification in snap bean production.
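Continuum removal, the enhancement that performed best for binary growth-stage classification above, divides a spectrum by its upper convex hull so that absorption features are normalized to a common baseline. Here is a from-scratch sketch on a synthetic spectrum with a single toy absorption dip (wavelengths and reflectance values are illustrative).

```python
import numpy as np

def continuum_removed(wl, refl):
    """Divide a spectrum by its upper convex hull (continuum removal)."""
    # Build the upper convex hull with a monotone-chain sweep left to right.
    hull = []
    for i in range(len(wl)):
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # Drop the middle point if it lies on or below the new chord.
            if (y2 - y1) * (wl[i] - x1) <= (refl[i] - y1) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append((wl[i], refl[i]))
    hx, hy = zip(*hull)
    continuum = np.interp(wl, hx, hy)  # hull evaluated at every band
    return refl / continuum

wl = np.linspace(400, 2500, 200)
refl = 0.4 - 0.15 * np.exp(-((wl - 680) / 40) ** 2)  # toy absorption dip
cr = continuum_removed(wl, refl)
print(round(cr.max(), 3), round(cr.min(), 3))
```

The hull touches the spectrum outside the absorption feature, so the continuum-removed values equal 1 there, while the depth of the dip below 1 becomes a baseline-independent measure of the feature, which is why this transform helps classifiers generalize across illumination and brightness differences.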