Title: Conventional and hyperspectral time-series imaging of maize lines widely used in field trials
Abstract

Background: Maize (Zea mays ssp. mays) is one of three crops, along with rice and wheat, responsible for more than half of all calories consumed around the world. Increasing the yield and stress tolerance of these crops is essential to meet the growing need for food. The cost and speed of plant phenotyping are currently the largest constraints on plant breeding efforts. Datasets linking new types of high-throughput phenotyping data collected from plants to the performance of the same genotypes under agronomic conditions across a wide range of environments are essential for developing new statistical approaches and computer vision–based tools.

Findings: A set of maize inbreds, primarily recently off-patent lines, was phenotyped using a high-throughput platform at the University of Nebraska-Lincoln. These lines have previously been subjected to high-density genotyping and scored for a core set of 13 phenotypes in field trials across 13 North American states over 2 years by the Genomes 2 Fields Consortium. A total of 485 GB of image data, including RGB, hyperspectral, fluorescence, and thermal infrared photos, has been released.

Conclusions: Correlations between image-based measurements and manual measurements demonstrated the feasibility of quantifying variation in plant architecture using image data. However, naive approaches to measuring traits such as biomass can introduce nonrandom measurement errors confounded with genotype variation. Analysis of hyperspectral image data demonstrated unique signatures from stem tissue. Integrating heritable phenotypes from high-throughput phenotyping data with field data from different environments can reveal previously unknown factors that influence yield plasticity.
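The feasibility check described in the Conclusions, correlating image-based measurements against manual measurements of the same plants, can be sketched as follows. The height values below are hypothetical, not taken from the released dataset:

```python
import numpy as np

# Hypothetical example: manually measured plant heights (cm) and the
# corresponding heights extracted from RGB images of the same plants.
manual_height = np.array([112.0, 98.5, 130.2, 121.7, 105.3, 140.1])
image_height = np.array([110.4, 101.2, 127.9, 119.8, 108.0, 137.6])

# Pearson correlation between the two measurement methods; values near 1
# indicate that image-based phenotyping recovers the manual measurement.
r = np.corrcoef(manual_height, image_height)[0, 1]
print(f"Pearson r = {r:.3f}")
```

A systematic check of this kind, trait by trait, is what exposes the nonrandom measurement errors the abstract warns about: a high overall correlation can still hide errors that differ by genotype.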
Award ID(s):
1557417
PAR ID:
10555469
Publisher / Repository:
Oxford University Press
Date Published:
Journal Name:
GigaScience
Volume:
7
Issue:
2
ISSN:
2047-217X
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
1. Abstract: Estimates of plant traits derived from hyperspectral reflectance data have the potential to efficiently substitute for traits that are time- or labor-intensive to score manually. Typical workflows for estimating plant traits from hyperspectral reflectance data employ supervised classification models that can require substantial ground truth datasets for training. We explore the potential of an unsupervised approach, autoencoders, to extract meaningful traits from plant hyperspectral reflectance data using measurements of the reflectance of 2151 individual wavelengths of light from the leaves of maize (Zea mays) plants harvested from 1658 field plots in a replicated field trial. A subset of autoencoder-derived variables exhibited significant repeatability, indicating that a substantial proportion of the total variance in these variables was explained by differences between maize genotypes, while other autoencoder variables appear to capture variation resulting from changes in leaf reflectance between different batches of data collection. Several of the repeatable latent variables were significantly correlated with other traits scored from the same maize field experiment, including one autoencoder-derived latent variable (LV8) that predicted plant chlorophyll content modestly better than a supervised model trained on the same data. In at least one case, genome-wide association study hits for variation in autoencoder-derived variables were proximal to genes with known or plausible links to leaf phenotypes expected to alter hyperspectral reflectance. In aggregate, these results suggest that an unsupervised, autoencoder-based approach can identify meaningful and genetically controlled variation in high-dimensional, high-throughput phenotyping data and link the identified variables back to known plant traits of interest.
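The unsupervised approach above can be sketched with a minimal single-hidden-layer autoencoder. This uses scikit-learn's MLPRegressor trained to reconstruct its own input, on simulated reflectance data far smaller than the real 1658 × 2151 matrix; the network size, activation, and data are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real data: 200 "plots" x 50 wavelengths.
# Reflectance is simulated as a few latent spectral factors plus noise.
latent = rng.normal(size=(200, 3))
loadings = rng.normal(size=(3, 50))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 50))

# A minimal autoencoder: a one-hidden-layer network trained to reproduce
# its own input. The hidden-layer activations are the learned latent variables.
ae = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                  max_iter=2000, random_state=0)
ae.fit(X, X)

# Recover the latent variables by applying the encoder half manually.
hidden = np.tanh(X @ ae.coefs_[0] + ae.intercepts_[0])
print(hidden.shape)  # one 8-dimensional latent vector per plot
```

Downstream, each column of `hidden` would be treated like any other trait: tested for repeatability across replicates and used in genome-wide association analysis.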
2. Abstract

Background: The use of 3D imaging techniques, such as X-ray CT, in root phenotyping has become more widespread in recent years. However, due to the complexity of the root structure, analyzing the resulting 3D volumes to obtain detailed architectural root traits remains a challenging computational problem. In image-based phenotyping of excavated maize root crowns, two types of root features notably missing from existing methods are whorls and the soil line. Whorls are the distinct areas located at the base of each stem node from which roots sprout in a circular pattern (Liu et al., DIRT/3D: 3D root phenotyping for field-grown maize (Zea mays), Plant Physiol. 2021;187(2):739–57, https://doi.org/10.1093/plphys/kiab311). The soil line is where the root stem meets the ground. Knowledge of these features would give biologists deeper insights into root system architecture (RSA) and the below- and above-ground root properties.

Results: We developed TopoRoot+, a computational pipeline that produces architectural traits from 3D X-ray CT volumes of excavated maize root crowns. Building upon the TopoRoot software (Zeng et al., TopoRoot: a method for computing hierarchy and fine-grained traits of maize roots from 3D imaging, Plant Methods. 2021;17(1), https://doi.org/10.1186/s13007-021-00829-z) for computing fine-grained root traits, TopoRoot+ adds the capability to detect whorls, identify nodal roots at each whorl, and compute the soil line location. The new algorithms in TopoRoot+ offer an additional set of fine-grained traits beyond those provided by TopoRoot, including internode distances, root traits at every hierarchy level associated with a whorl, and root traits specific to above or below the ground. TopoRoot+ is validated on a diverse collection of field-grown maize root crowns consisting of nine genotypes and spanning three years. TopoRoot+ runs in minutes for a typical volume size of 400³ voxels on a desktop workstation. Our software and test dataset are freely distributed on GitHub.

Conclusions: TopoRoot+ advances the state of the art in image-based phenotyping of excavated maize root crowns by offering more detailed architectural traits related to whorls and soil lines. Its efficiency makes it well suited for high-throughput image-based root phenotyping.
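One of the new traits, internode distance, reduces to the spacing between consecutive detected whorl positions along the stem. A minimal sketch with hypothetical whorl heights; the hard part, detecting the whorls themselves in a CT volume, is TopoRoot+'s contribution and is not reproduced here:

```python
import numpy as np

# Hypothetical whorl positions along the stem axis (mm from the soil line),
# as a pipeline like TopoRoot+ might report them for one root crown.
whorl_heights = np.array([0.0, 12.5, 27.3, 45.0, 66.8])

# Internode distance is the spacing between consecutive whorls.
internode_distances = np.diff(whorl_heights)
print(internode_distances)  # [12.5 14.8 17.7 21.8]
```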
3. Summary: High-throughput phenotyping systems are powerful, dramatically changing our ability to document, measure, and detect biological phenomena. Here, we describe a cost-effective combination of a custom-built imaging platform and a deep-learning-based computer vision pipeline. A minimal version of the maize (Zea mays) ear scanner was built with low-cost and readily available parts. The scanner rotates a maize ear while a digital camera captures a video of the surface of the ear, which is then digitally flattened into a two-dimensional projection. Segregating GFP and anthocyanin kernel phenotypes are clearly distinguishable in ear projections and can be manually annotated and analyzed using image analysis software. Increased throughput was attained by designing and implementing an automated kernel counting system using transfer learning and a deep learning object detection model. The computer vision model was able to rapidly assess over 390,000 kernels, identifying male-specific transmission defects across a wide range of GFP-marked mutant alleles. This includes a previously undescribed defect putatively associated with mutation of Zm00001d002824, a gene predicted to encode a vacuolar processing enzyme. Thus, by using this system, the quantification of transmission data and other ear and kernel phenotypes can be accelerated and scaled to generate large datasets for robust analyses.
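Detecting a male-specific transmission defect from kernel counts is, at its core, a test for deviation from the 50% transmission expected of a heterozygous marked allele. A minimal sketch with hypothetical counts, not data from the paper:

```python
from scipy.stats import binomtest

# Hypothetical kernel counts from one scanned ear: a GFP-marked allele
# crossed through the male parent should be transmitted to ~50% of kernels;
# a significant deficit suggests a male-specific transmission defect.
gfp_kernels = 180
total_kernels = 450

result = binomtest(gfp_kernels, total_kernels, p=0.5, alternative="less")
print(f"GFP transmission = {gfp_kernels / total_kernels:.1%}, "
      f"p = {result.pvalue:.2e}")
```

With counts in the hundreds per ear and hundreds of thousands of kernels scored overall, even modest transmission deficits become statistically detectable, which is what makes the automated counting throughput valuable.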
4. Abstract: The development of crops with deeper roots holds substantial promise to mitigate the consequences of climate change. Deeper roots improve water uptake, enhancing crop resilience to drought; increase nitrogen capture, reducing fertilizer inputs; and increase carbon sequestration from the atmosphere, improving soil organic fertility. A major bottleneck to achieving these improvements is high-throughput phenotyping to quantify the phenotypes of field-grown roots. We address this bottleneck with Digital Imaging of Root Traits (DIRT)/3D, an image-based 3D root phenotyping platform that measures 18 architecture traits from mature field-grown maize (Zea mays) root crowns (RCs) excavated with the Shovelomics technique. DIRT/3D reliably computed all 18 traits, including the distance between whorls and the number, angles, and diameters of nodal roots, on a test panel of 12 contrasting maize genotypes. The computed results were validated through comparison with manual measurements. Overall, we observed a coefficient of determination of r² > 0.84 and a high broad-sense heritability of H²_mean > 0.6 for all but one trait. The average values of the 18 traits, together with a newly developed descriptor characterizing complete root architecture, distinguished all genotypes. DIRT/3D is a step toward automated quantification of highly occluded maize RCs, and therefore supports breeders and root biologists in improving carbon sequestration and food security in the face of the adverse effects of climate change.
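Broad-sense heritability on a genotype-mean basis, as reported above, can be estimated from the variance components of a balanced trial. A minimal sketch on simulated data; the panel size mirrors the 12 genotypes above, but all values and the replicate count are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical balanced trial: 12 genotypes x 4 replicate root crowns.
# Trait values = genotype effect + residual noise.
n_geno, n_rep = 12, 4
geno_effect = rng.normal(scale=2.0, size=(n_geno, 1))
trait = geno_effect + rng.normal(scale=1.0, size=(n_geno, n_rep))

# One-way ANOVA variance components for a balanced design:
# MS_error from within-genotype variation; sigma2_g from
# (MS_genotype - MS_error) / n_rep.
geno_means = trait.mean(axis=1)
ms_error = ((trait - geno_means[:, None]) ** 2).sum() / (n_geno * (n_rep - 1))
ms_geno = n_rep * ((geno_means - trait.mean()) ** 2).sum() / (n_geno - 1)
sigma2_g = max((ms_geno - ms_error) / n_rep, 0.0)

# Broad-sense heritability of genotype means:
# H2 = sigma2_g / (sigma2_g + sigma2_e / n_rep)
H2 = sigma2_g / (sigma2_g + ms_error / n_rep)
print(f"H2 = {H2:.2f}")
```

A trait with H² > 0.6 on this basis is mostly reproducible across replicates of the same genotype, which is what justifies using the image-derived traits for genotype comparison.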
5. This study describes the evaluation of a range of approaches to semantic segmentation of hyperspectral images of sorghum plants, classifying each pixel as either nonplant or as belonging to one of three organ types (leaf, stalk, panicle). While many current segmentation methods focus on separating plant pixels from background, organ-specific segmentation makes it feasible to measure a wider range of plant properties. Manually scored training data for a set of hyperspectral images collected from a sorghum association population was used to train and evaluate a set of supervised classification models. Many algorithms show acceptable accuracy for this classification task. Algorithms trained on sorghum data are able to accurately classify maize leaves and stalks, but fail to accurately classify maize reproductive organs, which are not directly equivalent to sorghum panicles. Trait measurements extracted from semantic segmentation of sorghum organs can be used both to identify genes known to control variation in previously measured phenotypes (e.g., panicle size and plant height) and to identify signals for genes controlling traits not previously quantified in this population (e.g., stalk/leaf ratio). Organ-level semantic segmentation provides opportunities to identify genes controlling variation in a wide range of morphological phenotypes in sorghum, maize, and other related grain crops.
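Once pixels are manually labeled by organ, hyperspectral semantic segmentation of this kind reduces to ordinary supervised classification of per-pixel spectra. A minimal sketch with a random forest on simulated spectra; the band count, class separability, and classifier choice are assumptions for illustration, not the paper's evaluated models:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: each row is one hyperspectral pixel
# (20 bands here, far fewer than a full spectrum), labeled by organ class.
# Class centers are well separated so the toy problem is learnable.
classes = ["nonplant", "leaf", "stalk", "panicle"]
centers = rng.normal(scale=2.0, size=(4, 20))
labels = rng.integers(0, 4, size=1000)
pixels = centers[labels] + 0.5 * rng.normal(size=(1000, 20))

X_tr, X_te, y_tr, y_te = train_test_split(pixels, labels, random_state=0)

# Per-pixel classification: each spectrum is one independent sample.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy = {clf.score(X_te, y_te):.2f}")
```

The cross-species result in the abstract corresponds to applying a classifier trained on one species' pixels to another's: classes with similar spectra transfer (leaf, stalk), while classes without a direct counterpart (maize reproductive organs vs. sorghum panicles) do not.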