Improving root traits that increase the efficiency of nutrient uptake in plants is an opportunity to increase crop production in response to climate-change-induced edaphic stresses. Maize (Zea mays L.) studies have shown large variation in root architecture traits in response to such stresses. Quantifying this response uses high-throughput, image-based phenotyping to characterize root architecture variation across edaphic stresses. Our objective is to test whether commonly used root traits discriminate stress environments and whether a single mathematical description of the complete root architecture reveals a phenotypic spectrum of root architectures in the B73 maize line, using manual, DIRT/2D (Digital Imaging of Root Traits), and DIRT/3D measurements. Maize B73 inbred lines were grown under three field conditions: non-limiting conditions, high nitrogen (N), and low N. A proprietary 3D scanner captured 2D and 3D images of harvested maize roots to compute root descriptors that distinguish shapes of root architecture. The results showed that the normalized mean values of computational root traits from DIRT/2D and DIRT/3D significantly discriminated among environments for B73. We found a strong correlation (R² > 0.8) between traits measured in 3D point clouds and manually measured traits. Ear weight and shoot biomass under low N significantly decreased by …
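The R² > 0.8 agreement reported above can be illustrated with a minimal sketch (not the study's own code; the trait values below are made-up placeholders) that computes the coefficient of determination between manually measured root traits and the same traits derived from 3D point clouds:

```python
# Illustrative sketch: R^2 between manual and point-cloud-derived trait
# measurements. All numbers are hypothetical, not data from the paper.
import numpy as np

def r_squared(manual, derived):
    """R^2 of a simple linear fit of derived values against manual ones.

    For one predictor, R^2 equals the squared Pearson correlation."""
    manual = np.asarray(manual, dtype=float)
    derived = np.asarray(derived, dtype=float)
    r = np.corrcoef(manual, derived)[0, 1]
    return r ** 2

manual_depths = [31.0, 28.5, 35.2, 40.1, 26.8]  # hypothetical rooting depths (cm)
cloud_depths = [30.2, 29.1, 34.8, 41.0, 27.5]   # same traits from 3D point clouds

print(f"R^2 = {r_squared(manual_depths, cloud_depths):.3f}")
```

A correlation this strong suggests the 3D pipeline can substitute for slow manual measurement of these traits.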
Complementary Phenotyping of Maize Root System Architecture by Root Pulling Force and X-Ray Imaging
The root system is critical for the survival of nearly all land plants and a key target for improving abiotic stress tolerance, nutrient accumulation, and yield in crop species. Although many methods of root phenotyping exist, one of the most popular in field studies is the extraction and measurement of the upper portion of the root system, known as the root crown, followed by trait quantification based on manual measurements or 2D imaging. However, 2D techniques are inherently limited by the information available from single points of view. Here, we used X-ray computed tomography to generate highly accurate 3D models of maize root crowns and created computational pipelines capable of measuring 71 features from each sample. This approach improves estimates of the genetic contribution to root system architecture and is refined enough to detect various changes in global root system architecture over developmental time, as well as more subtle changes in root distributions as a result of environmental differences. We demonstrate that root pulling force, a high-throughput method of root extraction that provides an estimate of root mass, is associated with multiple 3D traits from our pipeline. Our combined methodology can therefore be used to calibrate and interpret root …
- Award ID(s):
- 1638507
- Publication Date:
- NSF-PAR ID:
- 10301756
- Journal Name:
- Plant Phenomics
- Volume:
- 2021
- Page Range or eLocation-ID:
- 1 to 12
- ISSN:
- 2643-6515
- Sponsoring Org:
- National Science Foundation
More Like this
-
Root-root interactions alter the architectural organization of individual root systems and therefore affect nutrient foraging (O'Brien et al., 2005). Past reports have shown both detrimental and beneficial effects on crop yield as plants avoid or prefer belowground competition (Li et al., 2006; O'Brien et al., 2005). With little research done on root-root interactions, there is still much to discover about the root phenotypes and functions that arise from them. Quantifying architectural traits of interacting root systems would give researchers insight into the cooperation-versus-competition trade-off belowground. We have begun to develop a soil-filled mesocosm system to perform a series of preliminary studies using 3D imaging to develop metrics of root-root interaction in common beans (Phaseolus vulgaris). Common beans have a relatively fast-growing and sparse adventitious and basal root system, making them a suitable organism for this imaging study. Our second revision of the mesocosm focused on fine-tuning a mesh system that better supports the root architecture during soil removal. We use a lightweight, low-visibility plastic mesh, originally sold as bird netting, that allows image capture from all sides. Traits that we aim to …
-
Abstract. Background: 3D imaging, such as X-ray CT and MRI, has been widely deployed to study plant root structures. Many computational tools exist to extract coarse-grained features from 3D root images, such as total volume, root number, and total root length. However, methods that can accurately and efficiently compute fine-grained root traits, such as root number and geometry at each hierarchy level, are still lacking. These traits would allow biologists to gain deeper insights into the root system architecture. Results: We present TopoRoot, a high-throughput computational method that computes fine-grained architectural traits from 3D images of maize root crowns or root systems. These traits include the number, length, thickness, angle, tortuosity, and number of children for the roots at each level of the hierarchy. TopoRoot combines state-of-the-art algorithms in computer graphics, such as topological simplification and geometric skeletonization, with customized heuristics for robustly obtaining the branching structure and hierarchical information. TopoRoot is validated on both CT scans of excavated field-grown root crowns and simulated images of root systems, and in both cases it was shown to improve the accuracy of traits over existing methods. TopoRoot runs within a few minutes on a desktop workstation for images at the resolution range …
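Once a branching hierarchy has been recovered from a skeleton, per-level traits such as root count, total length, and mean number of children reduce to a tree traversal. The following sketch is a hypothetical illustration of that final step, not TopoRoot's actual implementation:

```python
# Illustrative sketch: per-hierarchy-level trait summaries from a root
# tree. The data structure and the toy root crown are hypothetical.
from collections import defaultdict

def traits_per_level(roots):
    """roots: list of dicts {'level': int, 'length': float, 'children': [...]}.

    Returns {level: (root_count, total_length, mean_children)}."""
    stats = defaultdict(lambda: [0, 0.0, 0])  # count, summed length, child count
    stack = list(roots)
    while stack:
        node = stack.pop()
        s = stats[node['level']]
        s[0] += 1
        s[1] += node['length']
        s[2] += len(node['children'])
        stack.extend(node['children'])
    return {lvl: (c, total, n_child / c) for lvl, (c, total, n_child) in stats.items()}

# Hypothetical crown: one primary root (level 0) bearing two laterals (level 1).
crown = [{'level': 0, 'length': 25.0, 'children': [
    {'level': 1, 'length': 8.0, 'children': []},
    {'level': 1, 'length': 6.5, 'children': []},
]}]
print(traits_per_level(crown))  # {0: (1, 25.0, 2.0), 1: (2, 14.5, 0.0)}
```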
-
Bucksch, Alexander Clarke (Ed.). Understanding root traits is essential to improve water uptake, increase nitrogen capture, and accelerate carbon sequestration from the atmosphere. High-throughput phenotyping to quantify root traits for deeper field-grown roots remains a challenge, however. Recently developed open-source methods use 3D reconstruction algorithms to build 3D models of plant roots from multiple 2D images and can extract root traits and phenotypes. Most of these methods rely on automated image orientation (Structure from Motion) [1] and dense image matching (Multiple View Stereo) algorithms to produce a 3D point cloud or mesh model from 2D images. Until now, the performance of these methods when applied to field-grown roots has not been compared. We tested commonly used open-source pipelines on a test panel of twelve contrasting maize genotypes grown in real field conditions [2-6]. We compare the 3D point clouds produced in terms of number of points, computation time, and model surface density. This comparison study provides insight into the performance of different open-source pipelines for maize root phenotyping and illuminates trade-offs between 3D model quality and performance cost for future high-throughput 3D root phenotyping. DOI: https://doi.org/10.1002/essoar.10508794.2
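Two of the comparison metrics above, point count and surface density, can be sketched for a point cloud stored as an (N, 3) array. This is an assumption-laden toy version, not the study's pipeline: real surface density would be measured against a reconstructed mesh, whereas here an axis-aligned bounding-box area serves as a crude proxy, and the cloud itself is synthetic:

```python
# Illustrative sketch: point count and a bounding-box surface-density
# proxy for a 3D point cloud. The synthetic cloud stands in for an
# SfM/MVS reconstruction of a maize root crown.
import numpy as np

def cloud_metrics(points):
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    w, d, h = pts.max(axis=0) - pts.min(axis=0)  # bounding-box side lengths
    bbox_area = 2 * (w * d + w * h + d * h)
    return n, n / bbox_area  # count, points per unit bounding-box area

rng = np.random.default_rng(0)
cloud = rng.random((5000, 3)) * [10.0, 10.0, 30.0]  # synthetic 10x10x30 volume
n_points, density = cloud_metrics(cloud)
print(n_points, round(density, 2))
```

Comparing such metrics across pipelines on the same genotype gives a rough sense of which reconstruction yields denser, more complete models per unit of computation.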
-
Abstract. High-resolution remote sensing imagery has been increasingly used for flood applications. Different methods have been proposed for flood extent mapping from high-resolution data, from water index creation to image classification. Among these methods, deep learning has shown promising results for flood extent extraction; however, these two-dimensional (2D) image classification methods cannot directly provide water level measurements. This paper presents an integrated approach to extract the flood extent in three dimensions (3D) from UAV data by combining a 2D deep-learning-based flood map with a 3D point cloud extracted by a Structure from Motion (SfM) method. We fine-tuned a pretrained Visual Geometry Group 16 (VGG-16) based fully convolutional model to create a 2D inundation map. The 2D classified map was overlaid on the SfM-based 3D point cloud to create a 3D flood map. The floodwater depth was estimated by subtracting a pre-flood Digital Elevation Model (DEM) from the SfM-based DEM. The results show that the proposed method is efficient in creating a 3D flood extent map to support emergency response and recovery activities during a flood event.
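The depth-estimation step, subtracting the pre-flood DEM from the SfM-derived flood-time DEM within classified water cells, can be sketched in a few lines. This is an illustrative toy (3x3 grids and a stand-in water mask, not the paper's code or data):

```python
# Illustrative sketch: floodwater depth = flood-time DEM - pre-flood DEM,
# restricted to cells labeled as water. Elevations (m) are made up.
import numpy as np

dem_pre = np.array([[5.0, 5.2, 5.1],
                    [4.8, 4.9, 5.0],
                    [4.7, 4.8, 4.9]])    # pre-flood ground elevations
dem_flood = np.array([[5.0, 5.2, 5.1],
                      [5.6, 5.7, 5.0],
                      [5.5, 5.6, 4.9]])  # SfM water-surface elevations
water_mask = dem_flood > dem_pre         # stand-in for the VGG-16 flood map

depth = np.where(water_mask, dem_flood - dem_pre, 0.0)
depth = np.clip(depth, 0.0, None)        # guard against DEM noise going negative
print(depth)
```

In practice the water mask comes from the 2D classifier rather than the DEM difference itself, and both rasters must be co-registered to the same grid before subtraction.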