Title: Cotton plant part 3D segmentation and architectural trait extraction using point voxel convolutional neural networks
Abstract
Background
Plant architecture can influence crop yield and quality. Manual extraction of architectural traits is, however, time-consuming, tedious, and error prone. Trait estimation from 3D data addresses occlusion issues through the availability of depth information, while deep learning approaches enable feature learning without manual design. The goal of this study was to develop a data processing workflow by leveraging 3D deep learning models and a novel 3D data annotation tool to segment cotton plant parts and derive important architectural traits.
Results
The Point Voxel Convolutional Neural Network (PVCNN), which combines point- and voxel-based representations of 3D data, required less inference time and achieved better segmentation performance than the point-based networks PointNet and PointNet++. PVCNN achieved the best mIoU (89.12%) and accuracy (96.19%), with an average inference time of 0.88 s. For the seven architectural traits derived from the segmented parts, an R² value of more than 0.8 and a mean absolute percentage error of less than 10% were attained.
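For readers unfamiliar with the reported metric, mean intersection-over-union (mIoU) for multi-class part segmentation is the per-class IoU averaged over classes. A minimal NumPy sketch (the labels and class count below are hypothetical, not the study's data):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes, skipping absent classes."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:  # ignore classes absent from both prediction and target
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy example: 3 plant-part classes predicted over 6 points
pred   = np.array([0, 0, 1, 1, 2, 2])
target = np.array([0, 1, 1, 1, 2, 2])
print(mean_iou(pred, target, 3))  # ≈ 0.7222
```

Per-point accuracy, by contrast, is simply the fraction of points whose predicted label matches the ground truth.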
Conclusion
This plant part segmentation method based on 3D deep learning enables effective and efficient architectural trait measurement from point clouds, which could be useful to advance plant breeding programs and characterization of in-season developmental traits. The plant part segmentation code is available at https://github.com/UGA-BSAIL/plant_3d_deep_learning.
Huang, Xin; Struck, Travis J; Davey, Sean W; Gutenkunst, Ryan N (bioRxiv)
Abstract
Summary
dadi is a popular software package for inferring models of demographic history and natural selection from population genomic data. However, using dadi requires Python scripting and manual parallelization of optimization jobs. We developed dadi-cli to simplify dadi usage and to enable straightforward distributed computing.
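To illustrate what "manual parallelization of optimization jobs" typically means in this setting (this is an editorial sketch, not dadi's API): independent optimization runs from random starting points are launched concurrently and the best result is kept. The objective below is a hypothetical stand-in for a model's negative log-likelihood; in practice such runs are distributed across processes or cluster nodes.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from scipy.optimize import minimize

def objective(params):
    # Hypothetical stand-in for a negative log-likelihood (true optimum at [1, 2])
    return float(np.sum((np.asarray(params) - np.array([1.0, 2.0])) ** 2))

def one_start(x0):
    # One independent optimization job from starting point x0
    return minimize(objective, x0, method="Nelder-Mead")

# Launch several starts concurrently and keep the best fit
rng = np.random.default_rng(0)
starts = rng.uniform(-5.0, 5.0, size=(8, 2))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(one_start, starts))
best = min(results, key=lambda r: r.fun)
print("best parameters:", best.x)
```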
Availability and Implementation
dadi-cli is implemented in Python and released under the Apache License 2.0. The source code is available at https://github.com/xin-huang/dadi-cli. dadi-cli can be installed via PyPI and conda, and is also available through Cacao on Jetstream2 (https://cacao.jetstream-cloud.org/).
Ju, Yiwen; Liu, Alexander E; Oestreich, Kenan; Wang, Tina; Topp, Christopher N; Ju, Tao (Plant Methods)
Abstract
Background
The use of 3D imaging techniques, such as X-ray CT, in root phenotyping has become more widespread in recent years. However, due to the complexity of the root structure, analyzing the resulting 3D volumes to obtain detailed architectural root traits remains a challenging computational problem. When it comes to image-based phenotyping of excavated maize root crowns, two types of root features notably missing from existing methods are the whorls and the soil line. Whorls refer to the distinct areas located at the base of each stem node from which roots sprout in a circular pattern (Liu S, Barrow CS, Hanlon M, Lynch JP, Bucksch A. DIRT/3D: 3D root phenotyping for field-grown maize (Zea mays). Plant Physiol. 2021;187(2):739–57. https://doi.org/10.1093/plphys/kiab311). The soil line is where the root stem meets the ground. Knowledge of these features would give biologists deeper insights into the root system architecture (RSA) and the below- and above-ground root properties.
Results
We developed TopoRoot+, a computational pipeline that produces architectural traits from 3D X-ray CT volumes of excavated maize root crowns. Building upon the TopoRoot software (Zeng D, Li M, Jiang N, Ju Y, Schreiber H, Chambers E, et al. TopoRoot: A method for computing hierarchy and fine-grained traits of maize roots from 3D imaging. Plant Methods. 2021;17(1). https://doi.org/10.1186/s13007-021-00829-z) for computing fine-grained root traits, TopoRoot+ adds the capability to detect whorls, identify nodal roots at each whorl, and compute the soil line location. The new algorithms in TopoRoot+ offer an additional set of fine-grained traits beyond those provided by TopoRoot. The additions include internode distances, root traits at every hierarchy level associated with a whorl, and root traits specific to above or below the ground. TopoRoot+ is validated on a diverse collection of field-grown maize root crowns consisting of nine genotypes and spanning three years. TopoRoot+ runs in minutes for a typical volume size of 400³ on a desktop workstation. Our software and test dataset are freely distributed on GitHub.
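As an illustration of the kinds of traits described above (not TopoRoot+'s actual algorithm or data structures), internode distances and an above/below-ground split follow directly once whorl positions and the soil line are known; the heights below are hypothetical:

```python
import numpy as np

# Hypothetical heights (mm) of detected whorls along the stem, base to top,
# and a detected soil-line height; illustrative values only
whorl_heights = np.array([5.0, 18.0, 34.0, 52.0])
soil_line = 30.0

# Internode distance = spacing between consecutive whorls
internode_distances = np.diff(whorl_heights)      # [13., 16., 18.]

# Split whorls into below- and above-ground sets relative to the soil line
below = whorl_heights[whorl_heights < soil_line]  # [ 5., 18.]
above = whorl_heights[whorl_heights >= soil_line] # [34., 52.]
print(internode_distances, below, above)
```

In the real pipeline these positions come from analyzing the CT volume; the point here is only how the derived traits relate to the detected features.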
Conclusions
TopoRoot+ advances the state of the art in image-based phenotyping of excavated maize root crowns by offering more detailed architectural traits related to whorls and soil lines. The efficiency of TopoRoot+ makes it well suited for high-throughput image-based root phenotyping.
Lam, Olee Hoi Ying; Kattge, Jens; Tautenhahn, Susanne; Boenisch, Gerhard; Kovach, Kyle R; Townsend, Philip A (Ecology and Evolution)
Abstract
Plant trait data are used to quantify how plants respond to environmental factors and can act as indicators of ecosystem function. Measured trait values are influenced by genetics, trade‐offs, competition, environmental conditions, and phenology. These interacting effects on traits are poorly characterized across taxa, and for many traits, measurement protocols are not standardized. As a result, ancillary information about growth and measurement conditions can be highly variable, requiring a flexible data structure. In 2007, the TRY initiative was founded as an integrated database of plant trait data, including ancillary attributes relevant to understanding and interpreting the trait values. The TRY database now integrates around 700 original and collective datasets and has become a central resource of plant trait data. These data are provided in a generic long‐table format, where a unique identifier links different trait records and ancillary data measured on the same entity. Due to the high number of trait records, plant taxa, and types of traits and ancillary data released from the TRY database, data preprocessing is necessary but not straightforward. Here, we present the ‘rtry’ R package, specifically designed to support plant trait data exploration and filtering. By integrating a subset of existing R functions essential for preprocessing, ‘rtry’ avoids the need for users to navigate the extensive R ecosystem and provides the functions under a consistent syntax. ‘rtry’ is therefore easy to use even for beginners in R. Notably, ‘rtry’ does not support data retrieval or analysis; rather, it focuses on the preprocessing tasks to optimize data quality. While ‘rtry’ primarily targets TRY data, its utility extends to data from other sources, such as the National Ecological Observatory Network (NEON). 
The ‘rtry’ package is available on the Comprehensive R Archive Network (CRAN; https://cran.r-project.org/package=rtry) and the GitHub Wiki (https://github.com/MPI-BGC-Functional-Biogeography/rtry/wiki) along with comprehensive documentation and vignettes describing detailed data preprocessing workflows.
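‘rtry’ itself is an R package, but the long-table preprocessing it supports can be illustrated in Python/pandas: one row per trait record or ancillary datum, linked by a shared observation identifier. The column names and values below only loosely mirror the generic TRY layout and are editorial assumptions, not TRY's actual schema:

```python
import pandas as pd

# Hypothetical long-format trait table (illustrative columns, not TRY's schema):
# ancillary rows (e.g. latitude) share an ObservationID with trait rows
df = pd.DataFrame({
    "ObservationID": [1, 1, 2, 2, 3],
    "TraitName": ["Leaf area", None, "Leaf area", None, "SLA"],
    "DataName": ["Leaf area", "Latitude", "Leaf area", "Latitude", "SLA"],
    "StdValue": [12.3, 48.1, 7.9, 51.2, 15.4],
})

# Keep only trait records (ancillary rows have no TraitName) ...
traits = df[df["TraitName"].notna()]

# ... then attach an ancillary attribute measured on the same observation
latitude = df[df["DataName"] == "Latitude"][["ObservationID", "StdValue"]]
latitude = latitude.rename(columns={"StdValue": "Latitude"})
merged = traits.merge(latitude, on="ObservationID", how="left")
print(merged)
```

The same filter-then-link pattern underlies long-table preprocessing generally, which is why such data can also come from other sources such as NEON.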
Rapid adaptation can aid invasive populations in their competitive success. Resource allocation trade‐off hypotheses predict that higher resource availability or the lack of natural enemies in introduced ranges allows for increased growth and reproduction, thus contributing to invasive success. Evidence for such hypotheses is, however, equivocal, and tests among multiple ranges over productivity gradients are required to provide a better understanding of the general applicability of these theories.
Using common gardens, we investigated the adaptive divergence of various constitutive and inducible defence‐related traits between the native North American and introduced European and Australian ranges, while controlling for divergence due to latitudinal trait clines, individual resource budgets, and population differentiation, using >11,000 SNPs.
Rapid, repeated clinal adaptation in defence‐related traits was apparent despite distinct demographic histories. We also identified divergence among ranges in some defence‐related traits, although differences in energy budgets among ranges may explain some, but not all, defence‐related trait divergence. We do not identify a general reduction in defence in concert with an increase in growth among the multiple introduced ranges, as predicted by trade‐off hypotheses.
Synthesis: The rapid spread of invasive species is affected by a multitude of factors, likely including adaptation to climate and escape from natural enemies. Unravelling the mechanisms underlying invasives' success enhances understanding of eco‐evolutionary theory and is essential to inform management strategies in the face of ongoing climate change.
OPEN RESEARCH BADGES
This article has been awarded Open Materials, Open Data, and Preregistered Research Designs Badges. All materials and data are publicly accessible via the Open Science Framework at https://doi.org/10.6084/m9.figshare.8028875.v1, https://github.com/lotteanna/defence_adaptation, and https://doi.org/10.1101/435271.
The primary aim of this study was to develop an open-source Python-based software for the automated analysis of dynamic cell behaviors in microphysiological models using non-confocal microscopy. This research seeks to address the existing gap in accessible tools for high-throughput analysis of endothelial tube formation and cell invasion in vitro, facilitating the rapid assessment of drug sensitivity.
Methods
Our approach involved annotating over 1000 2-mm Z-stacks of a cancer and endothelial cell co-culture model and training machine learning models to automatically calculate cell coverage, cancer invasion depth, and microvessel dynamics. Specifically, cell coverage area was computed using focus stacking and Gaussian mixture models to generate thresholded Z-projections. Cancer invasion depth was determined using a ResNet-50 binary classification model, identifying which Z-planes contained invaded cells and measuring the total invasion depth. Lastly, microvessel dynamics were assessed through a U-Net Xception-style segmentation model for vessel prediction, the DisPerSE algorithm to extract an embedded graph, and then graph analysis to quantify microvessel length and connectivity. To further validate our software, we reanalyzed an image set from a high-throughput drug screen involving a chemotherapy agent on a 3D cervical and endothelial co-culture model. Lastly, we applied this software to two naive image datasets from co-culture lumen and microvascular fragment models.
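The Gaussian-mixture thresholding step described above can be sketched with scikit-learn: fit a two-component mixture to a Z-projection's pixel intensities and threshold at the brighter component. This is a generic editorial version with synthetic intensities, not the authors' exact implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic Z-projection intensities: dim background plus brighter
# cell-covered pixels (illustrative values only)
rng = np.random.default_rng(42)
background = rng.normal(20.0, 5.0, size=4000)
foreground = rng.normal(120.0, 15.0, size=1000)
pixels = np.concatenate([background, foreground]).reshape(-1, 1)

# Fit a two-component Gaussian mixture; the component with the higher
# mean is taken as foreground, yielding a cell-coverage mask
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
fg_component = int(np.argmax(gmm.means_.ravel()))
mask = gmm.predict(pixels) == fg_component

coverage_fraction = mask.mean()
print(f"estimated cell coverage: {coverage_fraction:.2%}")
```

Unlike a fixed global threshold, the mixture fit adapts the cutoff to each image's intensity distribution, which matters when illumination varies across wells.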
Results
The software accurately measured cell coverage, cancer invasion, and microvessel length, yielding drug sensitivity IC50 values at a 95% confidence level compared to manual calculations. This approach significantly reduced the image processing time from weeks down to hours. Furthermore, the software was able to calculate cell coverage, microvessel length, and invasion depth from two additional microphysiological models that were imaged with confocal microscopy, highlighting the versatility of the software.
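IC50 values like those reported are conventionally obtained by fitting a sigmoidal dose-response curve to the measured readout. A minimal SciPy sketch with synthetic data (an editorial illustration, not the authors' pipeline; the parameter values are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ic50, slope):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ic50) ** slope)

# Synthetic measurements: response falls as drug concentration rises,
# generated from a true IC50 of 1.0 uM plus noise (illustrative only)
doses = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
rng = np.random.default_rng(1)
response = hill(doses, 5.0, 100.0, 1.0, 1.2) + rng.normal(0.0, 2.0, doses.size)

# Bounded fit keeps IC50 and slope positive during optimization
params, _ = curve_fit(
    hill, doses, response, p0=[0.0, 100.0, 0.5, 1.0],
    bounds=([-50.0, 50.0, 1e-3, 0.1], [50.0, 150.0, 100.0, 5.0]),
)
print(f"estimated IC50 ~ {params[2]:.2f} uM")
```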
Conclusions
Our free and open-source software offers an automated solution for quantifying 3D cell behavior in microphysiological models assessed using non-confocal microscopy, providing the broader Cellular and Molecular Bioengineering community with an alternative to standard confocal microscopy paired with proprietary software. This software can be found in our GitHub repository: https://github.com/fogg-lab/tissue-model-analysis-tools.
Saeed, Farah, Sun, Shangpeng, Rodriguez-Sanchez, Javier, Snider, John, Liu, Tianming, and Li, Changying.
"Cotton plant part 3D segmentation and architectural trait extraction using point voxel convolutional neural networks". Plant Methods 19 (1). BMC. https://doi.org/10.1186/s13007-023-00996-1. https://par.nsf.gov/biblio/10552587.