Abstract Prostate cancer treatment decisions rely heavily on subjective visual interpretation [assigning Gleason patterns or International Society of Urological Pathology (ISUP) grade groups] of limited numbers of two-dimensional (2D) histology sections. Under this paradigm, interobserver variance is high, and ISUP grades do not correlate well with outcome for individual patients, which contributes to the over- and undertreatment of patients. Recent studies have demonstrated improved prognostication of prostate cancer outcomes based on computational analyses of glands and nuclei within 2D whole slide images. Our group has also shown that computational analysis of three-dimensional (3D) glandular features, extracted from 3D pathology datasets of whole intact biopsies, can improve recurrence prediction compared to corresponding 2D features. Here we expand on these prior studies by exploring the prognostic value of 3D shape-based nuclear features in prostate cancer (e.g. nuclear size and sphericity). 3D pathology datasets were generated using open-top light-sheet (OTLS) microscopy of 102 cancer-containing biopsies extracted ex vivo from the prostatectomy specimens of 46 patients. A deep learning-based workflow was developed for 3D nuclear segmentation within the glandular epithelium versus stromal regions of the biopsies. 3D shape-based nuclear features were extracted, and a nested cross-validation scheme was used to train a supervised machine-learning classifier based on 5-year biochemical recurrence (BCR) outcomes. Nuclear features of the glandular epithelium were found to be more prognostic than stromal cell nuclear features (area under the ROC curve [AUC] = 0.72 versus 0.63). 3D shape-based nuclear features of the glandular epithelium were also more strongly associated with the risk of BCR than analogous 2D features (AUC = 0.72 versus 0.62).
The results of this preliminary investigation suggest that 3D shape‐based nuclear features are associated with prostate cancer aggressiveness and could be of value for the development of decision‐support tools. © 2023 The Pathological Society of Great Britain and Ireland. 
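The 3D shape-based nuclear features named above (e.g. size and sphericity) can be computed directly from a binary segmentation mask. The sketch below is illustrative only and is not the authors' pipeline: the function name is hypothetical, voxels are assumed isotropic with unit spacing, and surface area is approximated by counting exposed voxel faces, which overestimates the area of smooth shapes and therefore gives a conservative sphericity.

```python
import numpy as np

def shape_features_3d(mask: np.ndarray) -> dict:
    """Simple 3D shape features for a single binary nuclear mask.

    Volume is the voxel count (unit spacing assumed). Surface area is
    approximated by counting exposed voxel faces. Sphericity follows
    Wadell's definition: pi^(1/3) * (6V)^(2/3) / A, equal to 1 for a
    perfect sphere and smaller for irregular shapes.
    """
    mask = mask.astype(bool)
    volume = int(mask.sum())

    # Count faces where a foreground voxel borders background.
    # Padding by one voxel makes np.roll safe at the array edges.
    padded = np.pad(mask, 1)
    exposed = 0
    for axis in range(3):
        for shift in (1, -1):
            neighbor = np.roll(padded, shift, axis=axis)
            exposed += int(np.logical_and(padded, ~neighbor).sum())

    sphericity = (np.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / exposed
    return {"volume": volume, "surface_area": exposed, "sphericity": sphericity}
```

For a 4x4x4 voxel cube this yields volume 64, surface area 96, and sphericity about 0.806, the exact analytic value for a cube.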
                            Prostate Cancer Risk Stratification via Nondestructive 3D Pathology with Deep Learning–Assisted Gland Analysis
                        
                    
    
Abstract Prostate cancer treatment planning is largely dependent upon examination of core-needle biopsies. The microscopic architecture of the prostate glands forms the basis for prognostic grading by pathologists. Interpretation of these convoluted three-dimensional (3D) glandular structures via visual inspection of a limited number of two-dimensional (2D) histology sections is often unreliable, which contributes to the under- and overtreatment of patients. To improve risk assessment and treatment decisions, we have developed a workflow for nondestructive 3D pathology and computational analysis of whole prostate biopsies labeled with a rapid and inexpensive fluorescent analogue of standard hematoxylin and eosin (H&E) staining. This analysis is based on interpretable glandular features and is facilitated by the development of image translation–assisted segmentation in 3D (ITAS3D). ITAS3D is a generalizable deep learning–based strategy that enables tissue microstructures to be volumetrically segmented in an annotation-free and objective (biomarker-based) manner without requiring immunolabeling. As a preliminary demonstration of the translational value of a computational 3D versus a computational 2D pathology approach, we imaged 300 ex vivo biopsies extracted from 50 archived radical prostatectomy specimens, of which 118 biopsies contained cancer. The 3D glandular features in cancer biopsies were superior to corresponding 2D features for risk stratification of patients with low- to intermediate-risk prostate cancer based on their clinical biochemical recurrence outcomes. The results of this study support the use of computational 3D pathology for guiding the clinical management of prostate cancer. Significance: An end-to-end pipeline for deep learning–assisted computational 3D histology analysis of whole prostate biopsies shows that nondestructive 3D pathology has the potential to enable superior prognostic stratification of patients with prostate cancer.
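The head-to-head comparisons in these studies (3D versus 2D features, epithelial versus stromal nuclei) are reported as areas under the ROC curve. For a single scalar feature, the AUC equals the probability that a randomly chosen recurrence case scores higher than a randomly chosen non-recurrence case, and can be computed directly from the Mann–Whitney U statistic. A minimal sketch, not the authors' code (the function name is hypothetical):

```python
import numpy as np

def auc_score(feature: np.ndarray, label: np.ndarray) -> float:
    """AUC of a scalar feature against a binary outcome (1 = event, 0 = no event),
    computed as the Mann-Whitney U statistic normalized by n_pos * n_neg."""
    pos = feature[label == 1]
    neg = feature[label == 0]
    # Fraction of (positive, negative) pairs ranked correctly; ties count 0.5.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

A feature that perfectly separates the classes gives AUC = 1.0; a feature unrelated to outcome gives roughly 0.5, which is why values such as 0.72 versus 0.62 represent a meaningful difference in prognostic signal.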
                            - Award ID(s):
- 1934292
- PAR ID:
- 10370177
- Publisher / Repository:
- DOI PREFIX: 10.1158
- Date Published:
- Journal Name:
- Cancer Research
- Volume:
- 82
- Issue:
- 2
- ISSN:
- 0008-5472
- Page Range / eLocation ID:
- p. 334-345
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- 
Abstract In recent years, technological advances in tissue preparation, high-throughput volumetric microscopy, and computational infrastructure have enabled rapid developments in nondestructive 3D pathology, in which high-resolution histologic datasets are obtained from thick tissue specimens, such as whole biopsies, without the need for physical sectioning onto glass slides. While 3D pathology generates massive datasets that are attractive for automated computational analysis, there is also a desire to use 3D pathology to improve the visual assessment of tissue histology. In this perspective, we discuss and provide examples of potential advantages of 3D pathology for the visual assessment of clinical specimens and the challenges of dealing with large 3D datasets (of individual or multiple specimens) that pathologists have not been trained to interpret. We discuss the need for artificial intelligence triaging algorithms and explainable analysis methods to assist pathologists or other domain experts in the interpretation of these novel, often complex, large datasets.
- 
Abstract Background: This study presents user evaluation studies to assess the effect of information rendered by an interventional planning software on the operator's ability to plan transrectal magnetic resonance (MR)-guided prostate biopsies using actuated robotic manipulators. Methods: An intervention planning software was developed based on the clinical workflow followed for MR-guided transrectal prostate biopsies. The software was designed to interface with a generic virtual manipulator and simulate an intervention environment using 2D and 3D scenes. User studies were conducted with urologists using the developed software to plan virtual biopsies. Results: User studies demonstrated that urologists with prior experience in using 3D software completed the planning in less time. 3D scenes were required to control all degrees-of-freedom of the manipulator, while 2D scenes were sufficient for planar motion of the manipulator. Conclusions: The study provides insights on using 2D versus 3D environments from a urologist's perspective for different operational modes of MR-guided prostate biopsy systems.
- 
Abstract Identifying prostate cancer patients that are harboring aggressive forms of prostate cancer remains a significant clinical challenge. Here we develop an approach based on multispectral deep-ultraviolet (UV) microscopy that provides novel quantitative insight into the aggressiveness and grade of this disease, thus providing a new tool to help address this important challenge. We find that UV spectral signatures from endogenous molecules give rise to a phenotypical continuum that provides unique structural insight (i.e., molecular maps or “optical stains”) of thin tissue sections with subcellular (nanoscale) resolution. We show that this phenotypical continuum can also be applied as a surrogate biomarker of prostate cancer malignancy, where patients with the most aggressive tumors show a ubiquitous glandular phenotypical shift. In addition to providing several novel “optical stains” with contrast for disease, we also adapt a two-part Cycle-consistent Generative Adversarial Network to translate the label-free deep-UV images into virtual hematoxylin and eosin (H&E) stained images, thus providing multiple stains (including the gold-standard H&E) from the same unlabeled specimen. Agreement between the virtual H&E images and the H&E-stained tissue sections is evaluated by a panel of pathologists who find that the two modalities are in excellent agreement. This work has significant implications for improving our ability to objectively quantify prostate cancer grade and aggressiveness, thus improving the management and clinical outcomes of prostate cancer patients. This same approach can also be applied broadly in other tumor types to achieve low-cost, stain-free, quantitative histopathological analysis.
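The virtual-staining step described above relies on a cycle-consistency constraint: translating a deep-UV image to H&E and back should reconstruct the original, which is what lets an unpaired image-translation network preserve tissue structure. A heavily simplified sketch of that loss term follows; the real CycleGAN uses trained convolutional generators and an additional adversarial loss, whereas here plain callables stand in for the generators and the function name is hypothetical.

```python
import numpy as np

def cycle_consistency_l1(x: np.ndarray, g_ab, g_ba) -> float:
    """L1 cycle-consistency loss ||G_BA(G_AB(x)) - x||_1, averaged over pixels.

    g_ab maps domain A (e.g. deep-UV) to domain B (e.g. virtual H&E);
    g_ba maps back. A well-trained pair keeps this loss near zero.
    """
    reconstructed = g_ba(g_ab(x))
    return float(np.mean(np.abs(reconstructed - x)))

# Toy invertible "generators" standing in for trained networks:
g_ab = lambda x: 2.0 * x + 1.0          # A -> B
g_ba = lambda y: (y - 1.0) / 2.0        # B -> A (exact inverse, so loss = 0)
```

With the exact-inverse pair above the loss is zero; replacing `g_ba` with the identity leaves the reconstruction off by `x + 1` at every pixel, and the loss reports that mean absolute error.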
- 
Abstract Background: Delta radiomics is a high-throughput computational technique used to describe quantitative changes in serial, time-series imaging by considering the relative change in radiomic features of images extracted at two distinct time points. Recent work has demonstrated a lack of prognostic signal of radiomic features extracted using this technique. We hypothesize that this lack of signal is due to the fundamental assumptions made when extracting features via delta radiomics, and that other methods should be investigated. Purpose: The purpose of this work was to show a proof-of-concept of a new radiomics paradigm for sparse, time-series imaging data, where features are extracted from a spatial-temporal manifold modeling the time evolution between images, and to assess the prognostic value on patients with oropharyngeal cancer (OPC). Methods: To accomplish this, we developed an algorithm to mathematically describe the relationship between two images acquired at two distinct time points. These images serve as boundary conditions of a partial differential equation describing the transition from one image to the other. To solve this equation, we propagate the position and momentum of each voxel according to Fokker–Planck dynamics (i.e., a technique common in statistical mechanics). This transformation is driven by an underlying potential force uniquely determined by the equilibrium image. The solution generates a spatial-temporal manifold (3 spatial dimensions + time) from which we define dynamic radiomic features. First, our approach was numerically verified by stochastically sampling dynamic Gaussian processes of monotonically decreasing noise. The transformation from high to low noise was compared between our Fokker–Planck estimation and simulated ground-truth. To demonstrate feasibility and clinical impact, we applied our approach to 18F-FDG-PET images to estimate early metabolic response of patients (n = 57) undergoing definitive (chemo)radiation for OPC. Images were acquired pre-treatment and 2 weeks intra-treatment (after 20 Gy). Dynamic radiomic features capturing changes in texture and morphology were then extracted. Patients were partitioned into two groups based on similar dynamic radiomic feature expression via k-means clustering and compared by Kaplan–Meier analyses with log-rank tests (p < 0.05). These results were compared to conventional delta radiomics to test the added value of our approach. Results: Numerical results confirmed our technique can recover image noise characteristics given sparse input data as boundary conditions. Our technique was able to model tumor shrinkage and metabolic response. While no delta radiomics features proved prognostic, Kaplan–Meier analyses identified nine significant dynamic radiomic features. The most significant feature was Gray-Level-Size-Zone-Matrix gray-level variance (p = 0.011), which demonstrated prognostic improvement over its corresponding delta radiomic feature (p = 0.722). Conclusions: We developed, verified, and demonstrated the prognostic value of a novel, physics-based radiomics approach over conventional delta radiomics via data assimilation of quantitative imaging and differential equations.
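The group comparison in this abstract rests on Kaplan–Meier survival estimation. The estimator itself is simple: at each event time, the survival probability is multiplied by the fraction of at-risk patients who did not have the event, with censored patients leaving the risk set without contributing an event. The sketch below implements only this estimator (not the authors' pipeline, and without the log-rank test they use for group comparison); the function name is hypothetical.

```python
import numpy as np

def kaplan_meier(time: np.ndarray, event: np.ndarray):
    """Kaplan-Meier survival estimate.

    `time` holds follow-up times; `event` is 1 for an observed event
    (e.g. progression) and 0 for censoring. Returns the event times and
    the estimated survival probability S(t) just after each of them.
    """
    order = np.argsort(time)
    time, event = time[order], event[order]
    n_at_risk = len(time)
    times, surv = [], []
    s = 1.0
    for t in np.unique(time):           # unique times in ascending order
        at_t = time == t
        d = int(event[at_t].sum())      # events observed at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk
            times.append(float(t))
            surv.append(s)
        n_at_risk -= int(at_t.sum())    # events and censored both exit the risk set
    return np.array(times), np.array(surv)
```

For three patients with events at times 1, 2, and 3, the curve steps through 2/3, 1/3, and 0; censoring a patient instead of counting an event leaves the curve flat at that time while still shrinking the risk set.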
 An official website of the United States government