Title: Prostate cancer histopathology using label-free multispectral deep-UV microscopy quantifies phenotypes of tumor aggressiveness and enables multiple diagnostic virtual stains
Abstract: Identifying prostate cancer patients harboring aggressive forms of the disease remains a significant clinical challenge. Here we develop an approach based on multispectral deep-ultraviolet (UV) microscopy that provides novel quantitative insight into the aggressiveness and grade of this disease, providing a new tool to help address this important challenge. We find that UV spectral signatures from endogenous molecules give rise to a phenotypical continuum that provides unique structural insight (i.e., molecular maps or “optical stains”) into thin tissue sections with subcellular (nanoscale) resolution. We show that this phenotypical continuum can also be applied as a surrogate biomarker of prostate cancer malignancy, where patients with the most aggressive tumors show a ubiquitous glandular phenotypical shift. In addition to providing several novel “optical stains” with contrast for disease, we also adapt a two-part Cycle-consistent Generative Adversarial Network (CycleGAN) to translate the label-free deep-UV images into virtual hematoxylin and eosin (H&E) stained images, thus providing multiple stains (including the gold-standard H&E) from the same unlabeled specimen. Agreement between the virtual H&E images and the H&E-stained tissue sections is evaluated by a panel of pathologists, who find that the two modalities are in excellent agreement. This work has significant implications for improving our ability to objectively quantify prostate cancer grade and aggressiveness, thereby improving the management and clinical outcomes of prostate cancer patients. The same approach can also be applied broadly to other tumor types to achieve low-cost, stain-free, quantitative histopathological analysis.
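The cycle-consistency idea behind a two-part CycleGAN can be illustrated with a minimal NumPy sketch. The toy linear generators G (UV to H&E-like) and F (the reverse) below are illustrative placeholders, not the trained convolutional models described in the paper; only the structure of the loss is the point.

```python
import numpy as np

# Toy "generators": G maps UV-channel images toward an H&E-like space,
# F maps back. Real CycleGANs use convolutional networks; these linear
# maps are placeholders to illustrate the cycle-consistency objective.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) * 0.1 + np.eye(3)   # G's mixing matrix (toy)
A_inv = np.linalg.inv(A)                            # F approximates G^{-1}

def G(x):  # label-free UV stack -> virtual H&E (toy)
    return x @ A

def F(y):  # virtual H&E -> UV stack (toy)
    return y @ A_inv

def cycle_consistency_loss(x, y):
    """L1 cycle loss: F(G(x)) should recover x, and G(F(y)) should recover y."""
    return np.abs(F(G(x)) - x).mean() + np.abs(G(F(y)) - y).mean()

x = rng.random((64, 64, 3))   # synthetic multispectral UV image (H, W, channels)
y = rng.random((64, 64, 3))   # synthetic H&E image
print(f"cycle loss: {cycle_consistency_loss(x, y):.2e}")
```

Because the toy F is G's exact inverse, the loss is near zero; in training, minimizing this term (alongside adversarial losses) is what lets the network learn the translation without pixel-paired UV/H&E images.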
Award ID(s):
1752011
PAR ID:
10412745
Author(s) / Creator(s):
Date Published:
Journal Name:
Scientific Reports
Volume:
12
Issue:
1
ISSN:
2045-2322
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Objective and Impact Statement. Identifying benign mimics of prostatic adenocarcinoma remains a significant diagnostic challenge. In this work, we developed an approach based on label-free, high-resolution molecular imaging with multispectral deep ultraviolet (UV) microscopy that identifies important prostate tissue components, including basal cells. This work has significant implications towards improving the pathologic assessment and diagnosis of prostate cancer. Introduction. One of the most important indicators of prostate cancer is the absence of basal cells in glands and ducts. However, identifying basal cells using hematoxylin and eosin (H&E) stains, the standard of care, can be difficult in a subset of cases. In such situations, pathologists often resort to immunohistochemical (IHC) stains for a definitive diagnosis. However, IHC is expensive and time-consuming and requires additional tissue sections, which may not be available. In addition, IHC is subject to false-negative or false-positive staining, which can lead to an incorrect diagnosis. Methods. We leverage the rich molecular information of label-free multispectral deep-UV microscopy to uniquely identify basal cells, luminal cells, and inflammatory cells. The method applies an unsupervised geometrical representation of principal component analysis to separate the various components of prostate tissue, producing multiple image representations of the molecular information. Results. Our results show that this method accurately and efficiently identifies benign and malignant glands with high fidelity, free of any staining procedures, based on the presence or absence of basal cells. We further use the molecular information to directly generate a high-resolution virtual IHC stain that clearly identifies basal cells, even in cases where IHC stains fail. Conclusion. Our simple, low-cost, label-free deep-UV method has the potential to improve and facilitate prostate cancer diagnosis by enabling robust identification of basal cells and other important prostate tissue components.
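The PCA step described under Methods — treating each pixel of the multispectral deep-UV stack as a spectrum and viewing the leading component scores as "molecular maps" — can be sketched in a few lines. The synthetic stack, band count, and number of components below are assumptions for illustration, not the study's actual data or parameters.

```python
import numpy as np

# Sketch of the unsupervised idea: treat each pixel of a multispectral
# deep-UV stack as a short spectrum, run PCA across all pixels, and view
# the per-pixel scores of the leading components as "molecular maps".
rng = np.random.default_rng(1)
H, W, n_bands = 32, 32, 4           # e.g. four deep-UV wavelengths (assumed)
stack = rng.random((H, W, n_bands))  # synthetic stand-in for tissue data

X = stack.reshape(-1, n_bands)       # pixels x spectral bands
Xc = X - X.mean(axis=0)              # center each band
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                   # principal-component scores per pixel

n_maps = 3
maps = scores[:, :n_maps].reshape(H, W, n_maps)  # component-score images
explained = (S**2) / (S**2).sum()    # fraction of variance per component
print("variance explained by first 3 PCs:", explained[:3].round(3))
```

Each channel of `maps` is one candidate "image representation" of the molecular information; the geometrical separation of tissue components described in the abstract would operate on these scores.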
  2. Pathology is practiced by visual inspection of histochemically stained tissue slides. While the hematoxylin and eosin (H&E) stain is most commonly used, special stains can provide additional contrast for different tissue components. Here, we demonstrate the utility of supervised learning-based computational stain transformation from H&E to special stains (Masson’s Trichrome, periodic acid-Schiff and Jones silver stain) using kidney needle core biopsy tissue sections. Based on the evaluation by three renal pathologists, followed by adjudication by a fourth pathologist, we show that the generation of virtual special stains from existing H&E images improves the diagnosis of several non-neoplastic kidney diseases, sampled from 58 unique subjects (P = 0.0095). A second study found that the quality of the computationally generated special stains was statistically equivalent to that of their histochemically stained counterparts. This stain-to-stain transformation framework can improve preliminary diagnoses when additional special stains are needed, while also providing significant savings in time and cost.
  3. We present a deep learning-based framework to virtually transfer brightfield images of H&E-stained tissue slides to other types of stains using cascaded networks, providing high-quality images of special stains from existing H&E stained tissue images. 
  4. The ability to accurately define tumor margins could enhance tissue sparing and increase efficiency in the dermatologic surgery process, but no device currently serves this role. Reflectance confocal microscopy (RCM) provides non-invasive cellular resolution of the skin. The only clinically approved RCM device is bulky, non-portable, and requires a tissue cap that makes mapping of the underlying tissue impossible. We recently combined “virtual histology,” a machine learning algorithm, with images from this standard RCM device to generate biopsy-free histology and overcome these limitations. Whether virtual histology can be used with a portable, handheld RCM device to scan for residual tumor and tumor margins is currently unknown. We hypothesized that combining a handheld RCM device with virtual histology could provide accurate tumor margin assessment. We determined whether our established virtual histology algorithm could be applied to images from a portable RCM device and whether these pseudo-stained virtual histology images correlated with histology from skin specimens. The study was conducted as a prospective, consecutive, non-randomized trial at a Veterans Affairs Medical Center dermatologic surgery clinic. All patients older than 18 years of age with previously biopsied basal cell carcinoma (BCC), squamous cell carcinoma (SCC), or squamous cell carcinoma in situ (SCCis) were included. Successive confocal images from the epidermis to the dermis were obtained 1.5 microns apart with the handheld RCM device to detect residual skin cancer. The handheld, in-vivo RCM images were processed through a conditional generative adversarial network-based machine learning algorithm to digitally convert them into H&E pseudo-stained virtual histology images. Virtual histology of in-vivo RCM images from unbiopsied skin captured with the portable RCM device was similar to that obtained with the standard RCM device, and virtual histology applied to portable RCM images correctly correlated with frozen section histology.
Residual tumors detected with virtual histology generated from the portable RCM images accurately corresponded with residual tumors shown in the frozen surgical tissue specimens. Residual tumor was likewise not detected when the excised tissue was clear of tumor following the surgical procedure. Thus, the combination of virtual histology with portable RCM may provide accurate histology-quality data for evaluating residual skin cancer prior to surgery. Combining machine learning-based virtual histology with handheld RCM images shows promise in providing insight into tumor characteristics and has the potential to assist the surgeon and better guide practice decisions, leading to fewer excised layers and shorter appointment times. Future work is needed to provide real-time virtual histology, convert horizontal/confocal sections into vertical or 3D sections, and perform clinical studies to map tumors in tissue.
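Because the RCM frames and frozen-section histology are paired, this kind of translation is typically trained with a conditional-GAN objective in the style of pix2pix: an adversarial term plus an L1 term against the registered ground truth. The sketch below illustrates only the generator's loss; the toy generator, the toy discriminator, and the lambda_l1 weight are assumptions for illustration, not the study's actual models or hyperparameters.

```python
import numpy as np

# Sketch of a pix2pix-style generator objective for paired image-to-image
# translation (RCM frame -> H&E-like image). All functions are toys.
rng = np.random.default_rng(2)

def generator(rcm):
    # Toy identity-like mapping; a real generator is a conditional CNN.
    return np.clip(rcm, 0.0, 1.0)

def discriminator(rcm, img):
    # Toy per-pixel "realness" score; a real discriminator sees (rcm, img)
    # pairs. This sigmoid placeholder ignores rcm.
    return 1.0 / (1.0 + np.exp(-(img - 0.5)))

def cgan_generator_loss(rcm, target, lambda_l1=100.0):
    fake = generator(rcm)
    d_fake = discriminator(rcm, fake)
    adv = -np.log(d_fake + 1e-8).mean()   # push discriminator toward "real"
    l1 = np.abs(fake - target).mean()     # stay close to the paired target
    return adv + lambda_l1 * l1           # lambda_l1 = 100 is common practice

rcm = rng.random((64, 64))     # synthetic grayscale confocal frame
target = rcm.copy()            # pretend registered H&E-like target
print(f"generator loss: {cgan_generator_loss(rcm, target):.4f}")
```

The L1 term is what a paired setting buys over the unpaired CycleGAN formulation: the network is penalized directly against the registered histology rather than only through cycle consistency.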
  5. Histological staining is a vital step in diagnosing various diseases and has been used for more than a century to provide contrast in tissue sections, rendering the tissue constituents visible for microscopic analysis by medical experts. However, this process is time-consuming, labour-intensive, expensive and destructive to the specimen. Recently, the ability to virtually stain unlabelled tissue sections, entirely avoiding the histochemical staining step, has been demonstrated using tissue-stain-specific deep neural networks. Here, we present a new deep-learning-based framework that generates virtually stained images from label-free tissue images, in which different stains are merged following a micro-structure map defined by the user. This approach uses a single deep neural network that receives two different sources of information as its input: (1) autofluorescence images of the label-free tissue sample and (2) a “digital staining matrix”, which represents the desired microscopic map of the different stains to be virtually generated in the same tissue section. This digital staining matrix is also used to virtually blend existing stains, digitally synthesizing new histological stains. We trained and blindly tested this virtual-staining network using unlabelled kidney tissue sections to generate micro-structured combinations of haematoxylin and eosin (H&E), Jones’ silver stain, and Masson’s trichrome stain. Using a single network, this approach multiplexes the virtual staining of label-free tissue images with multiple types of stains and paves the way for synthesizing new digital histological stains that can be created in the same tissue cross-section, which is currently not feasible with standard histochemical staining methods.
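The two-input conditioning described above can be sketched concretely: the user-drawn per-pixel stain map is one-hot encoded and stacked with the autofluorescence channels to form the network input. The image size, single autofluorescence channel, and three stain classes below are assumptions for illustration.

```python
import numpy as np

# Sketch of "digital staining matrix" conditioning: one-hot encode a
# user-defined per-pixel stain map and stack it with the autofluorescence
# image as input channels for a single virtual-staining network.
H, W = 128, 128
autofluor = np.random.default_rng(3).random((H, W, 1))  # label-free input

STAINS = ["H&E", "Jones", "Masson"]        # three stain classes (assumed)
stain_map = np.zeros((H, W), dtype=int)    # user-drawn region map
stain_map[:, W // 3: 2 * W // 3] = 1       # middle third -> Jones
stain_map[:, 2 * W // 3:] = 2              # right third  -> Masson

one_hot = np.eye(len(STAINS))[stain_map]   # (H, W, 3) digital staining matrix
net_input = np.concatenate([autofluor, one_hot], axis=-1)
print(net_input.shape)                     # image + per-pixel stain condition
```

Blending stains, as the abstract describes, amounts to putting fractional rather than one-hot weights in the staining matrix, so a single trained network can interpolate between stain appearances within one tissue section.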