Micro-CT, also known as X-ray micro-computed tomography, has emerged as a primary instrument for studying pore-scale properties of geological materials. Several studies have used deep learning to achieve super-resolution reconstruction, balancing the trade-off between the resolution of CT images and the field of view. Nevertheless, most existing methods work only with single-scale CT scans, ignoring the possibility of using multi-scale image features for reconstruction. In this study, we propose a super-resolution approach for rock micro-CT image reconstruction via multi-scale fusion using a residual U-Net (MS-ResUnet). The residual U-Net provides an encoder-decoder structure. Each encoder layer uses several residual sequential blocks and improved residual blocks. The decoder is composed of convolutional ReLU residual blocks and residual chained pooling blocks. During the encoding-decoding process, information transferred between neighboring multi-resolution images is fused, yielding richer rock-characteristic information. Qualitative and quantitative comparisons on sandstone, carbonate, and coal CT images demonstrate that the proposed algorithm surpasses existing approaches: our model accurately reconstructs the intricate pore details of carbonate and sandstone, as well as clearly visible coal cracks.
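The residual blocks at the heart of architectures like the one described above all share the same skip-connection idea: the block learns only a correction that is added back onto its input. A minimal NumPy sketch (purely illustrative, not the authors' implementation; the input array and identity kernel here are hypothetical):

```python
import numpy as np

def conv2d_same(x, kernel):
    """Naive 'same'-padded 2D convolution for a single channel."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def residual_block(x, kernel):
    """y = ReLU(conv(x)) + x: the skip connection means the convolution
    only has to learn the residual correction to its input."""
    return np.maximum(conv2d_same(x, kernel), 0.0) + x

x = np.arange(16, dtype=float).reshape(4, 4)
identity = np.zeros((3, 3))
identity[1, 1] = 1.0  # identity kernel: conv(x) == x
y = residual_block(x, identity)
# With an identity kernel and non-negative input, the block returns 2 * x.
```

In a real network the kernel weights are learned, and many such blocks are stacked inside each encoder and decoder stage.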
Replication Data for: Adapting super-resolution reconstruction for skeletal analysis of clinical computed tomography data
Abstract: 3D model files supporting the findings of "Adapting super-resolution reconstruction for skeletal analysis of clinical computed tomography data." This study evaluates the utility of an automated super-resolution reconstruction (SRR) framework in generating high-resolution skeletal models from multiple orthogonal thick-slice CT stacks. Archived CT scans of long bones from 33 individuals (aged 0 to 16 years) were retrospectively collected from the National Taiwan University Hospital. Two sets of models were generated per bone: one derived from the original low-resolution, thick-slice CT image stacks ("pre"), and one from the reconstructed high-resolution volumes produced using the NiftyMIC pipeline ("post").
- Award ID(s):
- 1945797
- PAR ID:
- 10655375
- Publisher / Repository:
- George Mason University Dataverse
- Date Published:
- Edition / Version:
- 1.0
- Subject(s) / Keyword(s):
- Medicine, Health and Life Sciences; Social Sciences; computed tomography; long bones; 3D models; non-adult
- Format(s):
- Medium: X. STL 3D model files (application/vnd.ms-pki.stl), plus one tab-separated-values file and one plain-text file
- Size(s):
- Individual files range from 531 bytes to 143,870,344 bytes
- Institution:
- George Mason University
- Sponsoring Org:
- National Science Foundation
More Like this
-
Diffusible iodine-based contrast-enhanced computed tomography (diceCT) has emerged as a viable tool for discriminating soft tissues in serial CT slices, which can then be used for three-dimensional analysis. This technique has some potential to supplant histology as a tool for identification of body tissues. Here, we studied the head of an adult fruit bat (Cynopterus sphinx) and a late fetal vampire bat (Desmodus rotundus) using diceCT and µCT. Subsequently, we decalcified, serially sectioned and stained the same heads. The two CT volumes were rotated so that the sectional plane of the slice series closely matched that of the histological sections, yielding the ideal opportunity to relate CT observations to corresponding histology. Olfactory epithelium (OE) is typically thicker, on average, than respiratory epithelium in both bats. Thus, one investigator (SK), blind to the histological sections, examined the diceCT slice series for both bats and annotated changes in the thickness of epithelium on the first ethmoturbinal (ET I), the roof of the nasal fossa, and the nasal septum. A second trial was conducted with an added criterion: radioopacity of the lamina propria as an indicator of Bowman's glands. Then, a second investigator (TS) annotated images of matching histological sections based on microscopic observation of epithelial type, and transferred these annotations to matching CT slices. Measurements of slices annotated according to changes in epithelial thickness alone closely track measurements of slices based on histologically informed annotations; matching histological sections confirm that blind annotations were effective based on epithelial thickness alone, except for a patch of unusually thick non-OE that was mistaken for OE in one of the specimens. When characteristics of the lamina propria were added in the second trial, the blind annotations excluded the thick non-OE.
Moreover, in the fetal bat the use of evidence for Bowman's glands improved detection of olfactory mucosa, perhaps because the epithelium itself was thin enough at its margins to escape detection. We conclude that diceCT can, by itself, be highly effective in identifying the distribution of OE, especially where observations are confirmed by histology from at least one specimen of the species. Our findings also establish that iodine staining, followed by stain removal, does not interfere with subsequent histological staining of the same specimen.
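The blind-annotation criterion above, classifying epithelium as olfactory when it is thicker than respiratory epithelium, can be sketched as a simple threshold rule. The measurements and threshold below are hypothetical values chosen for illustration, not data from the study:

```python
import numpy as np

# Hypothetical epithelial thickness measurements (micrometers) sampled
# along a turbinate profile; values and threshold are illustrative only.
thickness_um = np.array([35.0, 80.0, 95.0, 40.0, 110.0, 30.0])
threshold_um = 60.0  # OE is typically thicker than respiratory epithelium

is_olfactory = thickness_um > threshold_um
# Segments exceeding the threshold would be blindly annotated as OE;
# the second-trial criterion (lamina propria radioopacity) would then
# be used to reject unusually thick non-OE patches.
```

The study's second criterion plays the role of a second feature that disambiguates the cases where thickness alone misclassifies.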
-
Deep learning (DL) has been increasingly explored in low-dose CT image denoising, and DL products have been submitted to the FDA for premarket clearance. While DL approaches have the potential to improve image quality over the filtered back projection (FBP) method and produce images quickly, their generalizability is a major concern because the performance of a DL network can depend highly on the training data. In this work we take a residual encoder-decoder convolutional neural network (REDCNN)-based CT denoising method as an example. We investigate the effect of the scan parameters associated with the training data on the performance of this DL-based CT denoising method and identify the scan parameters that may significantly impact its performance generalizability. This abstract examines three such parameters: reconstruction kernel, dose level, and slice thickness. Our preliminary results indicate that the DL network may not generalize well between FBP reconstruction kernels, but is insensitive to slice thickness for slice-wise denoising. The results also suggest that training with mixed dose levels improves denoising performance.
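Comparing denoising performance across conditions such as dose levels is typically scored with a fidelity metric like peak signal-to-noise ratio (PSNR). A minimal sketch of that metric on synthetic images (the images and noise levels here are invented for illustration, not the study's data):

```python
import numpy as np

def psnr(reference, test, data_range=1.0):
    """Peak signal-to-noise ratio in dB; higher means closer to reference."""
    mse = np.mean((reference - test) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(data_range ** 2 / mse)

rng = np.random.default_rng(0)
clean = rng.random((64, 64))                              # stand-in "ground truth"
low_noise = clean + rng.normal(0, 0.01, clean.shape)      # high-dose analogue
high_noise = clean + rng.normal(0, 0.10, clean.shape)     # low-dose analogue
# More noise yields a lower PSNR, which is how denoising quality
# would be scored separately at each dose level before and after DL denoising.
```

In a generalization study, the same metric is computed for a network trained on one condition and tested on another, and the gap quantifies the sensitivity to that scan parameter.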
-
Confocal microscopy is a standard approach for obtaining volumetric images of a sample with high axial and lateral resolution, especially when dealing with scattering samples. Unfortunately, a confocal microscope is quite expensive compared to traditional microscopes. In addition, the point scanning in confocal microscopy leads to slow imaging speed and photobleaching due to the high dose of laser energy. In this paper, we demonstrate how advances in machine learning can be exploited to teach a traditional wide-field microscope, one that is available in every lab, to produce 3D volumetric images like a confocal microscope. The key idea is to obtain multiple images with different focus settings using a wide-field microscope and use a 3D generative adversarial network (GAN)-based neural network to learn the mapping between the blurry, low-contrast image stacks obtained with a wide-field microscope and the sharp, high-contrast image stacks obtained with a confocal microscope. After training on wide-field–confocal stack pairs, the network can reliably and accurately reconstruct 3D volumetric images that rival confocal images in terms of lateral resolution, z-sectioning, and image contrast. Our experimental results demonstrate the ability to generalize to unseen data, stability in the reconstruction results, and high spatial resolution even when imaging thick (∼40 micron), highly scattering samples. We believe that such learning-based microscopes have the potential to bring confocal imaging quality to every lab that has a wide-field microscope.
-
Summary
Plant cuticles protect the interior tissues from ambient hazards, including desiccation, UV light, physical wear, herbivores and pathogens. Consequently, cuticle properties are shaped by evolutionary selection.
We compiled a global dataset of leaf cuticle thickness (CT) and accompanying leaf traits for 1212 species, mostly angiosperms, from 293 sites representing all vegetated continents. We developed and tested 11 hypotheses concerning ecological drivers of interspecific variation in CT.
CT showed clear patterning according to latitude, biome, taxonomic family, site climate and other leaf traits. Species with thick leaves and/or high leaf mass per area tended to have thicker cuticles, as did evergreen relative to deciduous woody species, and species from sites that during the growing season were warmer, had fewer frost days and lower wind speeds, and occurred at lower latitudes. CT–environment relationships were notably stronger among nonwoody than woody species.
Heavy investment in cuticle may be disadvantaged at sites with high winds and frequent frosts for 'economic' or biomechanical reasons, or because of reduced herbivore pressure. Alternatively, cuticles may become more heavily abraded under such conditions. Robust quantification of CT–trait–environment relationships provides new insights into the multiple roles of cuticles, with additional potential use in paleo-ecological reconstruction.
