Abstract Purpose: Forward and backprojections are the basis of all model-based iterative reconstruction (MBIR) methods. However, computing them accurately is time-consuming. In this paper, we present a method for MBIR in parallel X-ray beam geometry that uses a Gram filter to implement forward and backprojection efficiently. Methods: We propose using a voxel basis and modeling its footprint in a box-spline framework to calculate the Gram filter exactly and improve the performance of backprojection. In the special case of parallel X-ray beam geometry, the forward and backprojection can be implemented efficiently by an estimated Gram filter if the sinogram signal is bandlimited. In this paper, a specialized sinogram interpolation method is proposed to eliminate the bandlimited prerequisite and thus improve reconstruction accuracy. We build on this idea by exploiting the continuity of the voxel basis' footprint, which provides a more accurate sinogram interpolation and further improves the efficiency and quality of backprojection. In addition, the detector blur effect can be accounted for efficiently in our method to better handle realistic scenarios. Results: The proposed method is tested on both phantom and real computed tomography (CT) images under different resolutions, sinogram sampling steps, and noise levels. It consistently outperforms other state-of-the-art projection models in terms of speed and accuracy for both backprojection and reconstruction. Conclusions: We propose an iterative reconstruction methodology for 3D parallel-beam X-ray CT reconstruction. Our experimental results demonstrate that the proposed methodology is accurate, fast, and reproducible, and significantly outperforms alternative state-of-the-art projection models in both backprojection and reconstruction.
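As a rough illustration of the Gram-filter idea (not the paper's exact box-spline implementation), in parallel-beam geometry the operator A^T A acts approximately as a shift-invariant convolution, so after precomputing A^T b once, each MBIR gradient step reduces to one FFT-based filtering instead of a full forward/backprojection pair. The function name, kernel, and step size below are hypothetical:

```python
import numpy as np

# Hypothetical sketch: in parallel-beam geometry A^T A acts (approximately)
# as a shift-invariant convolution, so a gradient step of the MBIR data term
# 0.5 * ||Ax - b||^2 needs only one FFT-based filtering per iteration after
# A^T b has been precomputed once.

def gram_gradient_step(x, gram_kernel_fft, atb, step=1e-3):
    """One gradient step using the Gram filter.

    x               : current image estimate (2D array)
    gram_kernel_fft : precomputed FFT of the Gram filter kernel (A^T A)
    atb             : precomputed backprojection A^T b
    """
    ata_x = np.real(np.fft.ifft2(np.fft.fft2(x) * gram_kernel_fft))
    grad = ata_x - atb                  # gradient of the data-fidelity term
    return x - step * grad

# Toy check: a delta kernel makes A^T A the identity operator.
n = 8
kernel = np.zeros((n, n)); kernel[0, 0] = 1.0
kfft = np.fft.fft2(kernel)
x1 = gram_gradient_step(np.zeros((n, n)), kfft, np.ones((n, n)), step=0.5)
```

The payoff is that the per-iteration cost becomes that of two FFTs rather than ray tracing through the whole volume.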
Addressing CT metal artifacts using photon‐counting detectors and one‐step spectral CT image reconstruction
Abstract Purpose: The constrained one-step spectral CT image reconstruction (cOSSCIR) algorithm with a nonconvex alternating direction method of multipliers optimizer is proposed for addressing computed tomography (CT) metal artifacts caused by beam hardening, noise, and photon starvation. The quantitative performance of cOSSCIR is investigated through a series of photon-counting CT simulations. Methods: cOSSCIR directly estimates basis material maps from photon-counting data using a physics-based forward model that accounts for beam hardening. The cOSSCIR optimization framework places constraints on the basis maps, which we hypothesize will stabilize the decomposition and reduce streaks caused by noise and photon starvation. Another advantage of cOSSCIR is that the spectral data need not be registered, so that a ray can be used even if some energy window measurements are unavailable. Photon-counting CT acquisitions of a virtual pelvic phantom with low-contrast soft tissue texture and bilateral hip prostheses were simulated. Bone and water basis maps were estimated using the cOSSCIR algorithm and combined to form a virtual monoenergetic image for the evaluation of metal artifacts. The cOSSCIR images were compared to a “two-step” decomposition approach that first estimated basis sinograms using a maximum likelihood algorithm and then reconstructed basis maps using an iterative total variation constrained least-squares optimization (MLE+TV). Images were also compared to a nonspectral TV reconstruction of the total number of counts detected for each ray, with and without normalized metal artifact reduction (NMAR) applied. The simulated metal density was increased to investigate the effects of increasing photon starvation. The quantitative error and standard deviation in regions of the phantom were compared across the investigated algorithms. The ability of cOSSCIR to reproduce the soft-tissue texture while reducing metal artifacts was quantitatively evaluated.
Results: Noiseless simulations demonstrated the convergence of the cOSSCIR and MLE+TV algorithms to the correct basis maps in the presence of beam-hardening effects. When noise was simulated, cOSSCIR demonstrated a quantitative error of −1 HU, compared to 2 HU error for the MLE+TV algorithm and −154 HU error for the nonspectral TV+NMAR algorithm. For the cOSSCIR algorithm, the standard deviation in the central iodine region of interest was 20 HU, compared to 299 HU for the MLE+TV algorithm, 41 HU for the MLE+TV+Mask algorithm that excluded rays through metal, and 55 HU for the nonspectral TV+NMAR algorithm. Increasing levels of photon starvation did not impact the bias or standard deviation of the cOSSCIR images. cOSSCIR was able to reproduce the soft-tissue texture when an appropriate regularization constraint value was selected. Conclusions: By directly inverting photon-counting CT data into basis maps using an accurate physics-based forward model and a constrained optimization algorithm, cOSSCIR avoids metal artifacts due to beam hardening, noise, and photon starvation. The cOSSCIR algorithm demonstrated improved stability and accuracy compared to a two-step method of decomposition followed by reconstruction.
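To make the "physics-based forward model that accounts for beam hardening" concrete, below is a minimal sketch of the polychromatic counts model that one-step methods invert: expected counts in an energy window are the spectrum-weighted sum of Beer-Lambert attenuations over the basis-material line integrals. All spectra and attenuation coefficients here are made-up placeholders, not calibrated detector values:

```python
import numpy as np

# Illustrative (not cOSSCIR itself): expected counts in one energy window
# for basis-material line integrals (a_bone, a_water):
#   lam = sum_E S(E) * exp(-mu_bone(E)*a_bone - mu_water(E)*a_water)
# Because the effective attenuation is nonlinear in path length, this model
# captures beam hardening explicitly.

def expected_counts(a_bone, a_water, spectrum, mu_bone, mu_water):
    atten = np.exp(-mu_bone * a_bone - mu_water * a_water)  # Beer-Lambert per energy
    return float(np.sum(spectrum * atten))                  # sum over energy bins

# Toy three-bin spectrum at roughly 40/70/100 keV (placeholder numbers):
spectrum = np.array([1e4, 2e4, 1e4])     # photons per bin
mu_bone  = np.array([1.5, 0.6, 0.4])     # 1/cm
mu_water = np.array([0.27, 0.19, 0.17])  # 1/cm

lam = expected_counts(a_bone=1.0, a_water=10.0, spectrum=spectrum,
                      mu_bone=mu_bone, mu_water=mu_water)
```

A one-step method fits (a_bone, a_water) per ray directly to measured counts under this model, rather than decomposing sinograms first.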
- PAR ID:
- 10388218
- Publisher / Repository:
- Wiley Blackwell (John Wiley & Sons)
- Date Published:
- Journal Name:
- Medical Physics
- Volume:
- 49
- Issue:
- 5
- ISSN:
- 0094-2405
- Format(s):
- Medium: X
- Size(s):
- p. 3021-3040
- Sponsoring Org:
- National Science Foundation
More Like this
-
We develop a framework for reconstructing images that are sparse in an appropriate transform domain from polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and incident-energy spectrum are unknown. Assuming that the object that we wish to reconstruct consists of a single material, we obtain a parsimonious measurement-model parameterization by changing the integral variable from photon energy to mass attenuation, which allows us to combine the variations brought by the unknown incident spectrum and mass attenuation into a single unknown mass-attenuation spectrum function; the resulting measurement equation has the Laplace-integral form. The mass-attenuation spectrum is then expanded into basis functions using B splines of order one. We consider a Poisson noise model and establish conditions for biconvexity of the corresponding negative log-likelihood (NLL) function with respect to the density-map and mass-attenuation spectrum parameters. We derive a block-coordinate descent algorithm for constrained minimization of a penalized NLL objective function, where penalty terms ensure nonnegativity of the mass-attenuation spline coefficients and nonnegativity and gradient-map sparsity of the density-map image, imposed using a convex total-variation (TV) norm; the resulting objective function is biconvex. This algorithm alternates between a Nesterov proximal-gradient (NPG) step and a limited-memory Broyden-Fletcher-Goldfarb-Shanno with box constraints (L-BFGS-B) iteration for updating the image and mass-attenuation spectrum parameters, respectively. We prove the Kurdyka-Łojasiewicz property of the objective function, which is important for establishing local convergence of block-coordinate descent schemes in biconvex optimization problems. Our framework applies to other NLLs and signal-sparsity penalties, such as the lognormal NLL and the ℓ₁ norm of 2D discrete wavelet transform (DWT) image coefficients.
Numerical experiments with simulated and real X-ray CT data demonstrate the performance of the proposed scheme.
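A minimal sketch of the Laplace-integral measurement model described above, with the unknown mass-attenuation spectrum expanded in order-1 B-splines (hat functions): expected counts for a ray are the integral of the spectrum against a decaying exponential of the density line integral. Knot locations and coefficients are illustrative assumptions, not fitted values:

```python
import numpy as np

def hat(kappa, center, width):
    """Order-1 B-spline (triangular/hat) basis function."""
    return np.maximum(0.0, 1.0 - np.abs(kappa - center) / width)

def measurement(s, coeffs, knots):
    """Expected counts for density line integral s under the Laplace-form
    model: integral of iota(kappa) * exp(-kappa * s) over kappa."""
    width = knots[1] - knots[0]                       # uniform knot spacing
    kappa = np.linspace(knots[0] - width, knots[-1] + width, 500)
    iota = sum(c * hat(kappa, k, width) for c, k in zip(coeffs, knots))
    dk = kappa[1] - kappa[0]
    return float(np.sum(iota * np.exp(-kappa * s)) * dk)  # Riemann sum

knots = np.linspace(0.1, 1.0, 10)      # mass-attenuation grid (toy values)
coeffs = np.ones(10)                   # nonnegative spline coefficients (toy)
y0 = measurement(0.0, coeffs, knots)   # no attenuating material in the ray
y1 = measurement(2.0, coeffs, knots)   # thicker object -> fewer counts
```

The block-coordinate algorithm in the abstract alternates between updating the density map (NPG step) and these nonnegative spline coefficients (L-BFGS-B step).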
-
Features such as particles, pores, or cracks are challenging to measure accurately in CT data when they are small relative to the data resolution, characterized by a point-spread function (PSF). These challenges are particularly acute when paired with segmentation, as the PSF distributes some of the signal from a voxel among neighboring ones, effectively dispersing some of the signal from a given object to a region outside of it. Any feature of interest with one or more dimensions on the order of the PSF will be impacted by this effect, and measurements based on global thresholds necessarily fail. Measurements of the same features should be consistent across different instruments and data resolutions. The PVB (partial volume and blurring) method successfully compensates by quantifying features that are small in all three dimensions based on their attenuation anomaly. By calibrating the CT number of the phase of interest (in this case, gold), it is possible to accurately measure particles down to <6 voxels in data acquired on two instruments, 14 years apart, despite severe artifacts. Altogether, the PVB method is accurate, reproducible, resolution-invariant, and objective; it is also notable for its favorable error structure. The principal challenge is the need for representative effective CT numbers, which reflect not only the features of interest themselves, but also the X-ray spectrum; the size, shape, and composition of the enclosing sample; and processing details such as beam-hardening correction. Empirical calibration is the most effective approach.
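The attenuation-anomaly idea behind the PVB method can be sketched as follows: blurring by the PSF redistributes signal among neighboring voxels but conserves its total, so summing the excess CT number over a region and normalizing by the calibrated phase-background contrast recovers the feature volume. The CT numbers and blur weights below are illustrative, not instrument calibrations:

```python
import numpy as np

# Sketch of the attenuation-anomaly principle (toy numbers, not the paper's
# calibration): the PSF disperses a small particle's signal, but the summed
# excess over a region still equals (volume) * (phase - background contrast).

def pvb_volume_voxels(region, ct_background, ct_phase):
    """Particle volume in voxels from its attenuation anomaly."""
    anomaly = region - ct_background                  # excess signal per voxel
    return float(np.sum(anomaly) / (ct_phase - ct_background))

# A 3-voxel "gold" particle blurred across 9 voxels still sums to ~3 voxels:
bg, gold = 1000.0, 30000.0
region = np.full(27, bg)                              # 3x3x3 region, flattened
blur = np.array([0.1, 0.2, 0.7, 0.9, 0.6, 0.3, 0.15, 0.04, 0.01])  # sums to 3
region[:9] += (gold - bg) * blur
v = pvb_volume_voxels(region, bg, gold)               # ~3.0 despite blurring
```

Note that a global threshold on this blurred region would find at most two voxels above the 50% level, illustrating why threshold-based measurement fails for features near the PSF scale.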
-
Abstract The goal of this study is to develop a new computed tomography (CT) image reconstruction method, aiming at improving the quality of the reconstructed images of existing methods while reducing computational costs. Existing CT reconstruction is modeled by pixel-based piecewise-constant approximations of the integral equation that describes the CT projection data acquisition process. Using these approximations imposes a bottleneck model error and results in a discrete system of a large size. We propose to develop a content-adaptive unstructured grid (CAUG) based regularized CT reconstruction method to address these issues. Specifically, we design a CAUG of the image domain to sparsely represent the underlying image, and introduce a CAUG-based piecewise-linear approximation of the integral equation by employing a collocation method. We further apply a regularization defined on the CAUG for the resulting ill-posed linear system, which may lead to a sparse linear representation for the underlying solution. The regularized CT reconstruction is formulated as a convex optimization problem, whose objective function consists of a weighted least-squares fidelity term, a regularization term, and a constraint term. Here, the corresponding weight matrix is derived from the simultaneous algebraic reconstruction technique (SART). We then develop a SART-type preconditioned fixed-point proximity algorithm to solve the optimization problem. Convergence analysis is provided for the resulting iterative algorithm. Numerical experiments demonstrate the superiority of the proposed method over several existing methods in terms of both suppressing noise and reducing computational costs. These methods include SART without regularization and with quadratic regularization, the traditional total variation (TV) regularized reconstruction method, and the TV-superiorized conjugate gradient method on the pixel grid.
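For context, here is a minimal sketch of the classical SART update from which the abstract's preconditioner is derived: x ← x + λ V A^T W (b − A x), with W the inverse row sums and V the inverse column sums of the system matrix. The matrix and data below are a toy consistent system, not a CT geometry:

```python
import numpy as np

# Toy sketch of the classical SART iteration (the update the abstract's
# SART-type preconditioner is built from). A, b, and the iteration count
# are illustrative, not a real CT system.

def sart(A, b, n_iter=200, lam=1.0):
    m, n = A.shape
    row_sums, col_sums = A.sum(axis=1), A.sum(axis=0)
    W = np.where(row_sums > 0, 1.0 / row_sums, 0.0)   # inverse row sums
    V = np.where(col_sums > 0, 1.0 / col_sums, 0.0)   # inverse column sums
    x = np.zeros(n)
    for _ in range(n_iter):
        residual = b - A @ x
        x = x + lam * V * (A.T @ (W * residual))      # SART update
    return x

A = np.array([[1.0, 1.0], [1.0, 2.0], [0.0, 1.0]])
b = A @ np.array([2.0, 3.0])          # consistent data from x_true = (2, 3)
x = sart(A, b)                        # converges toward x_true
```

In the paper's setting the diagonal weights V and W play the role of a preconditioner inside a fixed-point proximity scheme that also handles the regularization and constraint terms.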
-
Abstract Background: Dual-energy CT (DECT) systems provide valuable material-specific information by simultaneously acquiring two spectral measurements, resulting in superior image quality and contrast-to-noise ratio (CNR) while reducing radiation exposure and contrast agent usage. The selection of DECT scan parameters, including x-ray tube settings and fluence, is critical for the stability of the reconstruction process and hence the overall image quality. Purpose: The goal of this study is to propose a systematic theoretical method for determining the optimal DECT parameters for minimal noise and maximal CNR in virtual monochromatic images (VMIs) for fixed subject size and total radiation dose. Methods: The noise propagation in the process of projection-based material estimation from DECT measurements is analyzed. The main components of the study are the mean pixel variances for the sinogram and monochromatic image and the CNR, which were shown to depend on the Jacobian matrix of the sinograms-to-DECT-measurements map. Analytic estimates for the mean sinogram and monochromatic image pixel variances and the CNR as functions of tube potentials, fluence, and VMI energy are derived, and then used in a virtual phantom experiment as an objective function for optimizing the tube settings and VMI energy to minimize the image noise and maximize the CNR. Results: It was shown that DECT measurements corresponding to kV settings that maximize the square of the Jacobian determinant over a domain of interest lead to improved stability of basis material reconstructions. Instances of non-uniqueness in DECT were addressed, focusing on scenarios where the Jacobian determinant becomes zero within the domain of interest despite significant spectral separation.
The presence of non-uniqueness can lead to singular solutions during the inversion of sinograms-to-DECT measurements, underscoring the importance of considering uniqueness properties in parameter selection. Additionally, the optimal VMI energy and tube potentials for maximal CNR were determined. When the x-ray beam filter material was fixed at 2 mm of aluminum and the photon fluences for the low- and high-kV scans were considered equal, the tube potential pair of 60/120 kV led to the maximal iodine CNR in the VMI at 53 keV. Conclusions: Optimizing DECT scan parameters to maximize the CNR can be done in a systematic way. Also, choosing the parameters that maximize the Jacobian determinant over the set of expected line integrals leads to more stable reconstructions due to the reduced amplification of the measurement noise. Since the values of the Jacobian determinant depend strongly on the imaging task, careful consideration of all of the relevant factors is needed when implementing the proposed framework.
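The Jacobian-determinant stability criterion can be sketched numerically: the map from basis-material line integrals (a1, a2) to the two spectral measurements is well-conditioned where |det J| is large, and the determinant collapses to zero when the two spectra carry the same information. All spectra and attenuation values below are toy assumptions, not calibrated tube data:

```python
import numpy as np

# Sketch of the stability criterion: finite-difference Jacobian of the
# (a1, a2) -> (lam_lo, lam_hi) measurements map. Toy two-bin spectra and
# attenuation coefficients, not calibrated values.

def lam(a1, a2, spectrum, mu1, mu2):
    """Expected counts for basis line integrals (a1, a2) under a spectrum."""
    return float(np.sum(spectrum * np.exp(-mu1 * a1 - mu2 * a2)))

def jacobian_det(a1, a2, spec_lo, spec_hi, mu1, mu2, eps=1e-5):
    """Central-difference determinant of the 2x2 Jacobian at (a1, a2)."""
    J = np.zeros((2, 2))
    for i, spec in enumerate((spec_lo, spec_hi)):
        J[i, 0] = (lam(a1 + eps, a2, spec, mu1, mu2)
                   - lam(a1 - eps, a2, spec, mu1, mu2)) / (2 * eps)
        J[i, 1] = (lam(a1, a2 + eps, spec, mu1, mu2)
                   - lam(a1, a2 - eps, spec, mu1, mu2)) / (2 * eps)
    return float(np.linalg.det(J))

mu1 = np.array([2.0, 0.5])        # basis 1 (bone-like) at two energies
mu2 = np.array([0.30, 0.17])      # basis 2 (water-like) at two energies
spec_lo = np.array([1e5, 1e4])    # low-kV spectrum, weighted to low energy
spec_hi = np.array([2e4, 1e5])    # high-kV spectrum, weighted to high energy
d_distinct = jacobian_det(1.0, 10.0, spec_lo, spec_hi, mu1, mu2)
d_same = jacobian_det(1.0, 10.0, spec_lo, spec_lo, mu1, mu2)
# Identical spectra make the two Jacobian rows coincide, so d_same ~ 0:
# the material decomposition is then non-unique, as the abstract describes.
```

Scanning such a determinant over the domain of expected line integrals is what the abstract's kV-optimization objective does in principle.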
