

Title: Accounting for Location Measurement Error in Imaging Data With Application to Atomic Resolution Images of Crystalline Materials
Scientists use imaging to identify objects of interest and infer properties of these objects. The locations of these objects are often measured with error, which, when ignored, leads to biased parameter estimates and inflated variance. Current measurement error methods require an estimate or knowledge of the measurement error variance to correct these estimates, which may not be available. Instead, we create a spatial Bayesian hierarchical model that treats the locations as parameters, using the image itself to incorporate positional uncertainty. We lower the computational burden by approximating the likelihood using a noncontiguous block design around the object locations. We use this model to quantify the relationship between the intensity and displacement of hundreds of atom columns in crystal structures directly imaged via scanning transmission electron microscopy (STEM). Atomic displacements are related to important phenomena such as piezoelectricity, a property useful for engineering applications like ultrasound. Quantifying the sign and magnitude of this relationship will help materials scientists more precisely design materials with improved piezoelectricity. A simulation study confirms our method corrects bias in the estimate of the parameter of interest and drastically improves coverage in high noise scenarios compared to non-measurement error models.
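A minimal sketch of the block-likelihood idea, not the paper's implementation: atom-column locations enter the model as unknown parameters, and a Gaussian-noise likelihood for the image is evaluated only on small, noncontiguous pixel blocks centered on those locations. The Gaussian-peak image model, the block half-width, and all names below are illustrative assumptions.

import numpy as np
from scipy.stats import norm

def block_log_likelihood(image, locs, intensities, peak_sd, noise_sd, half=5):
    """Approximate log-likelihood of the image using pixel blocks around each location.

    image       : 2D array of observed pixel intensities
    locs        : (n, 2) array of atom-column (row, col) locations, treated as parameters
    intensities : (n,) peak intensities of the columns
    peak_sd     : spread of each (assumed) Gaussian-shaped column
    noise_sd    : pixel noise standard deviation
    half        : half-width of the block retained around each location
    """
    loglik = 0.0
    for (r, c), a in zip(locs, intensities):
        r0, r1 = int(max(r - half, 0)), int(min(r + half + 1, image.shape[0]))
        c0, c1 = int(max(c - half, 0)), int(min(c + half + 1, image.shape[1]))
        rows, cols = np.mgrid[r0:r1, c0:c1]
        mean = a * np.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2 * peak_sd ** 2))
        loglik += norm.logpdf(image[r0:r1, c0:c1], loc=mean, scale=noise_sd).sum()
    return loglik

# Toy usage: simulate a noisy image with two atom columns and evaluate the block likelihood.
rng = np.random.default_rng(0)
truth = np.array([[20.0, 20.0], [20.0, 40.0]])
amps = np.array([1.0, 0.8])
rows, cols = np.mgrid[0:64, 0:64]
clean = sum(a * np.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2 * 2.0 ** 2))
            for (r, c), a in zip(truth, amps))
image = clean + rng.normal(scale=0.1, size=clean.shape)
print(block_log_likelihood(image, truth, amps, peak_sd=2.0, noise_sd=0.1))

In a full Bayesian fit the locations and intensities would be sampled jointly (for example by MCMC) under an approximate likelihood of this kind, so positional uncertainty propagates into the intensity-displacement relationship of interest.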
Award ID(s):
1633587
NSF-PAR ID:
10295722
Author(s) / Creator(s):
Date Published:
Journal Name:
Technometrics
ISSN:
0040-1706
Page Range / eLocation ID:
1 to 11
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Optimal designs minimize the number of experimental runs (samples) needed to accurately estimate model parameters, resulting in algorithms that, for instance, efficiently minimize parameter estimate variance. Governed by knowledge of past observations, adaptive approaches adjust sampling constraints online as model parameter estimates are refined, continually maximizing expected information gained or variance reduced. We apply adaptive Bayesian inference to estimate transition rates of Markov chains, a common class of models for stochastic processes in nature. Unlike most previous studies, our sequential Bayesian optimal design is updated with each observation and can be simply extended beyond two-state models to birth–death processes and multistate models. By iteratively finding the best time to obtain each sample, our adaptive algorithm maximally reduces variance, resulting in lower overall error in ground truth parameter estimates across a wide range of Markov chain parameterizations and conformations. 
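A minimal sketch of the sequential design idea, not the authors' algorithm: the transition rate k of a two-state chain gets a discretized (grid) posterior, each observation records whether the transition has occurred by a chosen time t (Bernoulli with probability 1 - exp(-k t)), and the next sample time is the one maximizing the posterior-expected Fisher information about k. The grid, the candidate times, and the true rate are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
k_true = 0.7                                          # hypothetical ground-truth rate
k_grid = np.linspace(0.01, 5.0, 500)                  # discretized parameter space
posterior = np.full(k_grid.size, 1.0 / k_grid.size)   # flat prior
t_candidates = np.linspace(0.05, 10.0, 200)

def fisher_info(k, t):
    """Fisher information about k from one Bernoulli check at time t."""
    return t ** 2 * np.exp(-k * t) / (1.0 - np.exp(-k * t))

for step in range(25):
    # Expected information of each candidate time under the current posterior.
    expected_info = np.array([np.sum(posterior * fisher_info(k_grid, t))
                              for t in t_candidates])
    t_next = t_candidates[np.argmax(expected_info)]

    # Simulate the observation and update the posterior with Bayes' rule.
    y = rng.random() < 1.0 - np.exp(-k_true * t_next)
    p = 1.0 - np.exp(-k_grid * t_next)
    likelihood = p if y else 1.0 - p
    posterior = posterior * likelihood
    posterior /= posterior.sum()

k_hat = np.sum(posterior * k_grid)
sd = np.sqrt(np.sum(posterior * (k_grid - k_hat) ** 2))
print(f"posterior mean {k_hat:.3f} +/- {sd:.3f} (true {k_true})")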
  2. Abstract

    Multispecies occupancy models estimate dependence among multiple species of interest from patterns of co‐occurrence, but problems associated with separation and boundary estimates can lead to unreasonably large estimates of parameters and associated standard errors when species are rarely observed at the same site or when data are sparse. In this paper, we overcome these issues by implementing a penalized likelihood, which introduces a small bias in parameter estimates in exchange for a potentially large reduction in variance. We compare parameter estimates obtained from both penalized and unpenalized multispecies occupancy models fit to simulated data that exhibit various degrees of separation and to a real‐world data set of bird surveys with little apparent overlap between potentially interacting species. Our simulation results demonstrate that penalized multispecies occupancy models did not exhibit boundary estimates and produced lower bias, lower mean squared error, and improved inference relative to unpenalized models. When applied to real‐world data, our penalized multispecies occupancy model constrained boundary estimates and allowed for meaningful inference related to the interactions of two species of conservation concern. To facilitate the use of our penalized multispecies occupancy model, the techniques demonstrated in this paper have been integrated into the unmarked package in the R programming language.
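A minimal illustration of the penalized-likelihood idea, not the occupancy model implemented in unmarked: with completely separated detection data the unpenalized estimate of a co-occurrence coefficient drifts toward the boundary, while a small quadratic (ridge) penalty accepts a little bias in exchange for a finite, stable estimate. The toy data and the penalty weight lam are assumptions.

import numpy as np
from scipy.optimize import minimize

# x = 1 if species A was detected at a site, y = 1 if species B was detected there.
x = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)   # perfectly separated

def neg_penalized_loglik(beta, lam):
    """Negative Bernoulli log-likelihood plus a quadratic (ridge) penalty."""
    b0, b1 = beta
    eta = b0 + b1 * x
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    return -loglik + lam * (b0 ** 2 + b1 ** 2) / 2.0

for lam in (0.0, 1.0):
    fit = minimize(neg_penalized_loglik, x0=np.zeros(2), args=(lam,), method="BFGS")
    print(f"lambda = {lam}: slope estimate = {fit.x[1]:.2f}")
# With lambda = 0 the slope drifts toward the boundary (it grows without bound as the
# tolerance tightens); with lambda = 1 it stays finite and modest.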

     
  3. Purpose

    To improve the performance of neural networks for parameter estimation in quantitative MRI, in particular when the noise propagation varies throughout the space of biophysical parameters.

    Theory and Methods

    A theoretically well‐founded loss function is proposed that normalizes the squared error of each estimate by the respective Cramér–Rao bound (CRB), a theoretical lower bound for the variance of an unbiased estimator. This prevents hard‐to‐estimate parameters and regions of parameter space, which are often of little interest, from dominating the loss. The normalization with the corresponding CRB balances the large errors of fundamentally noisier estimates and the small errors of fundamentally less noisy estimates, allowing the network to better learn to estimate the latter. Further, the proposed loss function provides an absolute evaluation metric for performance: a network has an average loss of 1 if it is a maximally efficient unbiased estimator, which can be considered ideal performance. The performance gain with the proposed loss function is demonstrated using an eight‐parameter magnetization transfer model fitted to phantom and in vivo data.
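A minimal sketch of the CRB-normalized loss described above, written with NumPy outside a training loop for clarity; the array names, shapes, and toy numbers are assumptions. For a maximally efficient unbiased estimator the expected value of this loss is 1.

import numpy as np

def crb_normalized_loss(theta_hat, theta_true, crb):
    """Mean squared error of each parameter estimate divided by its Cramer-Rao bound.

    theta_hat, theta_true, crb : arrays of shape (batch, n_params); crb holds the
    CRB of each parameter evaluated at the true parameter values.
    """
    return np.mean((theta_hat - theta_true) ** 2 / crb)

# Toy check: an estimator whose error variance equals the CRB scores about 1, and the
# noisier parameter no longer dominates the average because its error is normalized away.
rng = np.random.default_rng(2)
theta_true = np.tile([1.0, 50.0], (10000, 1))           # well- and poorly-scaled parameters
crb = np.tile([0.01, 25.0], (10000, 1))                 # assumed per-parameter bounds
theta_hat = theta_true + rng.normal(size=theta_true.shape) * np.sqrt(crb)
print(crb_normalized_loss(theta_hat, theta_true, crb))  # close to 1
print(np.mean((theta_hat - theta_true) ** 2))           # plain MSE, dominated by the noisy parameter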

    Results

    Networks trained with the proposed loss function perform close to optimally, that is, their loss converges to approximately 1, and their performance is superior to that of networks trained with the standard mean‐squared error (MSE) loss. The proposed loss function reduces the bias of the estimates compared to the MSE loss and improves the match between the noise variance and the CRB. This performance gain translates to in vivo maps that align better with the literature.

    Conclusion

    Normalizing the squared error with the CRB during the training of neural networks improves their performance in estimating biophysical parameters.

     
  4. We review select mature geomorphic transport laws for use in temperate ridge and valley landscapes and compile parameter estimates for use in applications. This work is motivated by a case study of sensitivity analysis, calibration, validation, multimodel comparison, and prediction under uncertainty, which required bounding values for parameter ranges. Considered geomorphic transport formulae span hillslope sediment transport, soil production, and erosion by surface water. We compile or derive estimates for the parameters in these transport formulae. Additionally, we address a common challenge—connecting changes in precipitation distribution to changes in effective erodibility—by using a simple hydrologic model and a method to estimate precipitation distribution parameters using commonly available data. While some parameters are reasonably well constrained, others span orders of magnitude. Some, such as soil infiltration capacity, have a direct physical meaning but are challenging to measure on geologically relevant timescales. Through the process of compiling these ranges we identify common challenges in parameter determination. The issue of comparable units derives from considering an exponent as an empirically inferred coefficient rather than as an expression of a fundamental relationship. The issue of appropriate timescales derives from the mismatch between human measurement and geologic timescales. This contribution thus serves both as a practical compilation for applications and as a synthesis of outstanding challenges in parameter selection for geomorphic transport laws.
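As one concrete example of estimating precipitation distribution parameters from commonly available data, here is a sketch under assumptions, not the authors' procedure: a daily record can be summarized by a wet-day frequency and a mean wet-day depth, the latter fixing the rate of an assumed exponential distribution of wet-day depths. The wet-day threshold, the units (mm/day), and the synthetic record are illustrative.

import numpy as np

def precip_distribution_params(daily_mm, wet_threshold=1.0):
    """Return (wet-day frequency, mean wet-day depth in mm) from a daily record."""
    daily_mm = np.asarray(daily_mm, dtype=float)
    wet = daily_mm > wet_threshold
    wet_frequency = wet.mean()               # fraction of days with measurable rain
    mean_wet_depth = daily_mm[wet].mean()    # exponential rate would be 1 / mean_wet_depth
    return wet_frequency, mean_wet_depth

# Toy usage on a synthetic 10-year daily record.
rng = np.random.default_rng(3)
days = 3650
wet_days = rng.random(days) < 0.3
record = np.where(wet_days, rng.exponential(scale=8.0, size=days), 0.0)
print(precip_distribution_params(record))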

     
  5. Abstract

    Calibration of computer models and the use of those models for design are two activities traditionally carried out separately. This paper generalizes existing Bayesian inverse analysis approaches for computer model calibration to present a methodology combining calibration and design in a unified Bayesian framework. This provides a computationally efficient means to undertake both tasks while quantifying all relevant sources of uncertainty. Specifically, compared with the traditional approach of design using parameter estimates from previously completed model calibration, this generalized framework inherently includes uncertainty from the calibration process in the design procedure. We demonstrate our approach on the design of a vibration isolation system. We also demonstrate how, when adaptive sampling of the phenomenon of interest is possible, the proposed framework may select new sampling locations using both available real observations and the computer model. This is especially useful when a misspecified model fails to reflect that the calibration parameter is functionally dependent upon the design inputs to be optimized.
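A minimal sketch of the unified calibration-and-design idea, not the paper's framework: a grid posterior over a calibration parameter theta is computed from field data through a toy computer model, and the design input x is then chosen to minimize the posterior-expected objective rather than the objective at a single plug-in estimate, so calibration uncertainty carries into the design. The model f, the objective, the data, and the noise level are all illustrative assumptions.

import numpy as np

def f(x, theta):
    """Toy computer model: response at design input x with calibration parameter theta."""
    return np.sin(theta * x) + 0.1 * x

# Field observations at a few design inputs (theta_true is unknown to the analysis).
rng = np.random.default_rng(4)
theta_true, noise_sd = 1.3, 0.05
x_obs = np.array([0.5, 1.0, 1.5, 2.0])
y_obs = f(x_obs, theta_true) + rng.normal(scale=noise_sd, size=x_obs.size)

# Calibration: grid posterior over theta under a Gaussian error model and a flat prior.
theta_grid = np.linspace(0.5, 2.5, 400)
resid = y_obs[None, :] - f(x_obs[None, :], theta_grid[:, None])
log_post = -0.5 * np.sum(resid ** 2, axis=1) / noise_sd ** 2
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Design: minimize the response over candidate x, averaging the objective over the
# theta posterior instead of plugging in a single point estimate.
x_cand = np.linspace(0.0, 4.0, 400)
expected_obj = post @ f(x_cand[None, :], theta_grid[:, None])
theta_hat = post @ theta_grid
print("design accounting for calibration uncertainty:", x_cand[np.argmin(expected_obj)])
print("plug-in design at the posterior mean:", x_cand[np.argmin(f(x_cand, theta_hat))])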