Nanoscale thermometry, an approach based on non-invasive yet precise measurements of temperature with nanometer spatial resolution, has emerged as a very active field of research over the last few years. In transmission electron microscopy, nanoscale thermometry is particularly important during in situ experiments and for assessing the effects of beam-induced heating. In this article, we present a nanoscale thermometry approach based on electron energy-loss spectroscopy in a transmission electron microscope that locally measures the temperature of silicon nanoparticles using the temperature-dependent energy shift of the plasmon resonance peak with respect to the zero-loss peak. We demonstrate that non-negative matrix factorization and curve fitting of stacked spectra improve the temperature accuracy significantly over previously reported manual fitting approaches. We discuss the acquisition parameters necessary to determine the plasmon peak position with a precision of 6 meV.
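As a rough illustration of this kind of pipeline, the sketch below applies scikit-learn's NMF to a synthetic stack of spectra and then fits the plasmon peak with SciPy. The toy data, the Lorentzian peak model, and all array names are assumptions for illustration, not the authors' implementation.

```python
# Sketch: denoise a stack of EEL spectra with NMF, then fit the plasmon
# peak to locate its energy. `spectra` (n_spectra x n_channels, positive
# counts) and `energy_eV` are placeholders standing in for real data.
import numpy as np
from sklearn.decomposition import NMF
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
energy_eV = np.linspace(10.0, 25.0, 512)           # energy-loss axis (eV)
true_peak = 16.7                                   # bulk Si plasmon ~16.7 eV
spectra = (1000 * np.exp(-0.5 * ((energy_eV - true_peak) / 1.8) ** 2)
           + rng.poisson(30, (200, energy_eV.size)))  # toy noisy stack

# NMF factorizes the stack into a few non-negative spectral components,
# suppressing uncorrelated noise in the reconstructed spectra.
nmf = NMF(n_components=2, init="nndsvda", max_iter=500)
weights = nmf.fit_transform(spectra)
denoised = weights @ nmf.components_

def lorentzian(E, amp, E0, gamma, bg):
    """Simple Lorentzian plus constant background as the peak model."""
    return amp * gamma**2 / ((E - E0) ** 2 + gamma**2) + bg

# Fit the summed denoised spectrum; E0 is the plasmon peak position.
p0 = [denoised.sum(axis=0).max(), 17.0, 2.0, 0.0]
popt, pcov = curve_fit(lorentzian, energy_eV, denoised.sum(axis=0), p0=p0)
print(f"plasmon peak at {popt[1]:.4f} eV, "
      f"fit uncertainty {np.sqrt(pcov[1, 1]) * 1e3:.1f} meV")
```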
Fast Zernike fitting of freeform surfaces using the Gauss-Legendre quadrature
Zernike polynomial orthogonality, an established mathematical principle, is leveraged together with the Gauss-Legendre quadrature rule in a novel, rapid approach to fitting data over a circular domain. This approach provides significantly faster fitting, on the order of thousands of times, while maintaining error rates comparable to those achieved with conventional least-squares fitting techniques. We demonstrate the technique by fitting the mid-spatial-frequency (MSF) errors prevalent in the small-tool manufacturing typical of aspheric and freeform optics, which are poised to soon permeate a wide range of optical technologies.
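The sketch below illustrates the underlying idea under standard Zernike conventions: orthogonality turns fitting into inner products, evaluated with Gauss-Legendre nodes in radius and uniform samples in azimuth. The radial-polynomial routine, normalization, and grid sizes are assumptions, not the paper's exact scheme.

```python
# Sketch: recover a Zernike coefficient by quadrature instead of least squares.
import numpy as np
from math import factorial

def radial(n, m, rho):
    """Standard Zernike radial polynomial R_n^m (m >= 0, n - m even)."""
    out = np.zeros_like(rho)
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k) * factorial((n + m) // 2 - k)
                * factorial((n - m) // 2 - k)))
        out += c * rho ** (n - 2 * k)
    return out

def zernike(n, m, rho, theta):
    """Unnormalized Zernike term: cosine branch for m >= 0, sine for m < 0."""
    return radial(n, abs(m), rho) * (np.cos(m * theta) if m >= 0
                                     else np.sin(-m * theta))

def fit_coeff(surface, n, m, n_rad=64, n_az=256):
    """Project `surface(rho, theta)` onto Z_n^m over the unit disk."""
    x, w = np.polynomial.legendre.leggauss(n_rad)   # nodes/weights on [-1, 1]
    rho = 0.5 * (x + 1.0)                           # map to [0, 1]
    w_rho = 0.5 * w * rho                           # include area element rho
    theta = 2.0 * np.pi * np.arange(n_az) / n_az    # uniform azimuthal grid
    R, T = np.meshgrid(rho, theta, indexing="ij")
    integrand = surface(R, T) * zernike(n, m, R, T)
    integral = (2.0 * np.pi / n_az) * np.sum(w_rho[:, None] * integrand)
    norm = np.pi * (2.0 if m == 0 else 1.0) / (2.0 * (n + 1.0))
    return integral / norm

# Toy check: surface built from Z_4^0 (spherical aberration) plus Z_3^1 (coma).
surf = lambda r, t: 0.8 * zernike(4, 0, r, t) + 0.3 * zernike(3, 1, r, t)
print(fit_coeff(surf, 4, 0))   # ~0.8
print(fit_coeff(surf, 3, 1))   # ~0.3
```

Because the radial integrand is a low-degree polynomial and the azimuthal integrand is a low-order trigonometric polynomial, both quadratures here are exact up to rounding, which is why the projection can replace an expensive least-squares solve.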
- PAR ID: 10507353
- Publisher / Repository: Optical Society of America
- Journal Name: Optics Express
- Volume: 32
- Issue: 11
- ISSN: 1094-4087; OPEXFF
- Format(s): Medium: X; Size: Article No. 20011
- Sponsoring Org: National Science Foundation
More Like this
SNP-set analysis is a powerful tool for dissecting the genetics of complex human diseases. There are three fundamental genetic association approaches to SNP-set analysis: the marginal model fitting approach, the joint model fitting approach, and the decorrelation approach. A problem of primary interest is how these approaches compare with each other. To address this problem, we develop a theoretical platform for comparing the signal-to-noise ratio (SNR) of these approaches under the generalized linear model. We elaborate on how causal genetic effects give rise to statistically detectable association signals, and show that when causal effects spread over blocks of strong linkage disequilibrium (LD), the SNR of the marginal model fitting is usually higher than that of the decorrelation approach, which in turn is higher than that of the unbiased joint model fitting approach. We also scrutinize dense effects and LD patterns using a bivariate model and extensive simulations based on 1000 Genomes Project data. Lastly, we compare the statistical power of two generic types of SNP-set tests (summation-based and supremum-based) through simulations and an osteoporosis study using large-scale data from the UK Biobank. Our results help in developing powerful tools for SNP-set analysis and in understanding the signal-detection problem in the presence of colored noise.
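A toy simulation can illustrate the central comparison (it is not the paper's theoretical platform): with dense causal effects spread over an equicorrelated LD block, marginal single-SNP statistics are typically much larger than the joint OLS statistics. The LD model, sample size, and effect sizes below are arbitrary illustrative choices.

```python
# Toy illustration: marginal statistics accumulate correlated signal across an
# LD block, while the unbiased joint fit pays a variance price to decorrelate.
import numpy as np

rng = np.random.default_rng(1)
n, p, ld, beta = 2000, 10, 0.8, 0.05
cov = ld * np.ones((p, p)) + (1 - ld) * np.eye(p)   # equicorrelated "LD block"
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
y = X @ (beta * np.ones(p)) + rng.standard_normal(n)  # dense causal effects

# Marginal: regress y on each SNP alone; z_j ~ sqrt(n) * corr(x_j, y).
z_marginal = np.array([np.sqrt(n) * np.corrcoef(X[:, j], y)[0, 1]
                       for j in range(p)])

# Joint: OLS over all SNPs; z_j = beta_hat_j / se(beta_hat_j).
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - p)
z_joint = beta_hat / np.sqrt(sigma2 * np.diag(XtX_inv))

print("mean |z|, marginal:", np.abs(z_marginal).mean())  # typically far larger
print("mean |z|, joint:   ", np.abs(z_joint).mean())
```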
Adults with mild-to-moderate hearing loss can use over-the-counter hearing aids to treat their hearing loss at a fraction of traditional hearing care costs. These products incorporate self-fitting methods that allow end-users to configure their hearing aids without the help of an audiologist. A self-fitting method helps users configure the gain-frequency responses that control the amplification for each frequency band of the incoming sound. This paper considers how to guide the design of self-fitting methods by evaluating aspects of their design with computational tools before performing user studies. Most existing fitting methods provide user interfaces that allow users to select a configuration from a predetermined set of presets. Accordingly, it is essential for the presets to meet the hearing needs of a large fraction of users, who suffer from varying degrees of hearing loss and have unique hearing preferences. To this end, we propose a novel metric for evaluating the effectiveness of preset-based approaches by computing their population coverage. The population coverage estimates the fraction of users for whom a self-fitting method can find a configuration they prefer. A unique aspect of our approach is a probabilistic model that captures how a user's unique preferences differ from those of other users with similar hearing loss. Next, we propose methods for building preset-based and slider-based self-fitting methods that maximize population coverage. Simulation results demonstrate that the proposed algorithms can effectively select a small number of presets that provide higher population coverage than clustering-based approaches. Moreover, our algorithms can be used to configure the number of increments of slider-based methods. We expect that the computational tools presented in this article will help reduce the cost of developing new self-fitting methods by allowing researchers to evaluate population coverage before performing user studies.
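A simplified sketch of the coverage idea follows: simulated preferred gain vectors, coverage defined by a tolerance around the nearest preset, and greedy maximum-coverage selection. The Gaussian preference model, band count, candidate pool, and tolerance are invented placeholders rather than the paper's calibrated model.

```python
# Sketch: estimate population coverage of a preset set and greedily pick
# presets that maximize it. The 4-band gains and 5 dB tolerance are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(2)
n_users, n_bands, tol_db = 5000, 4, 5.0

# Each user's preferred gain-frequency response: a loss-driven prescription
# plus an individual deviation (the "unique preference" term).
prescription = rng.uniform(10, 40, (n_users, n_bands))
preference = prescription + rng.normal(0, 3, (n_users, n_bands))

def coverage(presets, prefs, tol):
    """Fraction of users within `tol` dB (worst band) of some preset."""
    d = np.abs(prefs[:, None, :] - presets[None, :, :]).max(axis=2)
    return (d.min(axis=1) <= tol).mean()

def greedy_presets(candidates, prefs, k, tol):
    """Greedy max-coverage: repeatedly add the candidate preset that
    covers the most still-uncovered users."""
    chosen = []
    uncovered = np.ones(len(prefs), dtype=bool)
    for _ in range(k):
        gains = [np.sum(uncovered &
                        (np.abs(prefs - c).max(axis=1) <= tol))
                 for c in candidates]
        best = candidates[int(np.argmax(gains))]
        chosen.append(best)
        uncovered &= np.abs(prefs - best).max(axis=1) > tol
    return np.array(chosen)

candidates = rng.uniform(10, 40, (300, n_bands))     # candidate preset pool
presets = greedy_presets(candidates, preference, k=8, tol=tol_db)
print(f"coverage with 8 presets: {coverage(presets, preference, tol_db):.2%}")
```

Greedy selection is a natural fit here because coverage is a monotone submodular objective, for which the greedy rule carries a classical (1 - 1/e) approximation guarantee.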
It is increasingly common to encounter prediction tasks in the biomedical sciences for which multiple datasets are available for model training. Common approaches such as pooling datasets before model fitting can produce poor out-of-study prediction performance when datasets are heterogeneous. Theoretical and applied work has shown multistudy ensembling to be a viable alternative that leverages the variability across datasets in a manner that promotes model generalizability. Multistudy ensembling uses a two-stage stacking strategy which fits study-specific models and estimates ensemble weights separately. This approach ignores, however, the ensemble properties at the model-fitting stage, potentially resulting in performance losses. Motivated by challenges in the estimation of COVID-attributable mortality, we propose optimal ensemble construction, an approach to multistudy stacking whereby we jointly estimate ensemble weights and parameters associated with study-specific models. We prove that limiting cases of our approach yield existing methods such as multistudy stacking and pooling datasets before model fitting. We propose an efficient block coordinate descent algorithm to optimize the loss function. We use our method to perform multicountry COVID-19 baseline mortality prediction. We show that when little data is available for a country before the onset of the pandemic, leveraging data from other countries can substantially improve prediction accuracy. We further compare and characterize the method's performance in data-driven simulations and other numerical experiments. Our method remains competitive with or outperforms multistudy stacking and other earlier methods in the COVID-19 data application and in a range of simulation settings.
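The sketch below conveys the joint-estimation idea in a linear-model toy: block coordinate descent alternates between the ensemble weights and the study-specific coefficient blocks. The squared-error objective, ridge penalty, and unconstrained weights are simplifying assumptions; the paper's actual loss and updates may differ.

```python
# Sketch: jointly estimate ensemble weights w and study-specific coefficients
# beta_j by block coordinate descent on
#   sum_k || y_k - sum_j w_j X_k beta_j ||^2 + lam * sum_j ||beta_j||^2.
import numpy as np

rng = np.random.default_rng(3)
K, n, p, lam = 4, 150, 6, 1.0
# Heterogeneous studies: shared signal plus study-specific perturbations.
beta_true = [np.ones(p) + 0.5 * rng.standard_normal(p) for _ in range(K)]
Xs = [rng.standard_normal((n, p)) for _ in range(K)]
ys = [X @ b + rng.standard_normal(n) for X, b in zip(Xs, beta_true)]

# Initialize with per-study ridge fits; the first weight update then
# reproduces classical multistudy stacking, one limiting case of the method.
betas = np.array([np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
                  for X, y in zip(Xs, ys)])
w = np.full(K, 1.0 / K)

for it in range(50):
    # (1) Weight block: least squares of y on the studies' predictions.
    Z = np.column_stack([np.concatenate([X @ betas[j] for X in Xs])
                         for j in range(K)])
    y_all = np.concatenate(ys)
    w, *_ = np.linalg.lstsq(Z, y_all, rcond=None)

    # (2) Coefficient blocks: ridge update of beta_j against the partial
    # residuals, holding the other blocks and the weights fixed.
    for j in range(K):
        A = lam * np.eye(p)
        b = np.zeros(p)
        for X, y in zip(Xs, ys):
            r = y - X @ (betas.T @ w) + w[j] * (X @ betas[j])
            A += (w[j] ** 2) * X.T @ X
            b += w[j] * X.T @ r
        betas[j] = np.linalg.solve(A, b)

print("ensemble weights:", np.round(w, 3))
```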
A Maxwell relation for a reaction rate constant (or other dynamical timescale) obtained under constant pressure, p, and temperature, T, is introduced and discussed. Examination of this relationship in the context of fluctuation theory provides insight into the p and T dependence of the timescale and the underlying molecular origins. This Maxwell relation motivates a suggestion for the general form of the timescale as a function of pressure and temperature. This is illustrated by accurately fitting simulation results and existing experimental data on the self-diffusion coefficient and shear viscosity of liquid water. A key advantage of this approach is that each fitting parameter is physically meaningful.
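One plausible form of such a relation, reconstructed here from the equality of mixed partial derivatives and the standard definitions of activation energy and activation volume (the abstract does not state the exact identity, so this is an assumption):

```latex
% Treating ln k as a smooth function of (p, T), define as usual
%   \Delta V^{\ddagger} = -k_B T \,(\partial \ln k / \partial p)_T ,
%   E_a = k_B T^2 \,(\partial \ln k / \partial T)_p .
% Equality of mixed partials then yields a Maxwell-type identity:
\[
  \frac{\partial^2 \ln k}{\partial T\,\partial p}
  = \frac{\partial^2 \ln k}{\partial p\,\partial T}
  \quad\Longrightarrow\quad
  \left(\frac{\partial}{\partial T}\,
        \frac{\Delta V^{\ddagger}}{T}\right)_{\!p}
  = -\frac{1}{T^{2}}
    \left(\frac{\partial E_a}{\partial p}\right)_{\!T}.
\]
```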