Abstract The spectral model turbulence analysis technique is widely used to derive kinetic energy dissipation rates of turbulent structures (ɛ) from different in situ measurements in the Earth's atmosphere. The essence of this method is to fit a model spectrum to measured spectra of velocity or scalar-quantity fluctuations and thereby to derive ɛ from the wavenumber dependence of the turbulence spectra alone. Owing to the simplicity of the spectral model of Heisenberg (1948, https://doi.org/10.1007/bf01668899), its application dominates in the literature. Making use of direct numerical simulations, which are able to resolve turbulence spectra down to the smallest scales in the dissipation range, we advance the spectral model technique by quantifying uncertainties for two spectral models, the Heisenberg (1948) and the Tatarskii (1971) models, depending on (a) the resolution of the measurements, (b) the stage of turbulence evolution, and (c) the model used. We show that the model of Tatarskii (1971) can yield more accurate results and reveals higher sensitivity to the lowest ɛ values. This study shows that the spectral model technique can reliably derive ɛ if the measured spectra resolve only half a decade of power change within the viscous (viscous-convective) subrange. In summary, we give some practical recommendations on how to derive the most precise and detailed turbulence dissipation field from in situ measurements, depending on their quality. We also supply program code of the spectral models used in this study in Python, IDL, and Matlab.
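The authors supply their own spectral-model code in Python, IDL, and Matlab; as a rough, hedged illustration of the general fitting idea only (not the authors' implementation), the sketch below fits a Heisenberg-type model spectrum to a synthetic spectrum and converts the fitted transition wavenumber to a dissipation rate. The functional form, the inner-scale relation with its constant 9.90, and all numerical values are assumptions for illustration.

```python
import numpy as np

def heisenberg_spectrum(k, amp, k0):
    """Heisenberg-type model spectrum: an inertial k^(-5/3) range rolling
    off into the viscous subrange above the transition wavenumber k0."""
    return amp * k**(-5.0 / 3.0) / (1.0 + (k / k0)**(8.0 / 3.0))**2

def fit_k0(k, spec, k0_grid):
    """Grid-search fit in log space: for each trial k0 the amplitude has a
    closed-form optimum, so only k0 needs a one-dimensional search."""
    log_s = np.log(spec)
    best_cost, best_k0 = np.inf, None
    for k0 in k0_grid:
        shape = np.log(heisenberg_spectrum(k, 1.0, k0))
        log_amp = np.mean(log_s - shape)          # optimal amplitude for this k0
        cost = np.sum((log_s - log_amp - shape)**2)
        if cost < best_cost:
            best_cost, best_k0 = cost, k0
    return best_k0

# Synthetic "measured" spectrum with a known transition wavenumber
rng = np.random.default_rng(0)
k = np.logspace(0, 3, 200)                        # wavenumbers, illustrative units [1/m]
k0_true = 120.0
spec = heisenberg_spectrum(k, 3.0e-4, k0_true) * np.exp(0.05 * rng.standard_normal(k.size))

k0_fit = fit_k0(k, spec, np.logspace(1, 3, 400))

# Convert the fitted transition scale to a dissipation rate, assuming the
# inner-scale relation l0 = 9.90 * (nu^3 / eps)^(1/4) with l0 = 2*pi/k0
nu = 1.5e-5                                       # kinematic viscosity of air [m^2/s]
l0 = 2.0 * np.pi / k0_fit
eps = nu**3 * (9.90 / l0)**4
print(k0_fit, eps)
```

With noiseless input the grid search recovers k0 exactly up to grid resolution; the log-space residual makes the fit weight each decade of the spectrum evenly, which matters when only half a decade of the viscous roll-off is resolved.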
Contact.engineering—Create, analyze and publish digital surface twins from topography measurements across many scales
Abstract The optimization of surface finish to improve performance, such as adhesion, friction, wear, fatigue life, or interfacial transport, occurs largely through trial and error, despite significant advancements in the relevant science. There are three central challenges that account for this disconnect: (1) the challenge of integrating many different types of measurement for the same surface to capture the multi-scale nature of roughness; (2) the technical complexity of implementing spectral analysis methods, and of applying mechanical or numerical models to describe surface performance; (3) a lack of consistency between researchers and industries in how surfaces are measured, quantified, and communicated. Here we present a freely available internet-based application (available at https://contact.engineering) which attempts to overcome all three challenges. First, the application enables the user to upload many different topography measurements taken from a single surface, including measurements made with different techniques, and then integrates all of them together to create a digital surface twin. Second, the application calculates many of the commonly used topography metrics, such as root-mean-square parameters, power spectral density (PSD), and autocorrelation function (ACF), as well as implementing analytical and numerical calculations, such as boundary element modeling (BEM) for elastic and plastic deformation. Third, the application serves as a repository for users to securely store surfaces, and if they choose, to share these with collaborators or even publish them (with a digital object identifier) for all to access. The primary goal of this application is to enable researchers and manufacturers to quickly and easily apply cutting-edge tools for the characterization and properties-modeling of real-world surfaces.
An additional goal is to advance the use of open-science principles in surface engineering by providing a FAIR database where researchers can choose to publish surface measurements for all to use.
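As a minimal sketch of the simplest metrics named above (not the application's own implementation), the numpy snippet below computes the root-mean-square height, a power spectral density, and the autocorrelation function of a synthetic one-dimensional line scan. The PSD normalization shown is one of several conventions in use, and the synthetic input stands in for a real topography measurement.

```python
import numpy as np

# Synthetic 1-D line scan standing in for a topography measurement
rng = np.random.default_rng(1)
n, dx = 4096, 1.0e-8                   # number of samples and spacing [m], illustrative
heights = rng.standard_normal(n)
heights -= heights.mean()              # remove the mean plane first

# Root-mean-square height of the (demeaned) profile
h_rms = np.sqrt(np.mean(heights**2))

# Power spectral density over positive wavevectors (one normalization convention)
q = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)       # angular wavevectors [1/m]
psd = np.abs(np.fft.rfft(heights))**2 * dx / n

# Autocorrelation via the Wiener-Khinchin theorem: inverse FFT of the
# power spectrum gives the circular autocorrelation; acf[0] equals h_rms**2
acf = np.fft.irfft(np.abs(np.fft.rfft(heights))**2) / n
```

The identity `acf[0] == h_rms**2` (up to floating-point error) is a quick sanity check that the FFT normalization is consistent; real analyses additionally need windowing and tilt removal, which are omitted here.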
- Award ID(s):
- 1844739
- PAR ID:
- 10371556
- Publisher / Repository:
- IOP Publishing
- Date Published:
- Journal Name:
- Surface Topography: Metrology and Properties
- Volume:
- 10
- Issue:
- 3
- ISSN:
- 2051-672X
- Format(s):
- Medium: X
- Size(s):
- Article No. 035032
- Sponsoring Org:
- National Science Foundation
More Like this
-
Numerically generating synthetic surface topography that closely resembles the features and characteristics of experimental surface topography measurements reduces the need to perform these intricate and costly measurements. However, existing algorithms for numerically generating surface topography are not well suited to create the specific characteristics and geometric features of as-built surfaces that result from laser powder bed fusion (LPBF), such as partially melted metal particles, porosity, laser scan lines, and balling. Thus, we present a method to generate synthetic as-built LPBF surface topography maps using a progressively growing generative adversarial network. We qualitatively and quantitatively demonstrate good agreement between synthetic and experimental as-built LPBF surface topography maps using areal and deterministic surface topography parameters, radially averaged power spectral density, and material ratio curves. The ability to accurately generate synthetic as-built LPBF surface topography maps reduces the experimental burden of performing a large number of surface topography measurements. Furthermore, it facilitates combining experimental measurements with synthetic surface topography maps to create large data sets that enable, e.g., relating as-built surface topography to LPBF process parameters, or implementing digital surface twins to monitor complex end-use LPBF parts, amongst other applications.
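One of the comparison metrics named above, the radially averaged power spectral density, can be sketched in a few lines of numpy. This is an illustrative implementation under assumed conventions (the binning and normalization choices below are one common option, not necessarily the ones used in the paper), applied here to a random height map standing in for a measured or generated topography.

```python
import numpy as np

def radially_averaged_psd(height, dx):
    """Radially average the 2-D power spectrum of a square height map by
    binning wavevector magnitudes |q| onto integer rings; a common way to
    compare measured and synthetic topography maps."""
    n = height.shape[0]
    h = height - height.mean()                          # remove the mean plane
    psd2d = np.abs(np.fft.fft2(h))**2 * dx**2 / n**2    # one normalization convention
    qx = np.fft.fftfreq(n, d=dx)
    qmag = np.hypot(*np.meshgrid(qx, qx, indexing="ij"))
    rings = np.rint(qmag / qx[1]).astype(int)           # ring index in units of the q step
    counts = np.bincount(rings.ravel())
    sums = np.bincount(rings.ravel(), weights=psd2d.ravel())
    valid = counts > 0                                  # guard against empty rings
    return sums[valid] / counts[valid]                  # mean power per ring

# Random height map standing in for an as-built LPBF topography measurement
rng = np.random.default_rng(2)
topo = rng.standard_normal((128, 128))
psd_r = radially_averaged_psd(topo, dx=1.0e-6)          # 1 um pixel spacing, illustrative
```

Comparing `psd_r` curves for an experimental map and a GAN-generated map (on log-log axes) is one way to quantify whether the synthetic surface reproduces roughness across scales.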
-
Abstract We present νDoBe, a Python tool for the computation of neutrinoless double beta decay (0νββ) rates in terms of lepton-number-violating operators in the Standard Model Effective Field Theory (SMEFT). The tool can be used for automated calculations of 0νββ rates, electron spectra, and angular correlations for all isotopes of experimental interest, for lepton-number-violating operators up to and including dimension 9. The tool takes care of renormalization-group running to lower energies and provides the matching to the low-energy effective field theory and, at lower scales, to a chiral effective field theory description of 0νββ rates. The user can specify different sets of nuclear matrix elements from various many-body methods and hadronic low-energy constants. The tool can be used to quickly generate analytical and numerical expressions for 0νββ rates and to generate a large variety of plots. In this work, we provide examples of possible use along with a detailed code documentation. The code can be accessed through GitHub (https://github.com/OScholer/nudobe) or via an online user interface (https://oscholer-nudobe-streamlit-4foz22.streamlit.app/).
-
Abstract Gravitational waves (GWs) from merging compact objects encode direct information about the luminosity distance to the binary. When paired with a redshift measurement, this enables standard-siren cosmology: a Hubble diagram can be constructed to directly probe the Universe's expansion. This can be done in the absence of electromagnetic measurements, as features in the mass distribution of GW sources provide self-calibrating redshift measurements without the need for a definite or probabilistic host galaxy association. This "spectral siren" technique has thus far only been applied with simple parametric representations of the mass distribution, and theoretical predictions for features in the mass distribution are commonly presumed to be fundamental to the measurement. However, the use of an inaccurate representation leads to biases in the cosmological inference, an acute problem given the current uncertainties in the true source population. Furthermore, it is commonly presumed that the form of the mass distribution must be known a priori to obtain unbiased measurements of cosmological parameters in this fashion. Here, we demonstrate that spectral sirens can accurately infer cosmological parameters without such prior assumptions. We apply a flexible, nonparametric model for the mass distribution of compact binaries to a simulated catalog of 1,000 GW signals, consistent with expectations for the next LIGO–Virgo–KAGRA observing run. We find that, despite our model's flexibility, both the source mass model and cosmological parameters are correctly reconstructed. We predict an 11.2% measurement of H0, keeping all other cosmological parameters fixed, and a 6.4% measurement of H(z = 0.9) when fitting for multiple cosmological parameters (1σ uncertainties). This astrophysically agnostic spectral siren technique will be essential to arrive at precise and unbiased cosmological constraints from GW source populations.
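The core standard-siren idea has a simple low-redshift limit: the Hubble law gives d_L ≈ cz/H0, so each (distance, redshift) pair yields an estimate of H0. The toy sketch below illustrates only this limit with assumed mock numbers; it is not the paper's nonparametric population analysis, which jointly infers the mass distribution and cosmology.

```python
import numpy as np

# Toy standard-siren estimate in the low-redshift (linear Hubble law) limit
C_KM_S = 299792.458                    # speed of light [km/s]
H0_TRUE = 70.0                         # [km/s/Mpc], assumed for the mock catalog

rng = np.random.default_rng(3)
z = rng.uniform(0.01, 0.05, 500)       # mock low-redshift event redshifts
d_l = C_KM_S * z / H0_TRUE             # luminosity distances [Mpc] from the Hubble law...
d_l *= 1.0 + 0.1 * rng.standard_normal(z.size)   # ...with assumed 10% GW distance scatter

# Average the per-event estimates H0 = c z / d_L
h0_est = np.mean(C_KM_S * z / d_l)
print(h0_est)
```

Averaging the 500 noisy per-event estimates recovers a value near the input 70 km/s/Mpc; the spectral-siren technique generalizes this by letting features in the source mass distribution supply the redshift information instead of an external measurement.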
-
Summary High performance computing (HPC) has led to remarkable advances in science and engineering and has become an indispensable tool for research. Unfortunately, HPC use and adoption by many researchers is often hindered by the complex way these resources are accessed. Indeed, while the web has become the dominant access mechanism for remote computing services in virtually every computing area, HPC is a notable exception. Open OnDemand is an open source project countering this trend by providing web-based access to HPC resources (https://openondemand.org). This article describes the challenges to adoption and other lessons learned over the 3-year project that may be relevant to other science gateway projects. We end with a description of the project team's future plans for the Open OnDemand 2.0 project, including specific developments in machine learning and GPU monitoring.
