-
In fluid physics, data-driven models that enhance solutions or accelerate time to solution are becoming increasingly popular across many application domains, for example as alternatives to turbulence closures, as system surrogates, or for new physics discovery. In the context of reduced-order models of high-dimensional, time-dependent fluid systems, machine learning methods offer automated learning from data, but the quality of a model rests on its reduced-order representation of both the fluid state and the physical dynamics. In this work, we build a physics-constrained, data-driven reduced-order model for the Navier–Stokes equations to approximate spatiotemporal fluid dynamics in the canonical case of isotropic turbulence in a triply periodic box. The model design choices mimic numerical and physical constraints by, for example, implicitly enforcing the incompressibility constraint and using continuous neural ordinary differential equations to track the evolution of the governing differential equation. We demonstrate this technique on a three-dimensional, moderate-Reynolds-number turbulent flow. Assessing the statistical quality and characteristics of the machine-learned model through rigorous diagnostic tests, we find that the model can reconstruct the dynamics of the flow over large integral timescales, favoring accuracy at the larger length scales. More significantly, comprehensive diagnostics suggest that physically interpretable model parameters, corresponding to the representations of the fluid state and dynamics, have an attributable and quantifiable impact on the quality of the model predictions and on computational complexity.
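The general idea can be illustrated with a minimal latent neural-ODE sketch (a hypothetical illustration, not the authors' implementation): an encoder compresses a flow snapshot into a latent vector, a small network defines the latent time derivative, and a classical Runge–Kutta integrator rolls the state forward. The layer sizes and all names below are assumptions; the encoder/decoder, the incompressibility enforcement, and the training loop are omitted.

# Hypothetical sketch of a latent neural-ODE reduced-order model (not the paper's code).
import torch
import torch.nn as nn

class LatentDynamics(nn.Module):
    """MLP giving dz/dt for a reduced-order latent state z."""
    def __init__(self, latent_dim=64, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, z):
        return self.net(z)

def rk4_rollout(f, z0, dt, n_steps):
    """Integrate dz/dt = f(z) with classical fourth-order Runge-Kutta."""
    traj = [z0]
    z = z0
    for _ in range(n_steps):
        k1 = f(z)
        k2 = f(z + 0.5 * dt * k1)
        k3 = f(z + 0.5 * dt * k2)
        k4 = f(z + dt * k3)
        z = z + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(z)
    return torch.stack(traj)

# Usage: encode a velocity field into z0 with a learned encoder (omitted here),
# roll the latent state forward in time, then decode back to physical space.
f = LatentDynamics()
z0 = torch.randn(1, 64)            # stand-in for an encoded flow snapshot
trajectory = rk4_rollout(f, z0, dt=1e-2, n_steps=100)
print(trajectory.shape)            # (101, 1, 64)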
-
Recently, superhydrides have been computationally identified and subsequently synthesized with a variety of metals at very high pressures. In this work, we evaluate the possibility of synthesizing superhydrides by uniquely combining electrochemistry and applied pressure. We perform computational searches using density functional theory and particle swarm optimization calculations over a broad range of pressures and electrode potentials. Using a thermodynamic analysis, we construct pressure–potential phase diagrams and propose an alternate synthesis concept, pressure–potential, to access phases with high hydrogen content. Palladium–hydrogen is a widely studied material system, with Pd3H4 being the highest known hydride phase. Most strikingly for this system, at potentials above hydrogen evolution and ∼300 MPa pressure, we find that palladium superhydrides (e.g., PdH10) become accessible. We predict that the approach generalizes to La-H, Y-H, and Mg-H, with a 10- to 100-fold reduction in the pressure required to stabilize these phases. In addition, the strategy stabilizes phases that cannot be reached by pressure or potential alone and is a general approach likely to apply to the synthesis of other hydrides at modest pressures.
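As a rough sketch of how a pressure–potential stability map can be assembled (a generic grand-potential screen, not the authors' workflow), candidate phases can be ranked on a grid of pressure P and potential U using a hydrogen chemical potential of the form mu_H ≈ ½E(H2) + ½kT·ln(P/P0) − eU, treating the hydrogen reservoir as compressed H2 for simplicity and adding a PV enthalpy term. Every energy, volume, and phase entry below is a placeholder.

# Hypothetical sketch: rank candidate Pd-H phases on a pressure-potential grid.
import numpy as np

kT = 0.0257                    # eV at room temperature
E_H2 = -6.77                   # placeholder DFT energy of an H2 molecule (eV)
P0 = 1.0e5                     # reference pressure (Pa)
EV_PER_PA_A3 = 6.241509e-12    # converts Pa*Angstrom^3 to eV

# Placeholder candidates: energy per formula unit (eV), volume (A^3), H count.
phases = {
    "Pd":    {"E": -5.2,  "V": 15.0, "nH": 0},
    "PdH":   {"E": -8.5,  "V": 17.0, "nH": 1},
    "PdH10": {"E": -34.6, "V": 40.0, "nH": 10},
}

def gibbs(phase, P, U):
    """Approximate G = E + P*V - nH*mu_H(P, U) per formula unit, in eV."""
    mu_H = 0.5 * E_H2 + 0.5 * kT * np.log(P / P0) - U   # -eU with U in volts
    return phase["E"] + P * phase["V"] * EV_PER_PA_A3 - phase["nH"] * mu_H

pressures = np.logspace(5, 9, 5)         # 0.1 MPa to 1 GPa
potentials = np.linspace(-0.5, 0.5, 5)   # V vs. the hydrogen reference
for P in pressures:
    for U in potentials:
        stable = min(phases, key=lambda name: gibbs(phases[name], P, U))
        print(f"P = {P:9.1e} Pa, U = {U:+.2f} V -> most stable: {stable}")

-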
Machine-learning potentials are accelerating the development of energy materials, especially in identifying phase diagrams and other thermodynamic properties. In this work, we present a neural network potential based on atom-centered symmetry function descriptors to model the energetics of lithium intercalation into graphite. The potential was trained on a dataset of over 9000 diverse lithium–graphite configurations that varied in applied stress and strain, lithium concentration, lithium–carbon and lithium–lithium bond distances, and stacking order, to ensure wide sampling of the atomic configurations possible during intercalation. We calculated the energies of these structures using density functional theory (DFT) with the Bayesian error estimation functional with van der Waals correlation (BEEF-vdW) exchange-correlation functional, which accurately describes the van der Waals interactions that are crucial to the thermodynamics of this phase space. Bayesian optimization, as implemented in Dragonfly, was used to select an optimal set of symmetry-function parameters, ultimately resulting in a potential with a prediction error of 8.24 meV atom⁻¹ on unseen test data. The potential can predict energies, structural properties, and elastic constants at an accuracy comparable to other DFT exchange-correlation functionals at a fraction of the computational cost. Its accuracy is also comparable to that of similar machine-learned potentials describing other systems. We calculate the open-circuit voltage with the trained potential and find good agreement with experiment, especially in the regime x ≥ 0.3, for x in LixC6. This study further illustrates the power of machine-learning potentials, which promise to revolutionize the design and optimization of battery materials.
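For context on the open-circuit-voltage calculation mentioned above (a generic sketch, not the authors' code), the average voltage between two lithiation states x1 < x2 in LixC6 can be estimated from total energies as V = -[E(Lix2C6) - E(Lix1C6) - (x2 - x1)·E(Li metal)] / (x2 - x1), with energies per C6 formula unit in eV. The numbers below are placeholders; in practice the energies would come from the machine-learned potential or DFT.

# Hypothetical sketch: average open-circuit voltage between two LixC6 states.
def average_ocv(E_x1, E_x2, x1, x2, E_Li_metal):
    """Average voltage (V) vs. Li/Li+ between compositions x1 < x2 in LixC6.

    Energies are in eV per C6 formula unit; with charge e = 1 in these units,
    the reaction energy in eV per transferred Li equals the voltage in volts.
    """
    dE = E_x2 - E_x1 - (x2 - x1) * E_Li_metal
    return -dE / (x2 - x1)

# Placeholder energies (eV per C6 formula unit) from a potential or DFT.
E_C6 = -55.0       # x = 0.0, pristine graphite
E_LiC6 = -57.1     # x = 1.0, stage-I LiC6
E_Li_bcc = -1.9    # Li metal reference, eV per atom

print(f"Average OCV over 0 <= x <= 1: {average_ocv(E_C6, E_LiC6, 0.0, 1.0, E_Li_bcc):.2f} V")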