


Title: Uncertainty quantification in molecular simulations with dropout neural network potentials
Abstract

Machine learning interatomic potentials (IPs) can provide accuracy close to that of first-principles methods, such as density functional theory (DFT), at a fraction of the computational cost. This greatly extends the scope of accurate molecular simulations, providing opportunities for quantitative design of materials and devices on scales hitherto unreachable by DFT methods. However, machine learning IPs have a basic limitation in that they lack a physical model for the phenomena being predicted and therefore have unknown accuracy when extrapolating outside their training set. In this paper, we propose a class of Dropout Uncertainty Neural Network (DUNN) potentials that provide rigorous uncertainty estimates that can be understood from both Bayesian and frequentist statistics perspectives. As an example, we develop a DUNN potential for carbon and show how it can be used to predict uncertainty for static and dynamical properties, including stress and phonon dispersion in graphene. We demonstrate two approaches to propagate uncertainty in the potential energy and atomic forces to predicted properties. In addition, we show that DUNN uncertainty estimates can be used to detect configurations outside the training set, and in some cases, can serve as a predictor for the accuracy of a calculation.
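The dropout-based uncertainty estimate described in the abstract can be sketched as follows. This is a minimal illustration, not the published DUNN architecture: the tiny one-hidden-layer network, its random weights, and the dropout rate are all hypothetical stand-ins. The key idea is that keeping dropout active at inference time turns one trained network into a stochastic ensemble, whose spread serves as the uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network standing in for a trained DUNN potential.
# W1, b1, W2 are hypothetical fitted weights, not the published model.
W1 = rng.normal(size=(8, 3))
b1 = np.zeros(8)
W2 = rng.normal(size=8)

def energy_with_dropout(x, p=0.1):
    """One stochastic forward pass: drop hidden units with probability p."""
    h = np.tanh(W1 @ x + b1)
    mask = rng.random(h.shape) >= p
    h = h * mask / (1.0 - p)  # inverted-dropout rescaling
    return float(W2 @ h)

def predict_with_uncertainty(x, n_samples=200):
    """Mean energy and standard deviation over dropout ensemble members."""
    samples = np.array([energy_with_dropout(x) for _ in range(n_samples)])
    return samples.mean(), samples.std()

mean_e, std_e = predict_with_uncertainty(np.array([0.5, -0.2, 1.0]))
```

The same sampling can be applied to forces or any derived property, which is one of the two propagation routes the abstract mentions.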

Award ID(s):
1834251 1931304
PAR ID:
10183948
Author(s) / Creator(s):
Publisher / Repository:
Nature Publishing Group
Date Published:
Journal Name:
npj Computational Materials
Volume:
6
Issue:
1
ISSN:
2057-3960
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Neural network potentials (NNPs) trained against density functional theory (DFT) are capable of reproducing the potential energy surface at a fraction of the computational cost. However, most NNP implementations focus on energy and forces. In this work, we modified the NNP model introduced by Behler and Parrinello to predict Fermi energy, band edges, and partial density of states of Cu2O. Our NNP can reproduce the DFT potential energy surface and properties at a fraction of the computational cost. We used our NNP to perform molecular dynamics (MD) simulations and validated the predicted properties against DFT calculations. Our model achieved a root mean squared error of 16 meV for the energy prediction. Furthermore, we show that the standard deviation of the energies predicted by the ensemble of training snapshots can be used to estimate the uncertainty in the predictions. This allows us to switch from the NNP to DFT on-the-fly during the MD simulation to evaluate the forces when the uncertainty is high.
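    The on-the-fly switching described above can be sketched as below. Everything here is a hypothetical placeholder: the ensemble members are toy surrogates rather than trained NNPs, and the threshold `tol` is an illustrative value, not one from the paper. The pattern is simply that when the ensemble spread exceeds the threshold, the expensive reference method is called instead of the cheap model:

    ```python
    import numpy as np

    # Stand-ins: each "model" mimics an NNP trained on a different snapshot
    # of the training data; here they are toy quadratic surrogates.
    def make_model(seed):
        w = np.random.default_rng(seed).normal(scale=0.02, size=3)
        return lambda pos: float(pos @ pos + w @ pos)

    models = [make_model(s) for s in range(5)]

    def nnp_forces(pos):
        return -2.0 * pos  # analytic force of the toy surrogate

    def dft_forces(pos):
        return -2.0 * pos  # placeholder for an expensive DFT call

    def forces_on_the_fly(pos, tol=0.05):
        """Use NNP forces unless ensemble energy spread exceeds `tol` (eV)."""
        energies = np.array([m(pos) for m in models])
        if energies.std() > tol:  # high uncertainty: fall back to DFT
            return dft_forces(pos)
        return nnp_forces(pos)

    f = forces_on_the_fly(np.array([0.1, 0.0, -0.2]))
    ```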
  2. Abstract

    Machine-learning potentials are accelerating the development of energy materials, especially in identifying phase diagrams and other thermodynamic properties. In this work, we present a neural network potential based on atom-centered symmetry function descriptors to model the energetics of lithium intercalation into graphite. The potential was trained on a dataset of over 9000 diverse lithium–graphite configurations that varied in applied stress and strain, lithium concentration, lithium–carbon and lithium–lithium bond distances, and stacking order to ensure wide sampling of the potential atomic configurations during intercalation. We calculated the energies of these structures using density functional theory (DFT) through the Bayesian error estimation functional with van der Waals correlation (BEEF-vdW) exchange-correlation functional, which can accurately describe the van der Waals interactions that are crucial to determining the thermodynamics of this phase space. Bayesian optimization, as implemented in Dragonfly, was used to select an optimal set of symmetry function parameters, ultimately resulting in a potential with a prediction error of 8.24 meV atom−1 on unseen test data. The potential can predict energies, structural properties, and elastic constants at an accuracy comparable to other DFT exchange-correlation functionals at a fraction of the computational cost. The accuracy of the potential is also comparable to that of similar machine-learned potentials describing other systems. We calculate the open circuit voltage with the potential and find good agreement with experiment, especially in the regime x ≥ 0.3, for x in LixC6. This study further illustrates the power of machine learning potentials, which promise to revolutionize the design and optimization of battery materials.
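    The open-circuit-voltage calculation mentioned above follows the standard thermodynamic route: the average voltage between two compositions comes from the difference in total energies, referenced to bulk Li metal. A minimal sketch of that bookkeeping, with all inputs supplied by the user (none of the numbers below come from the paper):

    ```python
    def avg_ocv(e_x2, e_x1, x2, x1, e_li_metal):
        """Average open-circuit voltage (V) between compositions x1 < x2
        in LixC6, given total energies per C6 formula unit in eV and the
        energy of bulk Li metal per atom. With energies in eV, the
        elementary charge cancels, so the result is directly in volts."""
        return -(e_x2 - e_x1 - (x2 - x1) * e_li_metal) / (x2 - x1)
    ```

    Evaluating this over a staircase of stable intercalation phases gives the voltage profile as a function of x.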

  3. Machine learning potentials (MLPs) for atomistic simulations have an enormous prospective impact on materials modeling, offering orders of magnitude speedup over density functional theory (DFT) calculations without appreciably sacrificing accuracy in the prediction of material properties. However, the generation of large datasets needed for training MLPs is daunting. Herein, we show that MLP-based material property predictions converge faster with respect to precision for Brillouin zone integrations than DFT-based property predictions. We demonstrate that this phenomenon is robust across material properties for different metallic systems. Further, we provide statistical error metrics to accurately determine a priori the precision level required of DFT training datasets for MLPs to ensure accelerated convergence of material property predictions, thus significantly reducing the computational expense of MLP development. 
  4. In this paper, we consider the problem of quantifying parametric uncertainty in classical empirical interatomic potentials (IPs) using both Bayesian (Markov Chain Monte Carlo) and frequentist (profile likelihood) methods. We interface these tools with the Open Knowledgebase of Interatomic Models and study three models based on the Lennard-Jones, Morse, and Stillinger–Weber potentials. We confirm that IPs are typically sloppy, i.e., insensitive to coordinated changes in some parameter combinations. Because the inverse problem in such models is ill-conditioned, parameters are unidentifiable. This presents challenges for traditional statistical methods, as we demonstrate and interpret within both Bayesian and frequentist frameworks. We use information geometry to illuminate the underlying cause of this phenomenon and show that IPs have global properties similar to those of sloppy models from fields such as systems biology, power systems, and critical phenomena. IPs correspond to bounded manifolds with a hierarchy of widths, leading to low effective dimensionality in the model. We show how information geometry can motivate new, natural parameterizations that improve the stability and interpretation of uncertainty quantification analysis and further suggest simplified, less-sloppy models.
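    The Bayesian route above can be sketched with a random-walk Metropolis sampler over Lennard-Jones parameters. This is an illustrative toy, not the paper's pipeline: the "training" energies are synthetic, and the noise level, step size, and chain length are arbitrary choices. In a sloppy model, the resulting chain would wander far along insensitive parameter directions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def lj(r, eps, sigma):
        """Lennard-Jones pair energy."""
        sr6 = (sigma / r) ** 6
        return 4.0 * eps * (sr6**2 - sr6)

    # Synthetic "training" energies at a few separations (illustration only).
    r = np.linspace(0.9, 2.0, 12)
    data = lj(r, eps=1.0, sigma=1.0) + rng.normal(scale=0.01, size=r.size)

    def log_post(theta, noise=0.01):
        """Gaussian log-likelihood with flat positivity priors."""
        eps, sigma = theta
        if eps <= 0 or sigma <= 0:
            return -np.inf
        resid = data - lj(r, eps, sigma)
        return -0.5 * np.sum(resid**2) / noise**2

    def metropolis(theta0, n=2000, step=0.02):
        """Random-walk Metropolis over (epsilon, sigma)."""
        chain = [np.asarray(theta0, float)]
        lp = log_post(chain[0])
        for _ in range(n):
            prop = chain[-1] + rng.normal(scale=step, size=2)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                chain.append(prop)
                lp = lp_prop
            else:
                chain.append(chain[-1].copy())
        return np.array(chain)

    chain = metropolis([1.1, 0.95])
    ```

    The frequentist counterpart (profile likelihood) instead fixes one parameter on a grid and re-optimizes the rest at each grid point.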
  5. Neural network potentials are developed that accurately make and break bonds for use in molecular simulations. We report a neural network potential that can describe the potential energy surface for carbon–carbon bond dissociation with less than 1 kcal mol−1 error compared to complete active space second-order perturbation theory (CASPT2), and maintains this accuracy for both the minimum energy path and molecular dynamics calculations up to 2000 K. We utilize a transfer learning algorithm to develop neural network potentials to generate potential energy surfaces; this method aims to use the minimum amount of CASPT2 data on small systems to train neural network potentials while maintaining excellent transferability to larger systems. First, we generate homolytic carbon–carbon bond dissociation data of small alkanes with density functional theory (DFT) energies to train the potentials to accurately predict bond dissociation at the DFT level. Then, using transfer learning, we retrain the neural network potential to the CASPT2 level of accuracy. We demonstrate that the neural network potential only requires bond dissociation data of a few small alkanes to accurately predict bond dissociation energy in larger alkanes. We then perform additional training on molecular dynamics simulations to refine our neural network potentials to obtain high accuracy for general use in molecular simulation. This training algorithm is generally applicable to any type of bond or any level of theory and will be useful for the generation of new reactive neural network potentials.
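    The transfer-learning step can be sketched as follows. The essential move is to keep the representation learned on plentiful cheap data (DFT) frozen and refit only the output layer on a small amount of expensive data (CASPT2). Everything below is a hypothetical stand-in: a random frozen feature layer, a toy target, and a linear head refit by least squares rather than the paper's actual retraining procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical "DFT-pretrained" network: frozen feature layer + linear head.
    W_feat = rng.normal(size=(16, 4))  # frozen after DFT-level pretraining

    def features(x):
        """Frozen hidden representation learned on the cheap DFT data."""
        return np.tanh(W_feat @ x)

    # A small high-level dataset (stand-in for CASPT2 on small alkanes).
    X = rng.normal(size=(20, 4))
    y_caspt2 = np.array([np.sum(x**2) for x in X])  # toy target values

    # Transfer step: refit only the output layer on the small CASPT2 set.
    Phi = np.array([features(x) for x in X])
    w_head, *_ = np.linalg.lstsq(Phi, y_caspt2, rcond=None)

    def energy(x):
        """Prediction at the higher level of theory after transfer."""
        return float(features(x) @ w_head)

    preds = np.array([energy(x) for x in X])
    ```

    Because only the small head is refit, far fewer high-level reference calculations are needed than training from scratch would require.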