Title: Uncertainty quantification in molecular simulations with dropout neural network potentials
Abstract

Machine learning interatomic potentials (IPs) can provide accuracy close to that of first-principles methods, such as density functional theory (DFT), at a fraction of the computational cost. This greatly extends the scope of accurate molecular simulations, providing opportunities for quantitative design of materials and devices on scales hitherto unreachable by DFT methods. However, machine learning IPs have a basic limitation in that they lack a physical model for the phenomena being predicted and therefore have unknown accuracy when extrapolating outside their training set. In this paper, we propose a class of Dropout Uncertainty Neural Network (DUNN) potentials that provide rigorous uncertainty estimates that can be understood from both Bayesian and frequentist statistics perspectives. As an example, we develop a DUNN potential for carbon and show how it can be used to predict uncertainty for static and dynamical properties, including stress and phonon dispersion in graphene. We demonstrate two approaches to propagate uncertainty in the potential energy and atomic forces to predicted properties. In addition, we show that DUNN uncertainty estimates can be used to detect configurations outside the training set, and in some cases, can serve as a predictor for the accuracy of a calculation.
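
The dropout-based uncertainty estimates described above are closely related to Monte Carlo dropout, in which dropout layers are kept active at inference time and the spread of repeated stochastic predictions is used as the uncertainty. A minimal sketch of that idea, assuming a generic PyTorch model with dropout layers (not the published DUNN implementation):

```python
import torch

def mc_dropout_prediction(model: torch.nn.Module, x: torch.Tensor, n_samples: int = 50):
    """Monte Carlo dropout: mean and spread over stochastic forward passes.

    `model` is a placeholder for any network containing dropout layers; the
    sample standard deviation serves as the uncertainty estimate."""
    model.train()  # keep dropout active at inference time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)
```

The individual stochastic predictions can also be propagated one by one to derived quantities (e.g., stress or phonon frequencies), which is one way to carry the potential's uncertainty through to predicted properties.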

 
Award ID(s):
1834251 1931304
NSF-PAR ID:
10183948
Author(s) / Creator(s):
;
Publisher / Repository:
Nature Publishing Group
Date Published:
Journal Name:
npj Computational Materials
Volume:
6
Issue:
1
ISSN:
2057-3960
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Abstract

    Machine-learning potentials are accelerating the development of energy materials, especially in identifying phase diagrams and other thermodynamic properties. In this work, we present a neural network potential based on atom-centered symmetry function descriptors to model the energetics of lithium intercalation into graphite. The potential was trained on a dataset of over 9000 diverse lithium–graphite configurations that varied in applied stress and strain, lithium concentration, lithium–carbon and lithium–lithium bond distances, and stacking order to ensure wide sampling of the potential atomic configurations during intercalation. We calculated the energies of these structures using density functional theory (DFT) with the Bayesian error estimation functional with van der Waals correlation (BEEF-vdW) exchange-correlation functional, which can accurately describe the van der Waals interactions that are crucial to determining the thermodynamics of this phase space. Bayesian optimization, as implemented in Dragonfly, was used to select an optimal set of symmetry function parameters, ultimately resulting in a potential with a prediction error of 8.24 meV atom−1 on unseen test data. The potential can predict energies, structural properties, and elastic constants at an accuracy comparable to other DFT exchange-correlation functionals at a fraction of the computational cost. The accuracy of the potential is also comparable to that of similar machine-learned potentials describing other systems. We calculate the open circuit voltage with the trained potential and find good agreement with experiment, especially in the regime x ≥ 0.3, for x in LixC6. This study further illustrates the power of machine learning potentials, which promise to revolutionize the design and optimization of battery materials.
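
    The atom-centered symmetry functions mentioned above follow the Behler-Parrinello form. As an illustration only (not the authors' code), a radial (G2) descriptor for one atom can be sketched as follows, with eta, r_s, and r_cut standing in for the hyperparameters that a Bayesian optimizer such as Dragonfly would tune:

```python
import numpy as np

def g2_radial(r_ij: np.ndarray, eta: float, r_s: float, r_cut: float) -> float:
    """Radial (G2) atom-centered symmetry function for a single atom.

    `r_ij` holds the distances from the central atom to its neighbors; `eta`,
    `r_s`, and `r_cut` are illustrative hyperparameter names."""
    fc = 0.5 * (np.cos(np.pi * r_ij / r_cut) + 1.0) * (r_ij < r_cut)  # cosine cutoff
    return float(np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * fc))
```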

     
  2. Neural network potentials (NNPs) trained against density functional theory (DFT) are capable of reproducing the potential energy surface at a fraction of the computational cost. However, most NNP implementations focus on energy and forces. In this work, we modified the NNP model introduced by Behler and Parrinello to predict the Fermi energy, band edges, and partial density of states of Cu2O. Our NNP can reproduce the DFT potential energy surface and properties at a fraction of the computational cost. We used our NNP to perform molecular dynamics (MD) simulations and validated the predicted properties against DFT calculations. Our model achieved a root mean squared error of 16 meV for the energy prediction. Furthermore, we show that the standard deviation of the energies predicted by the ensemble of training snapshots can be used to estimate the uncertainty in the predictions. This allows us to switch from the NNP to DFT on-the-fly during the MD simulation to evaluate the forces when the uncertainty is high.
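
    The on-the-fly NNP-to-DFT switch described above can be sketched as below; the ensemble interface (`predict_energy`, `predict_forces`), the DFT calculator interface, and the threshold value are all assumptions for illustration, not taken from the paper:

```python
import numpy as np

UNCERTAINTY_THRESHOLD = 0.05  # eV; illustrative value only

def evaluate_with_fallback(structure, nnp_ensemble, dft_calculator):
    """Use the NNP ensemble unless its spread signals high uncertainty."""
    energies = np.array([model.predict_energy(structure) for model in nnp_ensemble])
    if energies.std() > UNCERTAINTY_THRESHOLD:
        # Large ensemble disagreement: fall back to DFT for this MD step.
        return dft_calculator.energy_and_forces(structure)
    return energies.mean(), nnp_ensemble[0].predict_forces(structure)
```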
  3. Abstract

    The successful discovery and isolation of graphene in 2004, and the subsequent synthesis of layered semiconductors and heterostructures beyond graphene, have led to the exploding field of two-dimensional (2D) materials that explores their growth, new atomic-scale physics, and potential device applications. This review aims to provide an overview of theoretical, computational, and machine learning methods and tools at multiple length and time scales, and to discuss how they can be utilized to assist and guide the design and synthesis of 2D materials beyond graphene. We focus on three methods at different length and time scales: (i) nanoscale atomistic simulations, including density functional theory (DFT) calculations and molecular dynamics simulations employing empirical and reactive interatomic potentials; (ii) mesoscale methods such as the phase-field method; and (iii) macroscale continuum approaches that couple thermal and chemical transport equations. We discuss how machine learning can be combined with computation and experiments to understand the correlations between the structures and properties of 2D materials, and to guide the discovery of new 2D materials. We also provide an outlook for the applications of computational approaches to 2D materials synthesis and growth in general.

     
  4. Abstract

    Although machine learning (ML) methods have been widely used recently, the predicted materials properties usually cannot exceed the range of the original training data. We deployed a boundless, objective-free exploration approach that combines traditional ML and density functional theory (DFT) to search for extreme material properties. This combination not only improves the efficiency of screening large sets of materials with minimal DFT inquiry, but also yields properties beyond the original training range. We use Stein novelty to recommend outliers and then verify them using DFT. Validated data are then added to the training dataset for the next iteration. We test this training–recommendation–validation loop in mechanical property space. By screening 85,707 crystal structures, we identify 21 ultrahigh-hardness structures and 11 negative-Poisson's-ratio structures. The algorithm is very promising for future materials discovery, pushing materials properties to the limit with minimal DFT calculations, performed on only ~1% of the structures in the screening pool.
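
    A hypothetical sketch of the training-recommendation-validation loop described above, with the model fitting, Stein-novelty recommendation, and DFT validation steps passed in as placeholder callables (none of these names come from the paper):

```python
from typing import Callable, List, Sequence

def objective_free_exploration(
    fit_model: Callable[[List], object],
    recommend_novel: Callable[[object, Sequence], List],  # e.g. a Stein-novelty recommender
    run_dft: Callable[[object], object],                  # DFT validation of one structure
    candidate_pool: Sequence,
    training_set: List,
    n_rounds: int = 5,
) -> List:
    """Iteratively extend the training data beyond its original range."""
    for _ in range(n_rounds):
        model = fit_model(training_set)                    # retrain the ML surrogate
        outliers = recommend_novel(model, candidate_pool)  # structures flagged as novel
        validated = [run_dft(structure) for structure in outliers]
        training_set.extend(validated)                     # fold new DFT data back in
    return training_set
```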

     
  5. Machine learning potentials (MLPs) for atomistic simulations have an enormous prospective impact on materials modeling, offering orders-of-magnitude speedups over density functional theory (DFT) calculations without appreciably sacrificing accuracy in the prediction of material properties. However, the generation of the large datasets needed for training MLPs is daunting. Herein, we show that MLP-based material property predictions converge faster with respect to the precision of Brillouin zone integrations than DFT-based property predictions. We demonstrate that this phenomenon is robust across material properties for different metallic systems. Further, we provide statistical error metrics to accurately determine a priori the precision level required of DFT training datasets for MLPs to ensure accelerated convergence of material property predictions, thus significantly reducing the computational expense of MLP development.
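
    One way to read the precision argument above is as a convergence test with respect to k-point density. A small illustrative helper (names and data layout assumed, not from the paper) might look like:

```python
def converged_kpoint_density(property_vs_kgrid: dict, tolerance: float) -> float:
    """Return the smallest k-point density whose property value differs from
    the densest-grid reference by less than `tolerance`."""
    densities = sorted(property_vs_kgrid)
    reference = property_vs_kgrid[densities[-1]]  # densest grid as reference
    for density in densities:
        if abs(property_vs_kgrid[density] - reference) < tolerance:
            return density
    return densities[-1]
```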