Title: Efficiently Trained Deep Learning Potential for Graphane
We have developed an accurate and efficient deep-learning potential (DP) for graphane, which is a fully hydrogenated version of graphene, using a very small training set consisting of 1000 snapshots from a 0.5 ps density functional theory (DFT) molecular dynamics simulation at 1000 K. We have assessed the ability of the DP to extrapolate to system sizes, temperatures, and lattice strains not included in the training set. The DP performs surprisingly well, outperforming an empirical many-body potential when compared with DFT data for the phonon density of states, thermodynamic properties, velocity autocorrelation function, and stress–strain curve up to the yield point. This indicates that our DP can reliably extrapolate beyond the limits of the training data. We have computed the thermal fluctuations as a function of system size for graphane and found that graphane exhibits larger thermal fluctuations than graphene while having about the same out-of-plane stiffness.
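The velocity autocorrelation function (VACF) and the phonon density of states used in the comparison above are standard post-processing quantities for an MD trajectory. The sketch below is a minimal illustration of how they can be estimated, assuming a NumPy array of velocities with shape (n_frames, n_atoms, 3) and a fixed timestep; the array names and the synthetic data are assumptions for illustration, not the authors' code.

```python
# Minimal sketch: VACF and a phonon DOS estimate from MD velocities.
# Assumes `velocities` has shape (n_frames, n_atoms, 3) and timestep dt_fs in fs.
import numpy as np

def vacf(velocities):
    """Normalized velocity autocorrelation function C(t) = <v(0).v(t)> / <v(0).v(0)>."""
    n_frames = velocities.shape[0]
    c = np.zeros(n_frames)
    for lag in range(n_frames):
        # dot products of velocities separated by `lag` frames, averaged over atoms and time origins
        prod = np.sum(velocities[: n_frames - lag] * velocities[lag:], axis=(1, 2))
        c[lag] = prod.mean()
    return c / c[0]

def phonon_dos(velocities, dt_fs):
    """Phonon DOS as the magnitude of the Fourier transform of the (windowed) VACF."""
    c = vacf(velocities)
    dos = np.abs(np.fft.rfft(c * np.hanning(len(c))))   # window to reduce spectral leakage
    freq_thz = np.fft.rfftfreq(len(c), d=dt_fs * 1e-3)  # fs -> ps, so frequencies are in THz
    return freq_thz, dos / (dos.sum() * (freq_thz[1] - freq_thz[0]))  # normalize to unit area

# Example with synthetic velocities standing in for a real trajectory:
rng = np.random.default_rng(0)
velocities = rng.normal(size=(2000, 64, 3))
freq, dos = phonon_dos(velocities, dt_fs=0.5)
```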
Award ID(s):
1703266
PAR ID:
10275421
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
The Journal of Physical Chemistry C
ISSN:
1932-7447
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract: Machine learning interatomic potentials (IPs) can provide accuracy close to that of first-principles methods, such as density functional theory (DFT), at a fraction of the computational cost. This greatly extends the scope of accurate molecular simulations, providing opportunities for quantitative design of materials and devices on scales hitherto unreachable by DFT methods. However, machine learning IPs have a basic limitation in that they lack a physical model for the phenomena being predicted and therefore have unknown accuracy when extrapolating outside their training set. In this paper, we propose a class of Dropout Uncertainty Neural Network (DUNN) potentials that provide rigorous uncertainty estimates that can be understood from both Bayesian and frequentist statistics perspectives. As an example, we develop a DUNN potential for carbon and show how it can be used to predict uncertainty for static and dynamical properties, including stress and phonon dispersion in graphene. We demonstrate two approaches to propagate uncertainty in the potential energy and atomic forces to predicted properties. In addition, we show that DUNN uncertainty estimates can be used to detect configurations outside the training set and, in some cases, can serve as a predictor for the accuracy of a calculation.
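The dropout-based uncertainty idea can be illustrated with a generic Monte Carlo dropout sketch (not the DUNN implementation): a network containing dropout layers is evaluated repeatedly with dropout left active, and the spread of the predictions serves as an uncertainty estimate. The network size, dropout rate, and inputs below are assumptions chosen for illustration.

```python
# Minimal sketch: Monte Carlo dropout as an uncertainty estimate for a regressor.
import torch
import torch.nn as nn

class DropoutRegressor(nn.Module):
    def __init__(self, n_in=30, n_hidden=64, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.Tanh(), nn.Dropout(p_drop),
            nn.Linear(n_hidden, n_hidden), nn.Tanh(), nn.Dropout(p_drop),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def predict_with_uncertainty(model, x, n_samples=50):
    """Mean and standard deviation over stochastic forward passes."""
    model.train()  # keep dropout active at inference time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

# Example: descriptors for a batch of atomic environments (random stand-ins).
model = DropoutRegressor()
x = torch.randn(8, 30)
energy_mean, energy_std = predict_with_uncertainty(model, x)
```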
  2. Abstract: Smart materials are versatile material systems that exhibit a measurable response to external stimuli. Recently, smart material systems have been developed that incorporate graphene in order to capitalize on its various advantageous properties, such as mechanical strength, electrical conductivity, and thermal conductivity, as well as to achieve unique stimuli-dependent responses. Here, a graphene fiber-based smart material that exhibits reversible electrical conductivity switching at a relatively low temperature (60 °C) is reported. Using molecular dynamics (MD) simulation and a density functional theory-based non-equilibrium Green's function (DFT-NEGF) approach, it is revealed that this thermo-responsive behavior is due to a change in the configuration of amphiphilic triblock dispersant molecules in the graphene fiber during heating or cooling. These conformational changes alter the total number of graphene-graphene contacts within the composite material system, and thus the electrical conductivity as well. Additionally, this graphene fiber fabrication approach uses a scalable, facile, water-based method that makes it easy to modify material composition ratios. In all, this work represents an important step toward complete functional tuning of graphene-based smart materials at the nanoscale while increasing commercialization viability.
  3. Understanding the nucleation and growth mechanisms of high-entropy alloy (HEA) nanoparticles is crucial for developing functional nanocrystals with tailored properties. This study investigates the thermal decomposition of mixed metal salt precursors (Fe, Ni, Pt, Ir, Ru) on reduced graphene oxide (rGO) using in situ transmission electron microscopy (TEM) when heated to 1000 °C at both slow (20 °C min⁻¹) and fast (10³ °C s⁻¹) heating/cooling rates. Slow heating to 1000 °C revealed the following: (1) Nucleation of the nanoparticles occurred through multistage decomposition at lower temperatures (250–300 °C) than for single metal salt precursors (300–450 °C). (2) Pt-dominant nanocrystals autocatalytically reduced the other elements, leading to the formation of multimetallic FeNiPtIrRu nanoparticles. (3) At 1000 °C, the nanoparticles were single-phase, with noble metals enriched relative to transition metals. (4) Slow cooling induced structural heterogeneity and phase segregation due to element diffusion and thermodynamic miscibility. (5) Adding polyvinylpyrrolidone (PVP) suppressed segregation, promoting HEA nanoparticle formation even during slow cooling by limiting atomic diffusion. Under fast heating/cooling, the nanoparticles formed as a solid-solution fcc HEA, indicating kinetic control and limited atomic diffusion. Density functional theory (DFT) calculations show that the simultaneous presence of the metal elements on rGO, as expected from the fast heating process, favors the formation of an fcc HEA structure, with strong interactions between the HEA nanoparticles and rGO enhancing stability. This study provides insights into how heating rates and additives such as PVP can control phase composition, chemical homogeneity, and stability, enabling the rational design of complex nanomaterials for catalytic, energy, and functional applications.
  4. Machine learning potentials (MLPs) are poised to combine the accuracy of ab initio predictions with the computational efficiency of classical molecular dynamics (MD) simulation. While great progress has been made over the last two decades in developing MLPs, there is still much to be done to evaluate their model transferability and facilitate their development. In this work, we construct two deep potential (DP) models for liquid water near graphene surfaces, Model S and Model F, with the latter having more training data. A concurrent learning algorithm (DP-GEN) is adopted to explore the configurational space beyond the scope of conventional ab initio MD simulation. By examining the performance of Model S, we find that an accurate prediction of atomic forces does not imply an accurate prediction of the system energy, and that the relative deviation of the atomic forces alone is insufficient to assess the accuracy of the DP models. Based on the performance of Model F, we propose that the relative magnitude of the model deviation and the corresponding root-mean-square error on the original test dataset, including energy and atomic forces, can serve as an indicator of the accuracy of the model prediction for a given structure, which is particularly useful for large systems where density functional theory calculations are infeasible. In addition to the prediction accuracy described above, we also briefly discuss simulation stability and its relationship to prediction accuracy. Both are important aspects in assessing the transferability of an MLP model.
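A common way to quantify such model deviation in DP-GEN-style concurrent learning is the maximum, over atoms, of the standard deviation of the force predicted by an ensemble of models for a given configuration. The sketch below illustrates that quantity; the `ensemble_forces` array is a hypothetical input, and the snippet is not the authors' workflow.

```python
# Minimal sketch: force "model deviation" across an ensemble of DP-like models,
# i.e. max over atoms of the std of the predicted force vector.
import numpy as np

def max_force_deviation(ensemble_forces):
    """ensemble_forces: array of shape (n_models, n_atoms, 3) for one configuration."""
    mean_f = ensemble_forces.mean(axis=0)  # ensemble-mean force, shape (n_atoms, 3)
    # per-atom std of the force vector: sqrt of mean squared deviation from the ensemble mean
    dev = np.sqrt(((ensemble_forces - mean_f) ** 2).sum(axis=2).mean(axis=0))
    return dev.max()

# Example with a 4-model ensemble on a 100-atom configuration (random stand-in data):
rng = np.random.default_rng(1)
forces = rng.normal(size=(4, 100, 3))
print(f"max force model deviation: {max_force_deviation(forces):.3f}")
```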
  5. Precup, Doina; Chandar, Sarath; Pascanu, Razvan (Ed.)
    In this paper, we show that the process of continually learning new tasks and memorizing previous tasks introduces unknown privacy risks and challenges in bounding the privacy loss. Based on this, we introduce a formal definition of Lifelong DP, in which the participation of any data tuple in the training set of any task is protected under a consistently bounded DP guarantee, given a growing stream of tasks. Consistently bounded DP means having only one fixed value of the DP privacy budget, regardless of the number of tasks. To preserve Lifelong DP, we propose a scalable and heterogeneous algorithm, called L2DP-ML, with streaming batch training, to efficiently train and continually release new versions of an L2M model, given heterogeneity in data sizes and in the training order of tasks, without affecting the DP protection of the private training set. An end-to-end theoretical analysis and thorough evaluations show that our mechanism is significantly better than baseline approaches at preserving Lifelong DP. The implementation of L2DP-ML is available at: https://github.com/haiphanNJIT/PrivateDeepLearning.
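The motivation for a consistently bounded budget can be seen from basic sequential composition: if each task is trained with an (ε, δ)-DP mechanism touching overlapping data, the naively composed guarantee degrades roughly linearly with the number of tasks, whereas a consistently bounded scheme keeps a single fixed budget. The toy numbers below are illustrative only and are not taken from the paper.

```python
# Minimal sketch: naive sequential composition vs. a fixed (consistently bounded) budget.
def naive_composition(eps_per_task, delta_per_task, n_tasks):
    """Basic (non-adaptive) sequential composition bound: budgets add up across tasks."""
    return n_tasks * eps_per_task, n_tasks * delta_per_task

fixed_budget_eps = 1.0  # a single budget held constant over the whole task stream
for k in (1, 10, 100):
    eps, delta = naive_composition(1.0, 1e-5, k)
    print(f"{k:>3} tasks: naive composition eps={eps:g}, "
          f"consistently bounded eps={fixed_budget_eps:g}")
```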