

Title: Efficient force field and energy emulation through partition of permutationally equivalent atoms
Gaussian process (GP) emulators have been used as surrogate models for predicting force fields and molecular potentials, to overcome the computational bottleneck of ab initio molecular dynamics simulation. Integrating both atomic forces and energies in predictions was found to be more accurate than using energies alone, yet it requires O((NM)^3) computational operations for computing the likelihood function and making predictions, due to the inversion of a large covariance matrix, where N is the number of atoms and M is the number of simulated configurations in the training sample. This high computational cost limits its applications to the simulation of small molecules. The computational challenge of using both gradient information and function values in GPs was recently noticed in the machine learning community, where conventional approximation methods may not work well. Here, we introduce a new approach, the atomized force field model, that integrates both forces and energies in the emulator with many fewer computational operations. The drastic reduction in computation is achieved by utilizing the naturally sparse covariance structure that satisfies the constraints of energy conservation and permutation symmetry of atoms. The efficient machine learning algorithm extends the limits of its applications to larger molecules under the same computational budget, with nearly no loss of predictive accuracy. Furthermore, our approach contains an uncertainty assessment of predictions of atomic forces and energies, useful for developing a sequential design over the chemical input space.
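To make the cost concrete, the following Python sketch (a minimal illustration, not the atomized force field model itself; the RBF kernel, hyperparameters, and toy data are assumptions) builds the joint covariance over the energies and gradients of M configurations and evaluates the GP log-likelihood, whose Cholesky factorization is the O((NM)^3) bottleneck described above.

# Sketch (assumptions): a plain Gaussian process over M molecular configurations,
# each described by a D = 3N dimensional coordinate vector, trained jointly on
# energies and gradients (negative forces) with an RBF kernel.  This is NOT the
# paper's atomized force field model; it only illustrates why the dense joint
# covariance leads to O((NM)^3) likelihood evaluations.
import numpy as np

def rbf_joint_covariance(X, ell=1.0, sigma2=1.0):
    """Joint covariance over [energies, gradients] for configurations X of shape (M, D)."""
    M, D = X.shape
    diff = X[:, None, :] - X[None, :, :]            # (M, M, D)
    sqdist = np.sum(diff**2, axis=-1)               # (M, M)
    K = sigma2 * np.exp(-0.5 * sqdist / ell**2)     # energy-energy block (M, M)

    # energy-gradient block: d k(x, x') / d x'_j = K * (x_j - x'_j) / ell^2
    K_eg = (K[:, :, None] * diff / ell**2).reshape(M, M * D)

    # gradient-gradient block:
    # d^2 k / (d x_i d x'_j) = K * (delta_ij / ell^2 - diff_i * diff_j / ell^4)
    eye = np.eye(D)
    K_gg = (K[:, :, None, None]
            * (eye[None, None] / ell**2
               - diff[:, :, :, None] * diff[:, :, None, :] / ell**4))
    K_gg = K_gg.transpose(0, 2, 1, 3).reshape(M * D, M * D)

    top = np.hstack([K, K_eg])
    bottom = np.hstack([K_eg.T, K_gg])
    return np.vstack([top, bottom])                 # shape (M(D+1), M(D+1))

def log_marginal_likelihood(X, energies, gradients, noise=1e-6):
    """Dense GP log-likelihood; the Cholesky step costs O((M(3N+1))^3) ~ O((NM)^3)."""
    M, D = X.shape
    y = np.concatenate([energies, gradients.reshape(-1)])
    K = rbf_joint_covariance(X) + noise * np.eye(M * (D + 1))
    L = np.linalg.cholesky(K)                       # the cubic-cost bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(y) * np.log(2 * np.pi))

# Toy usage: 5 configurations of a 4-atom molecule (D = 12), random placeholder data.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 12))
E = rng.normal(size=5)
G = rng.normal(size=(5, 12))
print(log_marginal_likelihood(X, E, G))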
Award ID(s):
1940118 2053423
PAR ID:
10337855
Author(s) / Creator(s):
; ; ; ;
Date Published:
Journal Name:
The Journal of Chemical Physics
Volume:
156
Issue:
18
ISSN:
0021-9606
Page Range / eLocation ID:
184304
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Monte Carlo (MC) methods are important computational tools for molecular structure optimization and prediction. When solvent effects are considered explicitly, MC methods become very expensive due to the large number of degrees of freedom associated with the water molecules and mobile ions. Alternatively, implicit-solvent MC can greatly reduce the computational cost by applying a mean-field approximation to solvent effects while maintaining the atomic detail of the target molecule. The two most popular implicit-solvent models are the Poisson-Boltzmann (PB) model and the Generalized Born (GB) model; the GB model is an approximation to the PB model but is much faster in simulation time. In this work, we develop a machine learning-based implicit-solvent Monte Carlo (MLIMC) method that combines the accuracy and efficiency advantages of both implicit-solvent models. Specifically, the MLIMC method uses a fast and accurate PB-based machine learning (PBML) scheme to compute the electrostatic solvation free energy at each step. We validate our MLIMC method using a benzene-water system and a protein-water system, and show that the proposed MLIMC method has great advantages in speed and accuracy for molecular structure optimization and prediction.
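A minimal sketch of an implicit-solvent Metropolis Monte Carlo loop in this spirit follows; the intramolecular energy, the surrogate solvation term, and the Gaussian trial moves are hypothetical placeholders rather than the authors' PBML scheme.

# Sketch (assumptions): a bare-bones Metropolis Monte Carlo loop in which the
# per-step energy is the sum of an intramolecular term and an implicit-solvent
# solvation free energy supplied by a fast surrogate.  `intra_energy` and
# `surrogate_solvation_energy` are hypothetical placeholders standing in for a
# real force field and a trained PB-based machine learning model.
import numpy as np

def intra_energy(coords):
    # placeholder intramolecular energy (harmonic restraint to the origin)
    return 0.5 * np.sum(coords**2)

def surrogate_solvation_energy(coords):
    # placeholder for an ML surrogate predicting the electrostatic
    # solvation free energy of this conformation
    return -0.1 * np.sum(np.linalg.norm(coords, axis=1))

def metropolis_mc(coords, n_steps=1000, step=0.05, kT=0.593):
    """Implicit-solvent MC: solvent enters only through the surrogate energy (kT in kcal/mol at ~298 K)."""
    rng = np.random.default_rng(0)
    energy = intra_energy(coords) + surrogate_solvation_energy(coords)
    for _ in range(n_steps):
        trial = coords + rng.normal(scale=step, size=coords.shape)
        e_trial = intra_energy(trial) + surrogate_solvation_energy(trial)
        # Metropolis acceptance on the total (intramolecular + solvation) energy
        if e_trial < energy or rng.random() < np.exp(-(e_trial - energy) / kT):
            coords, energy = trial, e_trial
    return coords, energy

coords, energy = metropolis_mc(np.random.default_rng(1).normal(size=(12, 3)))
print(energy)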
  2. Gradient-domain machine learning (GDML) force fields have shown excellent accuracy, data efficiency, and applicability for molecules with hundreds of atoms, but the employed global descriptor limits transferability to ensembles of molecules. Many-body expansions (MBEs) should provide a rigorous procedure for size-transferable GDML by training models on fundamental n-body interactions. We developed many-body GDML (mbGDML) force fields for water, acetonitrile, and methanol by training 1-, 2-, and 3-body models on only 1000 MP2/def2-TZVP calculations each. Our mbGDML force field includes intramolecular flexibility and intermolecular interactions, provided that the reference data adequately describe these effects. Energy and force predictions of clusters containing up to 20 molecules are within 0.38 kcal/mol per monomer and 0.06 kcal/(mol Å) per atom of reference supersystem calculations. This deviation partially arises from the restriction of the mbGDML model to 3-body interactions. GAP and SchNet in this MBE framework achieved similar accuracies but occasionally had abnormally high errors up to 17 kcal/mol. NequIP trained on total energies and forces of trimers experienced much larger energy errors (at least 15 kcal/mol) as the number of monomers increased, demonstrating the effectiveness of size transferability with MBEs. Given these approximations, our automated mbGDML training schemes also resulted in fair agreement with reference radial distribution functions (RDFs) of bulk solvents. These results highlight mbGDML as valuable for modeling explicitly solvated systems with quantum-mechanical accuracy.
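As a rough illustration of how a truncated many-body expansion assembles cluster energies from separately trained n-body models, the sketch below sums 1-body energies and 2- and 3-body interaction corrections; the model callables and toy monomers are assumptions, not the published mbGDML models.

# Sketch (assumptions): a many-body expansion truncated at 3-body terms.
# `model_1b`, `model_2b`, and `model_3b` are hypothetical callables returning
# total energies for the supplied coordinates; they stand in for trained models.
from itertools import combinations
import numpy as np

def mbe_energy(monomers, model_1b, model_2b, model_3b):
    """Truncated MBE: E = sum_i E_i + sum_{i<j} dE_ij + sum_{i<j<k} dE_ijk."""
    e1 = {i: model_1b(m) for i, m in enumerate(monomers)}
    energy = sum(e1.values())

    e2 = {}
    for i, j in combinations(range(len(monomers)), 2):
        dimer = np.vstack([monomers[i], monomers[j]])
        # 2-body interaction: dimer energy minus its monomer energies
        e2[(i, j)] = model_2b(dimer) - e1[i] - e1[j]
        energy += e2[(i, j)]

    for i, j, k in combinations(range(len(monomers)), 3):
        trimer = np.vstack([monomers[i], monomers[j], monomers[k]])
        # 3-body interaction: trimer energy minus all lower-order contributions
        energy += (model_3b(trimer)
                   - e1[i] - e1[j] - e1[k]
                   - e2[(i, j)] - e2[(i, k)] - e2[(j, k)])
    return energy

# Toy usage with trivial stand-in models (sum of squared coordinates).
monomers = [np.random.default_rng(i).normal(size=(3, 3)) for i in range(4)]
toy_model = lambda coords: float(np.sum(coords**2))
print(mbe_energy(monomers, toy_model, toy_model, toy_model))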
  3. Machine learning potentials (MLPs) are poised to combine the accuracy of ab initio predictions with the computational efficiency of classical molecular dynamics (MD) simulation. While great progress has been made over the last two decades in developing MLPs, there is still much to be done to evaluate their model transferability and facilitate their development. In this work, we construct two deep potential (DP) models for liquid water near graphene surfaces, Model S and Model F, with the latter having more training data. A concurrent learning algorithm (DP-GEN) is adopted to explore the configurational space beyond the scope of conventional ab initio MD simulation. By examining the performance of Model S, we find that an accurate prediction of atomic force does not imply an accurate prediction of system energy. The deviation from the relative atomic force alone is insufficient to assess the accuracy of the DP models. Based on the performance of Model F, we propose that the relative magnitude of the model deviation and the corresponding root-mean-square error of the original test dataset, including energy and atomic force, can serve as an indicator for evaluating the accuracy of the model prediction for a given structure, which is particularly applicable for large systems where density functional theory calculations are infeasible. In addition to the prediction accuracy of the model described above, we also briefly discuss simulation stability and its relationship to the former. Both are important aspects in assessing the transferability of the MLP model.
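A minimal sketch of the ensemble-based diagnostic described above, assuming a hypothetical list of trained models that each return per-atom forces: the maximum per-atom force deviation across the ensemble is compared with the test-set force RMSE to flag structures whose predictions may be unreliable.

# Sketch (assumptions): an ensemble "model deviation" diagnostic in the spirit of
# DP-GEN, compared against the test-set force RMSE.  `models` is a hypothetical
# list of callables that each return per-atom forces of shape (N, 3).
import numpy as np

def max_force_deviation(structure, models):
    """Maximum per-atom standard deviation of forces across the model ensemble."""
    forces = np.stack([m(structure) for m in models])                 # (n_models, N, 3)
    per_atom_std = np.sqrt(np.mean(np.var(forces, axis=0), axis=-1))  # (N,)
    return per_atom_std.max()

def reliability_indicator(structure, models, test_force_rmse):
    """Ratio of model deviation to the original test-set force RMSE.

    Ratios well below 1 suggest the structure lies inside the training domain;
    large ratios flag structures where new DFT labels (or retraining) are needed,
    which is useful when DFT on the full system is infeasible.
    """
    return max_force_deviation(structure, models) / test_force_rmse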

     
  4. Despite their importance in a wide variety of applications, the estimation of ionization cross sections for large molecules continues to present challenges for both experiment and theory. Machine learning (ML) algorithms have been shown to be an effective mechanism for estimating cross section data for atomic targets and a select number of molecular targets. We present an efficient ML model for predicting ionization cross sections for a broad array of molecular targets. Our model is a 3-layer neural network that is trained using published experimental datasets. There is minimal input to the network, making it widely applicable. We show that with training on as few as 10 molecular datasets, the network is able to predict the experimental cross sections of additional molecules with an accuracy similar to experimental uncertainties in existing data. As the number of training molecular datasets increased, the network's predictions became more accurate and, in the worst case, were within 30% of accepted experimental values. In many cases, predictions were within 10% of accepted values. Using a network trained on datasets for 25 different molecules, we present predictions for an additional 27 molecules, including alkanes, alkenes, molecules with ring structures, and DNA nucleotide bases.
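For illustration only, a minimal 3-layer fully connected network of the kind described could look like the sketch below; the layer widths, activation functions, and input features are guesses rather than the authors' published architecture, and the weights are untrained.

# Sketch (assumptions): a small 3-layer neural network mapping a few molecular
# features (e.g. incident electron energy plus simple descriptors) to an
# ionization cross section.  All sizes and features are illustrative guesses.
import numpy as np

class ThreeLayerNet:
    def __init__(self, n_in, n_hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=1.0 / np.sqrt(n_hidden), size=(n_hidden, n_hidden))
        self.b2 = np.zeros(n_hidden)
        self.W3 = rng.normal(scale=1.0 / np.sqrt(n_hidden), size=(n_hidden, 1))
        self.b3 = np.zeros(1)

    def forward(self, x):
        """Two hidden tanh layers followed by a linear output (the predicted cross section)."""
        h1 = np.tanh(x @ self.W1 + self.b1)
        h2 = np.tanh(h1 @ self.W2 + self.b2)
        return h2 @ self.W3 + self.b3

# Untrained toy usage: 4 arbitrary input features for one example.
net = ThreeLayerNet(n_in=4)
print(net.forward(np.array([[70.0, 10.0, 6.0, 1.0]])))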

     
  5. Prediction of a molecule's 3D conformer ensemble from the molecular graph plays a key role in cheminformatics and drug discovery. Existing generative models have several drawbacks, including failure to model important molecular geometry elements (e.g., torsion angles), separate optimization stages prone to error accumulation, and the need for structure fine-tuning based on approximate classical force fields or computationally expensive methods such as metadynamics with approximate quantum mechanics calculations at each geometry. We propose GeoMol, an end-to-end, non-autoregressive, and SE(3)-invariant machine learning approach to generate distributions of low-energy molecular 3D conformers. Leveraging the power of message passing neural networks (MPNNs) to capture local and global graph information, we predict local atomic 3D structures and torsion angles, avoiding unnecessary over-parameterization of the geometric degrees of freedom (e.g., one angle per non-terminal bond). Such local predictions suffice both for the training loss computation and for the full deterministic conformer assembly at test time. We devise a non-adversarial, optimal-transport-based loss function to promote diverse conformer generation. GeoMol predominantly outperforms popular open-source, commercial, and state-of-the-art machine learning (ML) models, while achieving significant speed-ups. We expect such differentiable 3D structure generators to significantly impact molecular modeling and related applications.
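The sketch below illustrates the flavor of a non-adversarial, matching-based ensemble loss: each generated conformer is paired with a reference conformer by a minimum-cost assignment. The distance function and the use of a hard assignment are simplifying assumptions, not GeoMol's exact optimal-transport formulation.

# Sketch (assumptions): a matching-based loss between a set of generated
# conformers and a set of reference conformers.  `conformer_distance` is a
# hypothetical placeholder; a real implementation would align structures
# (e.g. with the Kabsch algorithm) before computing an RMSD.
import numpy as np
from scipy.optimize import linear_sum_assignment

def conformer_distance(a, b):
    # placeholder distance between two conformers of shape (N, 3)
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=-1)))

def matching_loss(generated, reference):
    """Minimum-cost one-to-one matching between generated and reference conformers."""
    cost = np.array([[conformer_distance(g, r) for r in reference] for g in generated])
    rows, cols = linear_sum_assignment(cost)   # optimal assignment over the cost matrix
    return cost[rows, cols].mean()

# Toy usage with random 5-atom conformers.
rng = np.random.default_rng(0)
gen = [rng.normal(size=(5, 3)) for _ in range(4)]
ref = [rng.normal(size=(5, 3)) for _ in range(4)]
print(matching_loss(gen, ref))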