Title: Compositionally restricted attention-based network for materials property predictions
Abstract: In this paper, we demonstrate an application of the Transformer self-attention mechanism in the context of materials science. Our network, the Compositionally Restricted Attention-Based network (CrabNet), explores structure-agnostic materials property prediction when only a chemical formula is provided. Our results show that CrabNet's performance matches or exceeds current best-practice methods on nearly all of 28 benchmark datasets. We also demonstrate how CrabNet's architecture lends itself to model interpretability by showing different visualization approaches that are made possible by its design. We are confident that CrabNet and its attention-based framework will be of keen interest to future materials informatics researchers.
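
As a rough sketch of the core idea (an illustrative reconstruction, not the authors' released code; the class name, layer sizes, and mean-pooling readout are assumptions), each element in a formula can be treated as a token built from a learned element embedding plus a projection of its fractional amount, with a standard Transformer encoder attending across the elements:

    import torch
    import torch.nn as nn

    class CompositionAttentionNet(nn.Module):
        """Self-attention over the elements of a chemical formula."""
        def __init__(self, n_elements=103, d_model=128, n_heads=4, n_layers=3):
            super().__init__()
            self.element_emb = nn.Embedding(n_elements + 1, d_model, padding_idx=0)
            self.fraction_proj = nn.Linear(1, d_model)
            layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)
            self.head = nn.Linear(d_model, 1)

        def forward(self, z, frac):
            # z:    (batch, tokens) atomic numbers, 0 = padding
            # frac: (batch, tokens) fractional amounts, e.g. SiO2 -> [1/3, 2/3]
            x = self.element_emb(z) + self.fraction_proj(frac.unsqueeze(-1))
            pad = z == 0                                     # ignore padded slots
            h = self.encoder(x, src_key_padding_mask=pad)
            h = h.masked_fill(pad.unsqueeze(-1), 0.0)
            pooled = h.sum(1) / (~pad).sum(1, keepdim=True)  # mean over real tokens
            return self.head(pooled).squeeze(-1)             # scalar property

    # SiO2 and Al2O3 as two "element sentences"
    z = torch.tensor([[14, 8, 0], [13, 8, 0]])
    frac = torch.tensor([[1/3, 2/3, 0.0], [2/5, 3/5, 0.0]])
    print(CompositionAttentionNet()(z, frac).shape)  # torch.Size([2])

Such a model would be trained on (composition, property) pairs, and the attention maps over element tokens are the kind of artifact that enables the interpretability visualizations described in the abstract.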
Award ID(s): 1651668
PAR ID: 10248518
Author(s) / Creator(s): ; ; ;
Date Published:
Journal Name: npj Computational Materials
Volume: 7
Issue: 1
ISSN: 2057-3960
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Point set is a major type of 3D structure representation, characterized by its data availability and compactness. Most previous deep learning-based point set models pay equal attention to all point set regions and channels, and thus have limited ability to focus on the small regions and specific channels that are important for characterizing the object of interest. In this paper, we introduce a novel model named Attention-based Point Network (AttPNet), which uses attention mechanisms for both global feature masking and channel weighting to focus on characteristic regions and channels. There are two branches in our model: the first calculates an attention mask for every point, while the second uses convolution layers to extract global features from point sets, with a channel attention block used to focus on important channels. Evaluations on the ModelNet40 benchmark dataset show that our model outperforms the previous best model on classification tasks by 0.7% without voting. In addition, experiments on augmented data demonstrate that our model is robust to rotational perturbations and missing points. We also design an Electron Cryo-Tomography (ECT) point cloud dataset and further demonstrate our model's ability to handle fine-grained structures on it.
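    A minimal sketch of the channel-weighting idea above, in the style of squeeze-and-excitation (an illustrative stand-in rather than the AttPNet block itself; the mean-pooling choice and MLP sizes are assumptions):

        import torch
        import torch.nn as nn

        class ChannelAttention(nn.Module):
            """Pool each channel over all points, score it with a small MLP,
            and rescale the features so informative channels dominate."""
            def __init__(self, channels, reduction=4):
                super().__init__()
                self.mlp = nn.Sequential(
                    nn.Linear(channels, channels // reduction),
                    nn.ReLU(),
                    nn.Linear(channels // reduction, channels),
                    nn.Sigmoid(),
                )

            def forward(self, feats):
                # feats: (batch, points, channels) per-point features
                squeeze = feats.mean(dim=1)          # (batch, channels) summary
                weights = self.mlp(squeeze)          # per-channel gates in (0, 1)
                return feats * weights.unsqueeze(1)  # reweight every point

        feats = torch.randn(8, 1024, 64)  # 8 clouds, 1024 points, 64 channels
        print(ChannelAttention(64)(feats).shape)  # torch.Size([8, 1024, 64])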
  2. Due to its high theoretical energy density and the relative abundance of its active materials, the magnesium–sulfur battery has attracted research attention in recent years. A closely related system, the lithium–sulfur battery, can suffer from serious self-discharge behavior, yet until now the self-discharge of Mg–S batteries has rarely been addressed. Herein, we demonstrate for a wide variety of Mg–S electrolytes and conditions that Mg–S batteries also suffer from serious self-discharge. For a common Mg–S electrolyte, we identify a multi-step self-discharge pathway: covalent S8 diffuses to the Mg metal anode and is converted to ionic Mg polysulfide in a non-faradaic reaction, and the Mg polysulfides in solution are metastable, continuing to react and precipitate as solid magnesium polysulfide species during both storage and active use. Mg–S electrolytes from the early, middle, and state-of-the-art stages of the Mg–S literature are all found to enable this self-discharge, which decreases first-cycle discharge capacity by at least 32%, and in some cases up to 96%, indicating that this is a phenomenon of Mg–S chemistry that deserves focused attention.
  3. Developing refractory high-entropy superalloys (RSAs) with performance advantages over nickel-based alloys is a critical frontier in materials science. Body-centered cubic (bcc)-based RSAs have attracted significant attention, with ruthenium (Ru) playing a key role in forming two-phase regions of A2 (disordered bcc) + B2 (ordered bcc), which could lead to superalloy-like microstructures. This study introduces the Kolmogorov-Arnold Network (KAN) model for predicting the mechanical and thermodynamic properties of Ru, comparing its performance against other commonly used machine-learned models. Using density functional theory calculations as training data, the KAN model demonstrates superior accuracy and computational efficiency relative to conventional methods while reducing descriptor complexity. The model accurately predicts a range of properties, including elastic constants, thermal expansion coefficients, and various moduli, with discrepancies within 6% of experimental reference data. Molecular dynamics simulations further validate the model's efficacy, accurately capturing Ru's phase transition from the hexagonal close-packed (hcp) to the face-centered cubic structure as well as its melting point. This work presents the first application of KAN in materials science, demonstrating how its balanced performance and efficiency provide a new pathway for designing advanced materials, with unique advantages over conventional machine learning approaches in predicting material properties.
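    A minimal sketch of a Kolmogorov-Arnold layer, to make the contrast with ordinary dense layers concrete (illustrative only, not the study's model; the Gaussian basis and all dimensions are assumptions — published KANs typically use B-splines on an adaptive grid):

        import torch
        import torch.nn as nn

        class KANLayer(nn.Module):
            """In place of a weight matrix, every input-output edge carries its
            own learnable 1-D function, here a linear combination of fixed
            Gaussian bumps with trainable coefficients."""
            def __init__(self, in_dim, out_dim, n_basis=8):
                super().__init__()
                self.register_buffer('centers', torch.linspace(-2.0, 2.0, n_basis))
                # one coefficient per (input, output, basis function) triple
                self.coef = nn.Parameter(0.1 * torch.randn(in_dim, out_dim, n_basis))

            def forward(self, x):
                # x: (batch, in_dim); evaluate every bump at every coordinate
                phi = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)
                # output_j = sum_i phi_ij(x_i), mixed by the learned coefficients
                return torch.einsum('bik,iok->bo', phi, self.coef)

        # e.g. map 4 structural descriptors of a configuration to one property
        model = nn.Sequential(KANLayer(4, 16), KANLayer(16, 1))
        print(model(torch.randn(32, 4)).shape)  # torch.Size([32, 1])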
  4. With rapid progress across platforms for quantum systems, many-body quantum state reconstruction for noisy quantum states has become an important challenge, and there has been growing interest in approaching it with generative neural network models. Here we propose 'attention-based quantum tomography' (AQT), a quantum state reconstruction method that uses an attention-based generative network to learn the mixed-state density matrix of a noisy quantum state. AQT builds on the model proposed in 'Attention is all you need' by Vaswani et al (2017 NIPS), which was designed to learn long-range correlations in natural language sentences and thereby outperform previous natural language processing (NLP) models. We demonstrate not only that AQT outperforms earlier neural-network-based quantum state reconstruction on identical tasks, but also that AQT can accurately reconstruct the density matrix associated with a noisy quantum state experimentally realized on an IBMQ quantum computer. We speculate that AQT's success stems from its ability to model quantum entanglement across the entire quantum system, much as the attention model for NLP captures the correlations among words in a sentence.
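    The modeling idea can be sketched as a small autoregressive attention model over measurement bitstrings (a toy illustration, not AQT itself; every name and size is an assumption, and it learns outcome probabilities only, not the density matrix that AQT reconstructs):

        import torch
        import torch.nn as nn

        class OutcomeTransformer(nn.Module):
            """Treat each qubit's measurement outcome as a token and model the
            joint outcome distribution with causally masked self-attention."""
            def __init__(self, n_qubits, d_model=64, n_heads=4, n_layers=2):
                super().__init__()
                self.n_qubits = n_qubits
                self.tok = nn.Embedding(3, d_model)  # outcomes 0/1 plus start token 2
                self.pos = nn.Embedding(n_qubits, d_model)
                layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
                self.encoder = nn.TransformerEncoder(layer, n_layers)
                self.out = nn.Linear(d_model, 2)

            def log_prob(self, outcomes):
                # outcomes: (batch, n_qubits) measured bitstrings in {0, 1}
                start = torch.full((outcomes.shape[0], 1), 2)
                inp = torch.cat([start, outcomes[:, :-1]], dim=1)  # shift right
                x = self.tok(inp) + self.pos(torch.arange(self.n_qubits))
                causal = nn.Transformer.generate_square_subsequent_mask(self.n_qubits)
                h = self.encoder(x, mask=causal)                   # no peeking ahead
                logp = self.out(h).log_softmax(-1)
                return logp.gather(-1, outcomes.unsqueeze(-1)).squeeze(-1).sum(-1)

        model = OutcomeTransformer(n_qubits=4)
        shots = torch.randint(0, 2, (128, 4))  # e.g. bitstrings from a hardware run
        loss = -model.log_prob(shots).mean()   # train by maximum likelihood
        print(loss.item())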
  5. The development of an efficient and powerful machine learning (ML) model for materials property prediction (MPP) remains an important challenge in materials science. While various techniques have been proposed to extract physicochemical features for MPP, graph neural networks (GNNs) have shown very strong capability in capturing effective features for high-performance MPP. Nevertheless, current GNN models do not effectively differentiate the contributions of different atoms. In this paper, we develop a novel graph neural network model called GATGNN for predicting the properties of inorganic materials. GATGNN is composed of augmented graph-attention (AGAT) layers and a global attention layer: the AGAT layers learn the local relationships among neighboring atoms, while the global attention layer learns each atom's overall contribution to the material's property, together allowing our framework to achieve considerably better prediction performance on the various tested properties. Through extensive experiments, we show that our method outperforms existing state-of-the-art GNN models while also providing measurable insight into the correlation between atoms and their material property. Our code can be found at https://github.com/superlouis/GATGNN.
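    The global attention readout can be sketched as attention-weighted pooling over atom features (an illustrative stand-in, not the GATGNN code; names and shapes are assumptions), where the learned weights are exactly the kind of per-atom importances the abstract refers to:

        import torch
        import torch.nn as nn

        class GlobalAttentionPool(nn.Module):
            """Score every atom's feature vector, softmax the scores into
            per-atom weights, and pool the crystal as a weighted sum."""
            def __init__(self, dim):
                super().__init__()
                self.score = nn.Linear(dim, 1)

            def forward(self, atom_feats):
                # atom_feats: (n_atoms, dim) output of the graph-attention layers
                w = torch.softmax(self.score(atom_feats), dim=0)  # (n_atoms, 1)
                crystal = (w * atom_feats).sum(dim=0)             # (dim,) graph vector
                return crystal, w.squeeze(-1)  # weights double as atom importances

        feats = torch.randn(12, 64)         # 12 atoms in a unit cell
        vec, importance = GlobalAttentionPool(64)(feats)
        print(vec.shape, importance.sum())  # torch.Size([64]), weights sum to 1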