Title: Predicting solid state material platforms for quantum technologies
Abstract: Semiconductor materials provide a compelling platform for quantum technologies (QT). However, identifying promising material hosts among the plethora of candidates is a major challenge. Therefore, we have developed a framework for the automated discovery of semiconductor platforms for QT using materials informatics and machine learning methods. Different approaches were implemented to label data for training the supervised machine learning (ML) algorithms logistic regression, decision trees, random forests and gradient boosting. We find that an empirical approach relying exclusively on findings from the literature yields a clear separation between predicted suitable and unsuitable candidates. In contrast to expectations from the literature, which focus on band gap and ionic character as important properties for QT compatibility, the ML methods highlight features related to symmetry and crystal structure, including bond length, orientation and radial distribution, as influential when predicting a material as suitable for QT.
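To make the workflow concrete, here is a minimal sketch (not the authors' code) of training the four classifiers named in the abstract on a synthetic stand-in for a labelled materials descriptor table and ranking feature importances; the synthetic data, feature count and ROC-AUC scoring are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' code): train the four classifiers named in the
# abstract on a synthetic stand-in for a labelled materials descriptor table and rank
# feature importances, analogous to asking which descriptors drive QT suitability.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for descriptors (bond lengths, radial-distribution features, ...)
# with literature-derived suitability labels; real inputs would come from a materials database.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean ROC-AUC = {scores.mean():.3f}")

# Feature ranking from the random forest; with real descriptors this is where structure-
# and symmetry-related features would surface as influential.
rf = models["random_forest"].fit(X, y)
print("top feature indices by importance:", np.argsort(rf.feature_importances_)[::-1][:5])
```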
Award ID(s): 2013047
NSF-PAR ID: 10382944
Journal Name: npj Computational Materials
Volume: 8
Issue: 1
ISSN: 2057-3960
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Why the new findings matter

    The process of teaching and learning is complex, multifaceted and dynamic. This paper contributes a seminal resource for the digitisation of the educational sciences by demonstrating how new machine learning methods can be used effectively and reliably in research, education and practical application.

    Implications for educational researchers and policy makers

    The progressing digitisation of societies around the globe and the impact of the SARS-CoV-2 pandemic have highlighted the vulnerabilities and shortcomings of educational systems. These developments have shown the necessity of providing effective educational processes that support sometimes-overwhelmed teachers in digitally imparting knowledge, as planned by many governments and policy makers. Educational scientists, corporate partners and stakeholders can make use of machine learning techniques to develop advanced, scalable educational processes that account for the individual needs of learners and that complement and support existing learning infrastructure. The proper use of machine learning methods can contribute essential applications to the educational sciences, such as (semi-)automated assessments, algorithmic grading, personalised feedback and adaptive learning approaches. However, these promises are strongly tied to at least a basic understanding of the concepts of machine learning and a degree of data literacy, which has to become standard in education and the educational sciences.

    Demonstrating both the promises and the challenges inherent in the collection and analysis of large-scale educational data with machine learning, this paper covers the essential topics that such applications require and provides easy-to-follow resources and code to facilitate adoption.

     
  2. Abstract

    Antibiotic discovery has experienced a severe slowdown in the discovery of new candidates. In vitro screening methods using phospholipids to model the bacterial membrane provide a route to identify molecules that specifically disrupt bacterial membranes, causing cell death. Thanks to the electrically insulating properties of the major component of the cell membrane, phospholipids, electronic devices are highly suitable transducers of membrane disruption. The organic electrochemical transistor (OECT) is a highly sensitive ion-to-electron converter. Here, the OECT is used as a transducer of the permeability of a lipid monolayer (ML) at a liquid:liquid interface, designed to read out changes in ion flux caused by compounds that interact with, and disrupt, lipid assembly. This concept is illustrated using the well-documented antibiotic Polymyxin B, and the highly sensitive quantitation of lipid-ML permeability induced by two novel, recently described antibacterial amine-based oligothioetheramides is shown, highlighting molecular-scale differences in their disruption capabilities. It is anticipated that this platform has the potential to play a role in front-line antimicrobial compound design and characterization thanks to the compatibility of semiconductor microfabrication technology with high-throughput readouts.

     
  3. As societies rely increasingly on computers for critical functions, the importance of cybersecurity becomes ever more paramount. Even in recent months there have been attacks that halted oil production, disrupted online learning at the height of COVID, and put medical records at risk at prominent hospitals. This constant threat of privacy leaks and infrastructure disruption has led to an increase in the adoption of artificial intelligence (AI) techniques, mainly machine learning (ML), in state-of-the-art cybersecurity approaches. Oftentimes, these techniques are borrowed from other disciplines without context and devoid of the depth of understanding as to why such techniques are best suited to solve the problem at hand. This is largely due to the fact that in many ways cybersecurity curricula have failed to keep up with advances in cybersecurity research and integrating AI and ML into cybersecurity curricula is extremely difficult. To address this gap, we propose a new methodology to integrate AI and ML techniques into cybersecurity education curricula. Our methodology consists of four components: i) Analysis of Literature which aims to understand the prevalence of AI and ML in cybersecurity research, ii) Analysis of Cybersecurity Curriculum that intends to determine the materials already present in the curriculum and the possible intersection points in the curricula for the new AI material, iii) Design of Adaptable Modules that aims to design highly adaptable modules that can be directly used by cybersecurity educators where new AI material can naturally supplement/substitute for concepts or material already present in the cybersecurity curriculum, and iv) Curriculum Level Evaluation that aims to evaluate the effectiveness of the proposed methodology from both student and instructor perspectives. In this paper, we focus on the first component of our methodology - Analysis of Literature and systematically analyze over 5000 papers that were published in the top cybersecurity conferences during the last five years. Our results clearly indicate that more than 78% of the cybersecurity papers mention AI terminology. To determine the prevalence of the use of AI, we randomly selected 300 papers and performed a thorough analysis. Our results show that more than 19% of the papers implement ML techniques. These findings suggest that AI and ML techniques should be considered for future integration into cybersecurity curriculum to better align with advancements in the field. 
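A minimal sketch of how the terminology scan in the Analysis of Literature component could be implemented; the term list, regular expression and corpus directory below are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of a keyword-based literature scan (illustrative only): count how many
# papers in a hypothetical directory of extracted plain-text files mention AI/ML terminology.
import re
from pathlib import Path

AI_TERMS = re.compile(
    r"\b(artificial intelligence|machine learning|deep learning|neural network)\b",
    re.IGNORECASE,
)

def mentions_ai(text: str) -> bool:
    """Return True if the paper text contains any of the AI/ML terms."""
    return AI_TERMS.search(text) is not None

corpus_dir = Path("cybersecurity_papers")  # hypothetical path to extracted paper texts
papers = sorted(corpus_dir.glob("*.txt"))
hits = sum(mentions_ai(p.read_text(errors="ignore")) for p in papers)
if papers:
    print(f"{hits}/{len(papers)} papers ({100 * hits / len(papers):.1f}%) mention AI terminology")
```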
  4. Scientific literature presents a wellspring of cutting-edge knowledge for materials science, including valuable data (e.g., numerical data from experimental results, material properties and structure). These data are critical for accelerating materials discovery by data-driven machine learning (ML) methods. The challenge is that it is impossible for humans to manually extract and retain this knowledge due to the extensive and growing volume of publications. To this end, we explore a fine-tuned BERT model for extracting knowledge. Our preliminary results show that our fine-tuned BERT model reaches an F-score of 85% on the materials named entity recognition task. The paper covers background, related work, and methodology, including tuning parameters, and presents our overall performance evaluation. Our discussion offers insights into our results and points to directions for next steps.
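The following sketch shows, under stated assumptions, what fine-tuning a BERT-style model for materials named entity recognition with Hugging Face transformers can look like; the label scheme, example sentence and single optimisation step are hypothetical stand-ins for the paper's annotated corpus and training procedure.

```python
# Minimal sketch (not the paper's code) of fine-tuning a BERT-style model for materials
# named entity recognition with Hugging Face transformers. The label set and the single
# annotated sentence are hypothetical stand-ins for a real materials NER corpus.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-MAT", "I-MAT"]  # illustrative tag set for material mentions
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)

# One hand-labelled example: the word "GaN" is tagged as a material entity.
words = ["The", "band", "gap", "of", "GaN", "is", "3.4", "eV"]
word_labels = [0, 0, 0, 0, 1, 0, 0, 0]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level labels to sub-word tokens; special tokens get -100 so the loss ignores them.
aligned = [-100 if i is None else word_labels[i] for i in enc.word_ids(batch_index=0)]
enc["labels"] = torch.tensor([aligned])

# A single optimisation step on the token-classification (cross-entropy) objective.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**enc).loss
loss.backward()
optimizer.step()
print(f"single-step training loss: {loss.item():.3f}")
```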
  5. Background

    Metamodels can address some of the limitations of complex simulation models by formulating a mathematical relationship between input parameters and simulation model outcomes. Our objective was to develop and compare the performance of a machine learning (ML)–based metamodel against a conventional metamodeling approach in replicating the findings of a complex simulation model.

    Methods

    We constructed 3 ML-based metamodels using random forest, support vector regression, and artificial neural networks, as well as a linear regression-based metamodel, from a previously validated microsimulation model of the natural history of hepatitis C virus (HCV) consisting of 40 input parameters. Outcomes of interest included societal costs and quality-adjusted life-years (QALYs), the incremental cost-effectiveness ratio (ICER) of HCV treatment versus no treatment, the cost-effectiveness acceptability curve (CEAC), and the expected value of perfect information (EVPI). We evaluated metamodel performance using root mean squared error (RMSE) and Pearson's R² on the normalized data.
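As a hedged illustration of this metamodeling comparison, the sketch below fits a linear regression and a random forest metamodel to samples from a toy stand-in simulator and reports R² and RMSE; the synthetic outcome function and sample sizes are assumptions, not the HCV microsimulation described above.

```python
# Hedged sketch of the metamodeling comparison: fit linear regression and random forest
# metamodels to (parameter, outcome) samples from a toy stand-in simulator, then compare
# R^2 and RMSE on held-out samples. The toy outcome is NOT the HCV microsimulation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 40))  # 40 input parameters, as in the microsimulation model
# Nonlinear toy outcome (think of it as a QALY-like quantity) plus noise.
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + 0.1 * rng.normal(size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

metamodels = [
    ("linear regression", LinearRegression()),
    ("random forest", RandomForestRegressor(n_estimators=300, random_state=0)),
]
for name, model in metamodels:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R^2 = {r2_score(y_te, pred):.3f}, RMSE = {rmse:.3f}")
```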

    Results

    The R² values for the linear regression metamodel for QALYs without treatment, QALYs with treatment, societal cost without treatment, societal cost with treatment, and ICER were 0.92, 0.98, 0.85, 0.92, and 0.60, respectively. The corresponding R² values for our ML-based metamodels were 0.96, 0.97, 0.90, 0.95, and 0.49 for support vector regression; 0.99, 0.83, 0.99, 0.99, and 0.82 for artificial neural network; and 0.99, 0.99, 0.99, 0.99, and 0.98 for random forest. Similar trends were observed for RMSE. The CEAC and EVPI curves produced by the random forest metamodel matched the simulation output more closely than did those of the linear regression metamodel.

    Conclusions

    ML-based metamodels generally outperformed traditional linear regression metamodels at replicating results from complex simulation models, with random forest metamodels performing best.

    Highlights

    Decision-analytic models are frequently used by policy makers and other stakeholders to assess the impact of new medical technologies and interventions. However, complex models can impose limitations on conducting probabilistic sensitivity analysis and value-of-information analysis, and may not be suitable for developing online decision-support tools. Metamodels, which accurately formulate a mathematical relationship between input parameters and model outcomes, can replicate complex simulation models and address the above limitations. The machine learning-based random forest model can outperform linear regression in replicating the findings of a complex simulation model. Such a metamodel can be used for conducting cost-effectiveness and value-of-information analyses or for developing online decision-support tools.

     