Search for: All records

Creators/Authors contains: "Beck, David A."


  1. Data science and machine learning are revolutionizing enzyme engineering; however, high-throughput simulations for screening large libraries of enzyme variants remain a challenge. Here, we present a novel yet remarkably simple approach for comparing enzyme variants with fully atomistic classical molecular dynamics (MD) simulations on a tractable timescale. Our method greatly simplifies the problem by restricting sampling to the reaction transition state only, and we show that the resulting measurements of transition-state stability correlate well with experimental activity measurements across two highly distinct enzymes, even for mutations with effects too small to resolve with free energy methods. This method will enable atomistic simulations to achieve sampling coverage for enzyme variant prescreening and machine learning model training at a scale that was not previously possible.
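    A minimal, purely illustrative Python sketch of the post-processing idea described in this entry: score each variant by averaging a per-frame stability metric from its transition-state-restrained trajectory, then rank-correlate the scores with experimental activities. The variant names, energy arrays, and the choice of a Spearman rank correlation are assumptions made for the sketch, not details taken from the paper.

        import numpy as np
        from scipy.stats import spearmanr

        def ts_stability_score(frame_energies):
            """Average a per-frame stability metric (e.g. an enzyme/transition-state
            interaction energy) over a transition-state-restrained MD trajectory."""
            return float(np.mean(frame_energies))

        def rank_variants(variant_energies, experimental_activity):
            """Score each variant and compare the resulting ranking against
            experimental activity with a Spearman rank correlation."""
            names = sorted(variant_energies)
            scores = np.array([ts_stability_score(variant_energies[n]) for n in names])
            activity = np.array([experimental_activity[n] for n in names])
            rho, pvalue = spearmanr(scores, activity)
            return dict(zip(names, scores)), rho, pvalue

        # Hypothetical per-frame energies standing in for values parsed from each
        # variant's restrained trajectory; more negative means more stable here,
        # so a strong inverse correlation with activity is the expected signal.
        energies = {
            "WT":    np.random.normal(-50.0, 2.0, size=500),
            "A123V": np.random.normal(-53.0, 2.0, size=500),
            "G45S":  np.random.normal(-48.0, 2.0, size=500),
        }
        activity = {"WT": 1.0, "A123V": 1.6, "G45S": 0.7}

        scores, rho, p = rank_variants(energies, activity)
        print(scores, rho, p)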
  2. Attention mechanisms have led to many breakthroughs in sequential data modeling but have yet to be incorporated into any generative algorithms for molecular design. Here we explore the impact of adding self-attention layers to generative β-VAE models and show that models with attention are able to learn a complex “molecular grammar” while improving performance on downstream tasks such as accurately sampling from the latent space (“model memory”) or exploring novel chemistries not present in the training data. There is a notable relationship between a model's architecture, the structure of its latent memory, and its performance during inference. We demonstrate that there is an unavoidable tradeoff between model exploration and validity that is a function of the complexity of the latent memory; however, novel sampling schemes can be used to optimize this tradeoff. We anticipate that attention will play an important role in future molecular design algorithms that can make efficient use of the detailed molecular substructures learned by the transformer.
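    As a rough illustration of the kind of architecture discussed in this entry (not the authors' model), the sketch below adds a self-attention layer to a β-VAE encoder over tokenized molecular strings, using PyTorch's built-in multi-head attention as a stand-in for whatever attention layer the paper employs; the vocabulary size, dimensions, pooling choice, and β value are placeholder assumptions.

        import torch
        import torch.nn as nn

        class AttentiveBetaVAEEncoder(nn.Module):
            """Token embeddings -> self-attention -> pooled latent Gaussian (mu, logvar)."""
            def __init__(self, vocab_size=64, embed_dim=128, latent_dim=32, num_heads=4):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, embed_dim)
                self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
                self.to_mu = nn.Linear(embed_dim, latent_dim)
                self.to_logvar = nn.Linear(embed_dim, latent_dim)

            def forward(self, tokens):
                x = self.embed(tokens)        # (batch, seq_len, embed_dim)
                x, _ = self.attn(x, x, x)     # self-attention over the token sequence
                pooled = x.mean(dim=1)        # simple mean pooling over positions
                return self.to_mu(pooled), self.to_logvar(pooled)

        def beta_vae_kl(mu, logvar, beta=4.0):
            """KL(q(z|x) || N(0, I)) scaled by beta, as in a beta-VAE objective."""
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1)
            return beta * kl.mean()

        # Toy usage: random token ids stand in for tokenized SMILES strings.
        tokens = torch.randint(0, 64, (8, 40))                    # batch of 8, length 40
        mu, logvar = AttentiveBetaVAEEncoder()(tokens)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        print(z.shape, beta_vae_kl(mu, logvar).item())

    Sampling z from the prior and passing it through a decoder (omitted here) would correspond to drawing from the “model memory” described in the abstract.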
  3.