Title: Pipelines for automating compliance-based elimination and extension (PACE2): a systematic framework for high-throughput biomolecular materials simulation workflows
Abstract: The formation of biomolecular materials via dynamical interfacial processes, such as self-assembly and fusion, for diverse compositions and external conditions can be efficiently probed using ensemble Molecular Dynamics (MD). However, this approach requires many simulations when investigating a large composition phase space. In addition, it is difficult to predict whether each simulation will yield biomolecular materials with the desired properties or outcomes and how long each simulation will run. These difficulties can be overcome by rules-based management systems, including intermittent inspection, variable sampling, and premature termination or extension of the individual MD simulations. Automating such a management system can significantly improve runtime efficiency and reduce the burden of organizing large ensembles of MD simulations. To this end, a computational framework, the Pipelines for Automating Compliance-based Elimination and Extension (PACE2), is proposed for high-throughput ensemble biomolecular materials simulations. The PACE2 framework encompasses Candidate pipelines, where each pipeline includes temporally separated simulation and analysis tasks. When an MD simulation is completed, an analysis task is triggered, which evaluates the MD trajectory for compliance. Compliant simulations are extended to the next MD phase with a suitable sample rate to allow additional, detailed analysis. Non-compliant simulations are eliminated, and their computational resources are reallocated or released. The framework is designed to run on local desktop computers and high-performance computing resources. Preliminary scientific results enabled by the PACE2 framework are presented, which demonstrate its potential and validate its function. In the future, the framework will be extended to address generalized workflows and investigate composition-structure-property relations for other classes of materials.
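The compliance-based extend-or-eliminate decision described above can be outlined with a minimal Python sketch. The names, metrics, and thresholds below (ComplianceRule, manage_simulation, cluster_fraction, rmsd_nm) are hypothetical illustrations and are not taken from the PACE2 code.

```python
# Minimal sketch of the compliance check and extend/eliminate decision described
# in the abstract. All names and thresholds are hypothetical, not the PACE2 API.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ComplianceRule:
    """A single rule applied to metrics computed from an analyzed MD trajectory."""
    name: str
    check: Callable[[dict], bool]  # takes trajectory metrics, returns pass/fail


def evaluate_trajectory(metrics: dict, rules: List[ComplianceRule]) -> bool:
    """A trajectory is compliant only if every rule passes."""
    return all(rule.check(metrics) for rule in rules)


def manage_simulation(metrics: dict, rules: List[ComplianceRule],
                      extend, eliminate, next_sample_rate_ps: float) -> str:
    """Extend compliant simulations to the next MD phase; eliminate the rest."""
    if evaluate_trajectory(metrics, rules):
        extend(sample_rate_ps=next_sample_rate_ps)  # denser sampling for detailed analysis
        return "extended"
    eliminate()  # release or reallocate the compute resources
    return "eliminated"


# Example usage with made-up metrics and thresholds.
rules = [
    ComplianceRule("assembled", lambda m: m["cluster_fraction"] > 0.8),
    ComplianceRule("stable", lambda m: m["rmsd_nm"] < 0.5),
]
status = manage_simulation(
    {"cluster_fraction": 0.9, "rmsd_nm": 0.3},
    rules,
    extend=lambda sample_rate_ps: None,
    eliminate=lambda: None,
    next_sample_rate_ps=10.0,
)
print(status)  # -> "extended"
```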
Award ID(s):
1654325, 2118860, 1835449
PAR ID:
10474331
Author(s) / Creator(s):
; ; ; ; ; ; ;
Publisher / Repository:
IOP Publishing
Date Published:
Journal Name:
Journal of Physics: Materials
Volume:
7
Issue:
1
ISSN:
2515-7639
Format(s):
Medium: X
Size(s):
Article No. 015006
Sponsoring Org:
National Science Foundation
More Like this
  1. The formation of biomolecular materials via dynamical interfacial processes such as self-assembly and fusion, for diverse compositions and external conditions, can be efficiently probed using ensemble Molecular Dynamics. However, this approach requires a large number of simulations when investigating a large composition phase space. In addition, there is difficulty in predicting whether each simulation is yielding biomolecular materials with the desired properties or outcomes and how long each simulation will run. These difficulties can be overcome by rules-based management systems, which include intermittent inspection, variable sampling, premature termination and extension of the individual Molecular Dynamics simulations. The automation of such a management system can significantly reduce the overhead of managing large ensembles of Molecular Dynamics simulations. To this end, a high-throughput workflows-based computational framework, Pipeline for Automating Compliance-based Elimination and Extension (PACE2), for biomolecular materials simulations is proposed. The PACE2 framework encompasses Simulation-Analysis Pipelines. Each Pipeline includes temporally separated simulation and analysis tasks. When a Molecular Dynamics simulation completes, an analysis task is triggered which evaluates the Molecular Dynamics trajectory for compliance. Compliant Molecular Dynamics simulations are extended to the next Molecular Dynamics phase with a suitable sample rate to allow additional, detailed analysis. Non-compliant Molecular Dynamics simulations are eliminated, and their computational resources are either reallocated or released. The framework is designed to run on local desktop computers and high-performance computing resources. In the future, the framework will be extended to address generalized workflows and investigate other classes of materials.
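The temporally separated simulation and analysis tasks described in this record can be pictured with a small, hedged sketch; the polling loop and callable names below are assumptions for illustration only, not the PACE2 implementation.

```python
# Minimal sketch of a Simulation-Analysis Pipeline: when an MD job reports
# completion, the matching analysis task is triggered on its trajectory.
# The job/analysis callables are hypothetical stand-ins.
import time
from typing import Callable, Dict


def run_pipelines(is_finished: Dict[str, Callable[[], bool]],
                  analyze: Dict[str, Callable[[], None]],
                  poll_interval_s: float = 1.0) -> None:
    """Poll MD jobs and trigger each pipeline's analysis task once its simulation completes."""
    pending = set(is_finished)
    while pending:
        for name in list(pending):
            if is_finished[name]():   # completion check for this MD simulation
                analyze[name]()       # temporally separated analysis task
                pending.remove(name)
        if pending:
            time.sleep(poll_interval_s)


# Example with stand-in jobs that report completion immediately.
run_pipelines(
    is_finished={"candidate_A": lambda: True, "candidate_B": lambda: True},
    analyze={"candidate_A": lambda: print("analyzing candidate_A trajectory"),
             "candidate_B": lambda: print("analyzing candidate_B trajectory")},
)
```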
  2. Efficient sampling in biomolecular simulations is critical for accurately capturing the complex dynamic behaviors of biological systems. Adaptive sampling techniques aim to improve efficiency by focusing computational resources on the most relevant regions of the phase space. In this work, we present a framework for identifying the optimal sampling policy through metric-driven ranking. Our approach systematically evaluates the policy ensemble and ranks the policies based on their ability to explore the conformational space effectively. Through a series of biomolecular simulation case studies, we demonstrate that the choice of a different adaptive sampling policy at each round significantly outperforms single-policy sampling, leading to faster convergence and improved sampling performance. This approach takes an ensemble of adaptive sampling policies and identifies the optimal policy for the next round based on current data. Beyond presenting this ensemble view of adaptive sampling, we also propose two sampling algorithms that approximate this ranking framework on the fly. The modularity of this framework allows incorporation of any adaptive sampling policy, making it versatile and suitable as a comprehensive adaptive sampling scheme.
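The metric-driven ranking of a policy ensemble described above might look roughly like the following sketch; the toy policies and the exploration score are placeholders rather than the metrics used in the work.

```python
# Minimal sketch of metric-driven ranking over an ensemble of adaptive sampling
# policies: score the states each policy would select from the current data and
# pick the top-ranked policy for the next round. Policies and score are illustrative.
import random
from collections import Counter
from typing import Callable, Dict, List

Policy = Callable[[List[int]], List[int]]  # maps visited states -> states to restart from


def exploration_score(selected_states: List[int]) -> float:
    """Placeholder metric: reward selecting many distinct states."""
    return float(len(set(selected_states)))


def rank_policies(policies: Dict[str, Policy], visited: List[int]) -> List[str]:
    """Rank policies by the exploration score of their proposed selections."""
    scores = {name: exploration_score(policy(visited)) for name, policy in policies.items()}
    return sorted(scores, key=scores.get, reverse=True)


# Two toy policies: restart from the least-visited states vs. a small random subset.
policies: Dict[str, Policy] = {
    "least_counts": lambda visited: [s for s, _ in Counter(visited).most_common()[::-1]][:3],
    "uniform_random": lambda visited: random.sample(sorted(set(visited)), k=2),
}
ranking = rank_policies(policies, visited=[0, 0, 1, 2, 2, 3])
print("policy chosen for the next round:", ranking[0])  # -> "least_counts" here
```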
  3. Scientific breakthroughs in biomolecular methods and improvements in hardware technology have shifted simulation campaigns from a single long-running simulation to a large set of shorter simulations running simultaneously, called an ensemble. In an ensemble, each independent simulation is usually coupled with several analyses that apply identical or distinct algorithms to data produced by the corresponding simulation. Today, in situ methods are used to analyze large volumes of data generated by scientific simulations at runtime. This work studies the execution of ensemble-based simulations paired with in situ analyses using in-memory staging methods. Because simulations and analyses forming an ensemble typically run concurrently, deploying an ensemble requires efficient co-location-aware strategies that ensure the data flow between the simulations and analyses forming an in situ workflow is efficient. Using an ensemble of molecular dynamics in situ workflows with multiple simulations and analyses, we first show that collecting traditional metrics such as makespan, instructions per cycle, memory usage, or cache miss ratio is not sufficient to characterize the complex behaviors of ensembles. Thus, we propose a method to evaluate the performance of ensembles of workflows that captures resource usage (efficiency), resource allocation, and component placement. Experimental results demonstrate that our proposed method can effectively capture the performance of different component placements in an ensemble. By evaluating different co-location scenarios, our performance indicator demonstrates improvements of up to four orders of magnitude when co-locating simulation and coupled analyses within a single computational host.
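The proposed performance indicator is not specified in this summary, but the general idea of combining resource usage efficiency, resource allocation, and component placement into one score can be sketched as follows; the averaging scheme and placement penalty are illustrative assumptions only, not the indicator defined in the paper.

```python
# Generic sketch of folding efficiency, allocation, and placement into one score.
# NOT the paper's indicator; the penalty and weighting are assumed for illustration.
from dataclasses import dataclass
from typing import List


@dataclass
class ComponentStats:
    useful_core_seconds: float      # core time spent on simulation/analysis work
    allocated_core_seconds: float   # core time reserved for the component
    co_located_with_partner: bool   # placed on the same host as its coupled component?


def ensemble_indicator(components: List[ComponentStats], off_host_penalty: float = 0.5) -> float:
    """Mean per-component efficiency, discounted when coupled components are split across hosts."""
    scores = []
    for c in components:
        efficiency = c.useful_core_seconds / c.allocated_core_seconds   # usage vs. allocation
        placement = 1.0 if c.co_located_with_partner else off_host_penalty
        scores.append(efficiency * placement)
    return sum(scores) / len(scores)


# Co-locating a simulation with its coupled analysis raises the score in this toy model.
co_located = [ComponentStats(800.0, 1000.0, True), ComponentStats(300.0, 500.0, True)]
split_hosts = [ComponentStats(800.0, 1000.0, False), ComponentStats(300.0, 500.0, False)]
print(ensemble_indicator(co_located), ensemble_indicator(split_hosts))  # 0.7 vs. 0.35
```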
  4. In response to the growing federal mandates for data management and sharing, the California Digital Library (CDL) and the Association of Research Libraries (ARL) have launched a pilot project to test new workflows at 10 U.S. universities. The project utilizes machine-actionable DMPs (maDMPs) to streamline processes, improve communication, and ensure compliance with federal requirements. This lightning talk will discuss the pilot by exploring three case studies that showcase the use of generative AI for crafting compliant data management plans, maDMPs for deploying computing and storage resources, and tracking research outputs through integration with internal data management platforms. 
  5. Workflow management systems (WMS) are widely used to describe and execute large computational or data-intensive applications. However, when a large ensemble of workflows is run on a cluster, new resource management problems occur. Each WMS itself consumes otherwise unmanaged resources, such as the shared head node where the WMS coordinator runs, the shared filesystem where intermediate data is stored, and the shared batch queue itself. We introduce Mufasa, a meta-workflow management system designed to control the concurrency of multiple workflows in an ensemble by observing and controlling the resources required by each WMS. We show initial results demonstrating that Mufasa correctly handles the overcommitment of different resource types by starting, pausing, and cancelling workflows with unexpected behavior.
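The resource-aware throttling that Mufasa is described as performing can be sketched, under assumed thresholds and resource categories, roughly as follows; none of the names or limits below come from Mufasa itself.

```python
# Minimal sketch of a meta-scheduling decision: observe shared resources consumed
# by each workflow's WMS and start, pause, or cancel workflows to avoid
# overcommitment. Thresholds and field names are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class WorkflowUsage:
    name: str
    head_node_mem_gb: float   # memory used by the WMS coordinator on the shared head node
    scratch_tb: float         # intermediate data held on the shared filesystem
    queued_jobs: int          # jobs this workflow holds in the shared batch queue


def throttle(workflows: List[WorkflowUsage],
             mem_limit_gb: float = 64.0,
             scratch_limit_tb: float = 10.0,
             queue_limit: int = 500) -> Dict[str, str]:
    """Decide, per workflow, whether to keep running, pause, or cancel it."""
    decisions = {}
    total_mem = sum(w.head_node_mem_gb for w in workflows)
    for w in workflows:
        if w.scratch_tb > scratch_limit_tb:
            decisions[w.name] = "cancel"       # runaway intermediate data
        elif total_mem > mem_limit_gb or w.queued_jobs > queue_limit:
            decisions[w.name] = "pause"        # back off until shared resources free up
        else:
            decisions[w.name] = "run"
    return decisions


print(throttle([WorkflowUsage("wf-a", 40.0, 2.0, 100),
                WorkflowUsage("wf-b", 30.0, 12.0, 100)]))
# -> {'wf-a': 'pause', 'wf-b': 'cancel'}
```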