

Title: Review of applications and challenges of quantitative systems pharmacology modeling and machine learning for heart failure
Abstract

Quantitative systems pharmacology (QSP) is an important approach in pharmaceutical research and development that facilitates in silico generation of quantitative mechanistic hypotheses and enables in silico trials. As demonstrated by applications from numerous industry groups and interest from regulatory authorities, QSP is becoming an increasingly critical component in clinical drug development. With rapidly evolving computational tools and methods, QSP modeling has achieved important progress in pharmaceutical research and development, including for heart failure (HF). However, various challenges exist in the QSP modeling and clinical characterization of HF. Machine/deep learning (ML/DL) methods have had success in a wide variety of fields and disciplines. They provide data-driven approaches to HF diagnosis and modeling, and offer a novel strategy to inform QSP model development and calibration. The combination of ML/DL and QSP modeling is becoming an emerging direction in the understanding of HF and the clinical development of new therapies. In this work, we review the current status and achievements of QSP and ML/DL for HF, and discuss remaining challenges and future perspectives in the field.
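QSP models are typically built from systems of ordinary differential equations describing drug and biomarker dynamics. As an illustration of that general idea only (not a model from the reviewed work, and with hypothetical parameter values), a one-compartment pharmacokinetic building block can be integrated with forward Euler:

```python
# Illustrative toy only: a one-compartment pharmacokinetic model of the kind
# used as a building block in larger QSP models. All parameter values
# (dose, volume of distribution, elimination rate) are hypothetical.

def simulate_concentration(dose_mg, volume_l, k_elim_per_h, t_end_h, dt_h=0.01):
    """Forward-Euler integration of dC/dt = -k_elim * C after an IV bolus."""
    conc = dose_mg / volume_l  # initial concentration right after the bolus
    trajectory = [(0.0, conc)]
    t = 0.0
    while t < t_end_h:
        conc += dt_h * (-k_elim_per_h * conc)  # first-order elimination
        t += dt_h
        trajectory.append((t, conc))
    return trajectory

traj = simulate_concentration(dose_mg=100.0, volume_l=50.0,
                              k_elim_per_h=0.1, t_end_h=24.0)
print(traj[0][1])    # initial concentration (mg/L)
print(traj[-1][1])   # concentration after ~24 h, close to 2*exp(-2.4)
```

Real QSP models couple many such compartments with mechanistic feedbacks and are calibrated against clinical data; production code would use an adaptive ODE solver rather than fixed-step Euler.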

 
NSF-PAR ID:
10307578
Author(s) / Creator(s):
; ; ;
Publisher / Repository:
Springer Science + Business Media
Date Published:
Journal Name:
Journal of Pharmacokinetics and Pharmacodynamics
Volume:
49
Issue:
1
ISSN:
1567-567X
Page Range / eLocation ID:
p. 39-50
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Cryo‐electron microscopy (cryo‐EM) has become a major experimental technique for determining the structures of large protein complexes and molecular assemblies, as evidenced by the 2017 Nobel Prize. Although cryo‐EM has been drastically improved to generate high‐resolution three‐dimensional maps that contain detailed structural information about macromolecules, the computational methods for using the data to automatically build structure models are lagging far behind. The traditional cryo‐EM model building approach is template‐based homology modeling. Manual de novo modeling is very time‐consuming when no template model is found in the database. In recent years, de novo cryo‐EM modeling using machine learning (ML) and deep learning (DL) has ranked among the top‐performing methods in macromolecular structure modeling. DL‐based de novo cryo‐EM modeling is an important application of artificial intelligence, with impressive results and great potential for the next generation of molecular biomedicine. Accordingly, we systematically review the representative ML/DL‐based de novo cryo‐EM modeling methods. Their significance is discussed from both practical and methodological viewpoints. We also briefly describe the cryo‐EM data processing workflow as background. Overall, this review provides an introductory guide to modern research on artificial intelligence for de novo molecular structure modeling and future directions in this emerging field.

    This article is categorized under:

    Structure and Mechanism > Molecular Structures

    Structure and Mechanism > Computational Biochemistry and Biophysics

    Data Science > Artificial Intelligence/Machine Learning

     
  2. Introduction: Artificial intelligence (AI) technologies have attracted growing interest from a broad range of disciplines in recent years, including health. Advances in computer hardware and software applications in medicine, together with the digitization of health-related data, fuel progress in the development and use of AI in medicine. This progress provides new opportunities and challenges, as well as directions for the future of AI in health. Objective: The goals of this survey are to review the current state of AI in health, along with opportunities, challenges, and practical implications. This review highlights recent developments over the past five years and directions for the future. Methods: Publications over the past five years reporting the use of AI in health in clinical and biomedical informatics journals, as well as computer science conferences, were selected according to Google Scholar citations. Publications were then categorized into five different classes, according to the type of data analyzed. Results: The major data types identified were multi-omics, clinical, behavioral, environmental, and pharmaceutical research and development (R&D) data. The current state of AI related to each data type is described, followed by associated challenges and practical implications that have emerged over the last several years. Opportunities and future directions based on these advances are discussed. Conclusion: Technologies have enabled the development of AI-assisted approaches to healthcare. However, challenges remain. Work is currently underway to address multi-modal data integration, balancing quantitative algorithm performance and qualitative model interpretability, protection of model security, federated learning, and model bias.
  3. Abstract

    Modeling and simulation is transforming modern materials science, becoming an important tool for the discovery of new materials and material phenomena, for gaining insight into the processes that govern materials behavior, and, increasingly, for quantitative predictions that can be used as part of a design tool in full partnership with experimental synthesis and characterization. Modeling and simulation is the essential bridge from good science to good engineering, spanning from fundamental understanding of materials behavior to deliberate design of new materials technologies leveraging new properties and processes. This Roadmap presents a broad overview of the extensive impact computational modeling has had in materials science in the past few decades, and offers focused perspectives on where the path forward lies as this rapidly expanding field evolves to meet the challenges of the next few decades. The Roadmap offers perspectives on advances within disciplines as diverse as phase field methods to model mesoscale behavior and molecular dynamics methods to deduce the fundamental atomic-scale dynamical processes governing materials response, as well as on the challenges involved in the interdisciplinary research that tackles complex materials problems where the governing phenomena span different scales of materials behavior, requiring multiscale approaches. The shift from understanding fundamental materials behavior to developing quantitative approaches that explain and predict experimental observations requires advances in the methods and practice of simulation for reproducibility and reliability, and in interaction with a computational ecosystem that integrates new theory development, innovative applications, and an increasingly integrated software and computational infrastructure that takes advantage of ever more powerful computational methods and computing hardware.

     
  4. Abstract

    With advances in biomedical research, biomarkers are becoming increasingly important prognostic factors for predicting overall survival, while the measurement of biomarkers is often censored due to instruments' lower limits of detection. This leads to two types of censoring: random censoring in overall survival outcomes and fixed censoring in biomarker covariates, posing new challenges in statistical modeling and inference. Existing methods for analyzing such data focus primarily on linear regression ignoring censored responses or semiparametric accelerated failure time models with covariates under detection limits (DL). In this paper, we propose a quantile regression for survival data with covariates subject to DL. Compared with existing methods, the proposed approach provides a more versatile tool for modeling the distribution of survival outcomes by allowing covariate effects to vary across conditional quantiles of the survival time and requiring no parametric distribution assumptions for outcome data. To estimate the quantile process of regression coefficients, we develop a novel multiple imputation approach based on another quantile regression for covariates under DL, avoiding stringent parametric restrictions on censored covariates as often assumed in the literature. Under regularity conditions, we show that the estimation procedure yields uniformly consistent and asymptotically normal estimators. Simulation results demonstrate the satisfactory finite‐sample performance of the method. We also apply our method to the motivating data from a study of genetic and inflammatory markers of sepsis.
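The check (pinball) loss that underlies quantile regression in general can be sketched in a few lines. This is background on the core technique only, not the authors' censored-covariate estimator; the data values are made up for illustration:

```python
# Background sketch: the Koenker-Bassett check loss used by quantile
# regression. Minimizing total check loss over a constant recovers the
# empirical tau-quantile of the sample. Data below are purely illustrative.

def check_loss(residual, tau):
    """rho_tau(u) = u * (tau - 1{u < 0}); asymmetric absolute loss."""
    return residual * (tau - (1.0 if residual < 0 else 0.0))

def sample_quantile(data, tau):
    """Data point minimizing total check loss, i.e. a tau-th sample quantile."""
    return min(data, key=lambda q: sum(check_loss(y - q, tau) for y in data))

data = [1.0, 2.0, 3.0, 4.0, 100.0]   # outlier illustrates robustness
print(sample_quantile(data, 0.5))    # median: 3.0, unaffected by the outlier
```

In full quantile regression the constant `q` is replaced by a linear predictor `x'beta(tau)`, and the paper's contribution layers multiple imputation for covariates below detection limits on top of this loss.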

     
  5. Abstract

    In this study, we aimed to democratize access to convolutional neural networks (CNN) for segmenting cartilage volumes, generating state‐of‐the‐art results for specialized, real‐world applications in hospitals and research. Segmentation of cross‐sectional and/or longitudinal magnetic resonance (MR) images of articular cartilage facilitates both clinical management of joint damage/disease and fundamental research. Manual delineation of such images is a time‐consuming task susceptible to high intra‐ and interoperator variability and prone to errors. Thus, enabling reliable and efficient analyses of MRIs of cartilage requires automated segmentation of cartilage volumes. Two main limitations arise in the development of hospital‐ or population‐specific deep learning (DL) models for image segmentation: specialized knowledge and specialized hardware. We present a relatively easy and accessible implementation of a DL model to automatically segment MRIs of human knees with state‐of‐the‐art accuracy. In representative examples, we trained CNN models in 6‐8 h and obtained results quantitatively comparable to the state of the art for every anatomical structure. We established and evaluated our methods using two publicly available MRI data sets originating from the Osteoarthritis Initiative, Stryker Imorphics, and Zuse Institute Berlin (ZIB), as representative test cases. We used Google Colab for editing and adapting the Python code and for selecting a runtime environment leveraging high‐performance graphical processing units. We designed our solution for novice users to apply to any data set with relatively few adaptations, requiring only basic programming skills. To facilitate the adoption of our methods, we provide a complete guideline for using our methods and software, as well as the software tools themselves.
Clinical significance: We establish and detail methods that clinical personnel can apply to create their own DL models, without specialized DL knowledge or specialized hardware/infrastructure, and obtain results comparable with the state of the art, facilitating both clinical management of joint damage/disease and fundamental research.
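A standard way to quantify how well an automated segmentation matches a manual one is the Dice similarity coefficient. The sketch below illustrates that general metric only; it is not the authors' evaluation code, and the tiny binary masks are hypothetical:

```python
# Background sketch: Dice similarity coefficient for comparing an automated
# segmentation against a manual reference. Masks here are flat binary lists
# with made-up values; real use would apply this per voxel on 3D MRI masks.

def dice_coefficient(pred_mask, true_mask):
    """Dice = 2|A intersect B| / (|A| + |B|); 1.0 means perfect overlap."""
    assert len(pred_mask) == len(true_mask)
    intersection = sum(p and t for p, t in zip(pred_mask, true_mask))
    total = sum(pred_mask) + sum(true_mask)
    return 1.0 if total == 0 else 2.0 * intersection / total

pred = [1, 1, 0, 0, 1, 0]   # hypothetical model output
true = [1, 0, 0, 0, 1, 1]   # hypothetical manual delineation
print(dice_coefficient(pred, true))  # 2*2 / (3+3) = 0.666...
```

Reported "state-of-the-art" segmentation accuracy for knee cartilage is typically expressed in terms of metrics like this, computed per anatomical structure.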

     