
Title: Integrated Image-Based Computational Fluid Dynamics Modeling Software as an Instructional Tool
Abstract

Computational modeling of cardiovascular flows is becoming increasingly important in a range of biomedical applications, and understanding the fundamentals of computational modeling is important for engineering students. In addition to their purpose as research tools, integrated image-based computational fluid dynamics (CFD) platforms can be used to teach the fundamental principles involved in computational modeling and to generate interest in studying cardiovascular disease. We report the results of a study performed at five institutions designed to investigate the effectiveness of an integrated modeling platform as an instructional tool, and describe “best practices” for using an integrated modeling platform in the classroom. Use of an integrated modeling platform as an instructional tool in nontraditional educational settings (workshops, study abroad programs, outreach) is also discussed. Results of the study show statistically significant improvements in understanding after using the integrated modeling platform, suggesting that such platforms can be effective tools for teaching fundamental cardiovascular computational modeling principles.
Award ID(s):
1663671
NSF-PAR ID:
10201653
Journal Name:
Journal of Biomechanical Engineering
Volume:
142
Issue:
11
ISSN:
0148-0731
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    There is no consensus on how quickly the earth's ice sheets are melting due to global warming, nor on the ramifications to sea level rise. Due to its potential effects on coastal populations and global economies, sea level rise is a grave concern, making ice melt rates an important area of study. The ice‐sheet science community consists of two groups that perform related but distinct kinds of research: a data community, and a model building community. The data community characterizes past and current states of the ice sheets by assembling data from field and satellite observations. The modeling community forecasts the rate of ice‐sheet decline with computational models validated against observations. Although observational data and models depend on one another, these two groups are not well integrated. Better coordination between data collection efforts and modeling efforts is imperative if we are to improve our understanding of ice sheet loss rates. We present a new science gateway, GHub, a collaboration space for ice sheet scientists. This web‐accessible gateway will host datasets and modeling workflows, and provide access to codes that enable tool building by the ice sheet science community. Using GHub, we will collect and centralize existing datasets, creating data products that more completely catalog the ice sheets of Greenland and Antarctica. We will build workflows for model validation and uncertainty quantification, extending existing ice sheet models. Finally, we will host existing community codes, enabling scientists to build new tools utilizing them. With this new cyberinfrastructure, ice sheet scientists will gain integrated tools to quantify the rate and extent of sea level rise, benefitting human societies around the globe.

  2. Abstract There has been a strong need for simulation environments that are capable of modeling deep interdependencies between complex systems encountered during natural hazards, such as the interactions and coupled effects between civil infrastructure systems response, human behavior, and social policies, for improved community resilience. Coupling such complex components with an integrated simulation requires continuous data exchange between different simulators simulating separate models during the entire simulation process. This can be implemented by means of distributed simulation platforms or data passing tools. In order to provide a systematic reference for simulation tool choice and facilitate the development of compatible distributed simulators for deep interdependent study in the context of natural hazards, this article focuses on generic tools suitable for integration of simulators from different fields rather than platforms that are mainly used in specific fields. With this aim, the article provides a comprehensive review of the most commonly used generic distributed simulation platforms (Distributed Interactive Simulation (DIS), High Level Architecture (HLA), Test and Training Enabling Architecture (TENA), and Distributed Data Services (DDS)) and data passing tools (Robot Operation System (ROS) and Lightweight Communication and Marshalling (LCM)) and compares their advantages and disadvantages. Three specific limitations in existing platforms are identified from the perspective of natural hazard simulation. For mitigating the identified limitations, two platform design recommendations are provided, namely message exchange wrappers and hybrid communication, to help improve data passing capabilities in existing solutions and provide some guidance for the design of a new domain-specific distributed simulation framework.
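The "message exchange wrapper" recommendation above can be illustrated with a toy sketch: each simulator talks to its peers through a thin wrapper that normalizes messages into a common envelope, so the underlying transport (HLA, DDS, ROS, LCM, ...) can be swapped without touching simulator logic. All names and fields below are hypothetical illustrations, not taken from any of the reviewed platforms.

```python
# Toy "message exchange wrapper": simulator-specific state is packed into a
# transport-neutral JSON envelope for exchange between coupled simulators.
# Hypothetical sketch only -- not an API of HLA, DDS, ROS, or LCM.
import json
import time


class MessageWrapper:
    """Wraps simulator state into a common envelope and unpacks it on receipt."""

    def __init__(self, simulator_id):
        self.simulator_id = simulator_id

    def pack(self, topic, payload):
        # Envelope carries sender identity, topic, and a wall-clock timestamp
        # so receivers can order and attribute incoming state updates.
        envelope = {
            "sender": self.simulator_id,
            "topic": topic,
            "timestamp": time.time(),
            "payload": payload,
        }
        return json.dumps(envelope)

    @staticmethod
    def unpack(message):
        envelope = json.loads(message)
        return envelope["topic"], envelope["payload"]


if __name__ == "__main__":
    # Hypothetical exchange: an infrastructure simulator publishes a bridge's
    # damage state for a human-behavior simulator to consume.
    infra = MessageWrapper("infrastructure-sim")
    msg = infra.pack("bridge/42/status", {"damage_state": "moderate"})
    topic, payload = MessageWrapper.unpack(msg)
    print(topic, payload["damage_state"])
```

Because both sides agree only on the envelope, either simulator could be re-pointed at a different transport by changing how the packed string is delivered, which is the essence of the wrapper recommendation.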
  3. Abstract This project is funded by the US National Science Foundation (NSF) through its NSF RAPID program under the title “Modeling Corona Spread Using Big Data Analytics.” The project is a joint effort between the Department of Computer & Electrical Engineering and Computer Science at FAU and a research group from LexisNexis Risk Solutions. The novel coronavirus Covid-19 originated in China in early December 2019 and has rapidly spread to many countries around the globe, with the number of confirmed cases increasing every day. Covid-19 is officially a pandemic. It is a novel infection with serious clinical manifestations, including death, and it has reached at least 124 countries and territories. Although the ultimate course and impact of Covid-19 are uncertain, it is not merely possible but likely that the disease will produce enough severe illness to overwhelm the worldwide health care infrastructure. Emerging viral pandemics can place extraordinary and sustained demands on public health and health systems and on providers of essential community services. Modeling the Covid-19 pandemic spread is challenging. But there are data that can be used to project resource demands. Estimates of the reproductive number (R) of SARS-CoV-2 show that at the beginning of the epidemic, each infected person spreads the virus to at least two others, on average (Emanuel et al. in N Engl J Med. 2020, Livingston and Bucher in JAMA 323(14):1335, 2020). A conservatively low estimate is that 5 % of the population could become infected within 3 months. Preliminary data from China and Italy regarding the distribution of case severity and fatality vary widely (Wu and McGoogan in JAMA 323(13):1239–42, 2020). A recent large-scale analysis from China suggests that 80 % of those infected either are asymptomatic or have mild symptoms; a finding that implies that demand for advanced medical services might apply to only 20 % of the total infected.
Of patients infected with Covid-19, about 15 % have severe illness and 5 % have critical illness (Emanuel et al. in N Engl J Med. 2020). Overall, mortality ranges from 0.25 % to as high as 3.0 % (Emanuel et al. in N Engl J Med. 2020, Wilson et al. in Emerg Infect Dis 26(6):1339, 2020). Case fatality rates are much higher for vulnerable populations, such as persons over the age of 80 years (> 14 %) and those with coexisting conditions (10 % for those with cardiovascular disease and 7 % for those with diabetes) (Emanuel et al. in N Engl J Med. 2020). Overall, Covid-19 is substantially deadlier than seasonal influenza, which has a mortality of roughly 0.1 %. Public health efforts depend heavily on predicting how diseases such as those caused by Covid-19 spread across the globe. During the early days of a new outbreak, when reliable data are still scarce, researchers turn to mathematical models that can predict where people who could be infected are going and how likely they are to bring the disease with them. These computational methods use known statistical equations that calculate the probability of individuals transmitting the illness. Modern computational power allows these models to quickly incorporate multiple inputs, such as a given disease’s ability to pass from person to person and the movement patterns of potentially infected people traveling by air and land. This process sometimes involves making assumptions about unknown factors, such as an individual’s exact travel pattern. By plugging in different possible versions of each input, however, researchers can update the models as new information becomes available and compare their results to observed patterns for the illness. In this paper we describe the development of a model of Corona spread using innovative big data analytics techniques and tools. We leveraged our experience from research in modeling Ebola spread (Shaw et al. Modeling Ebola Spread and Using HPCC/KEL System. In: Big Data Technologies and Applications 2016 (pp. 347-385). Springer, Cham) to model Corona spread, obtain new results, and help reduce the number of Corona patients. We closely collaborated with LexisNexis, which is a leading US data analytics company and a member of our NSF I/UCRC for Advanced Knowledge Enablement. The lack of a comprehensive view and informative analysis of the status of the pandemic can also cause panic and instability within society. Our work proposes the HPCC Systems Covid-19 tracker, which provides a multi-level view of the pandemic with informative virus spreading indicators in a timely manner. The system embeds a classical epidemiological model known as SIR and spreading indicators based on a causal model. The data solution of the tracker is built on top of the Big Data processing platform HPCC Systems, from ingesting and tracking of various data sources to fast delivery of the data to the public. The HPCC Systems Covid-19 tracker presents the Covid-19 data on a daily, weekly, and cumulative basis, from the global level down to the county level. It also provides statistical analysis for each level, such as new cases per 100,000 population. The primary analysis, such as Contagion Risk and Infection State, is based on a causal model with a seven-day sliding window. Our work has been released as a publicly available website and has attracted a great volume of traffic. The project is open-sourced and available on GitHub. The system was developed on the LexisNexis HPCC Systems platform, which is briefly described in the paper.
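The classical SIR model mentioned above partitions a population into Susceptible, Infected, and Recovered compartments. As a minimal illustration of how such a compartmental model works (not the HPCC Systems implementation, and with hypothetical parameter values), a forward-Euler integration can be sketched as:

```python
# Minimal SIR (Susceptible-Infected-Recovered) sketch using forward Euler.
# Illustrative only: parameters below are hypothetical, not those used by
# the HPCC Systems Covid-19 tracker described above.

def simulate_sir(s0, i0, r0, beta, gamma, days, dt=1.0):
    """Integrate dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I, dR/dt = gamma*I."""
    n = s0 + i0 + r0  # total population, conserved by construction
    s, i, r = float(s0), float(i0), float(r0)
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history


if __name__ == "__main__":
    # Hypothetical parameters: R0 = beta/gamma = 2.5, mean infectious period 10 days.
    traj = simulate_sir(s0=999_000, i0=1_000, r0=0, beta=0.25, gamma=0.1, days=180)
    peak_infected = max(i for _, i, _ in traj)
    print(f"peak infected: {peak_infected:.0f}")
```

Because beta/gamma > 1 here, the infected compartment first grows and then declines as susceptibles are depleted; a production system such as the tracker would fit beta and gamma to reported case data rather than assume them.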
  4. Abstract
    This dataset contains monthly average output files from the iCAM6 simulations used in the manuscript "Enhancing understanding of the hydrological cycle via pairing of process-oriented and isotope ratio tracers," in review at the Journal of Advances in Modeling Earth Systems. A file corresponding to each of the tagged and isotopic variables used in this manuscript is included. Files are at 0.9° latitude × 1.25° longitude, and are in NetCDF format. Data from two simulations are included: 1) a simulation where the atmospheric model was "nudged" to ERA5 wind and surface pressure fields, by adding an additional tendency (see section 3.1 of associated manuscript), and 2) a simulation where the atmospheric state was allowed to freely evolve, using only boundary conditions imposed at the surface and top of atmosphere. Specific information about each of the variables provided is located in the "usage notes" section below. Associated article abstract: The hydrologic cycle couples the Earth's energy and carbon budgets through evaporation, moisture transport, and precipitation. Despite a wealth of observations and models, fundamental limitations remain in our capacity to deduce even the most basic properties of the hydrological cycle, including the spatial pattern of the residence time (RT) of water in …
  5. Abstract

    Modeling and simulation is transforming modern materials science, becoming an important tool for the discovery of new materials and material phenomena, for gaining insight into the processes that govern materials behavior, and, increasingly, for quantitative predictions that can be used as part of a design tool in full partnership with experimental synthesis and characterization. Modeling and simulation is the essential bridge from good science to good engineering, spanning from fundamental understanding of materials behavior to deliberate design of new materials technologies leveraging new properties and processes. This Roadmap presents a broad overview of the extensive impact computational modeling has had in materials science in the past few decades, and offers focused perspectives on where the path forward lies as this rapidly expanding field evolves to meet the challenges of the next few decades. The Roadmap offers perspectives on advances within disciplines as diverse as phase field methods to model mesoscale behavior and molecular dynamics methods to deduce the fundamental atomic-scale dynamical processes governing materials response, to the challenges involved in the interdisciplinary research that tackles complex materials problems where the governing phenomena span different scales of materials behavior, requiring multiscale approaches. The shift from understanding fundamental materials behavior to development of quantitative approaches to explain and predict experimental observations requires advances in the methods and practice in simulations for reproducibility and reliability, and interaction with a computational ecosystem that integrates new theory development, innovative applications, and an increasingly integrated software and computational infrastructure that takes advantage of the increasingly powerful computational methods and computing hardware.
