This content will become publicly available on May 1, 2026

Title: Predicting Milk Flow Behavior in Human Lactating Breast: An Integrated Machine Learning and Computational Fluid Dynamics Approach
This study develops a comprehensive framework that integrates computational fluid dynamics (CFD) and machine learning (ML) to predict milk flow behavior in the lactating breast. Using CFD and other high-fidelity simulation techniques to tackle fluid flow problems typically demands significant computational resources and time. Artificial neural networks (ANNs) offer a promising means of capturing complex relationships among high-dimensional variables, and this study leverages that potential to introduce a data-driven complement to CFD. CFD simulations were first used to generate the training and validation datasets. A machine learning pipeline was then built to train the ANN, various ANN architectures were explored, and their predictive performance was compared. The design of experiments method was also used to identify the minimum number of simulations needed for accurate predictions. This study underscores the synergy between CFD and ML methodologies, designated ML-CFD: the integration enables a neural network to generate CFD-like results while saving much of the time and computational resources required by traditional CFD simulations. The models developed with this ML-CFD approach are efficient and robust, enabling faster exploration of milk flow behavior in individual lactating breasts than conventional CFD solvers.
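Below is a minimal, illustrative sketch (not the authors' code) of the kind of ML-CFD pipeline the abstract describes: CFD-generated samples are loaded, several candidate ANN architectures are trained, and their validation errors are compared. The data file, feature layout, and layer sizes are assumptions.

```python
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

# Assumed layout of the CFD-generated dataset: all columns but the last are
# simulation input parameters, the last column is the flow quantity of interest.
data = np.loadtxt("cfd_samples.csv", delimiter=",", skiprows=1)  # hypothetical file
X, y = data[:, :-1], data[:, -1]
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Candidate hidden-layer layouts to compare (illustrative choices)
architectures = [(32,), (64, 64), (128, 64, 32)]
for hidden in architectures:
    layers = [tf.keras.Input(shape=(X.shape[1],))]
    layers += [tf.keras.layers.Dense(n, activation="relu") for n in hidden]
    layers += [tf.keras.layers.Dense(1)]
    model = tf.keras.Sequential(layers)
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, validation_data=(X_val, y_val),
              epochs=200, batch_size=32, verbose=0)
    print(hidden, "validation MSE:", model.evaluate(X_val, y_val, verbose=0))
```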
Award ID(s):
2121075
PAR ID:
10652256
Author(s) / Creator(s):
Publisher / Repository:
ASME Journal of Biomechanical Engineering
Date Published:
Journal Name:
Journal of Biomechanical Engineering
Volume:
147
Issue:
5
ISSN:
0148-0731
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Continuous provision of quality supply air to a data center's IT pod room is key to effective data center operation without downtime. The large number of possible operating conditions and the non-linear relations between operating parameters make data center energy use difficult to optimize. At present, industry uses computational fluid dynamics (CFD) to simulate thermal behaviour across operating conditions. The focus of this study is to predict supply air temperature using an artificial neural network (ANN), which can overcome limitations of CFD such as high cost, the need for expertise, and long computation times. In developing the ANN, the input parameters, the numbers of neurons and hidden layers, the activation function, and the period of the training data set were studied. A commercial CFD software package, 6SigmaRoom, is used to model a modular data center consisting of an IT pod room and an air-handling unit, and CFD analysis is carried out for different outside air conditions, with one year of historical weather data used as input. The ANN model is trained using data generated from these CFD results, and its predictions are compared with the CFD results for a set of example scenarios to measure agreement. The results show that the ANN model's predictions are much faster than full CFD simulations while maintaining good accuracy, demonstrating that an ANN is an effective way to predict the performance of an air-handling unit.
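As a rough illustration of the workflow described above (not the paper's implementation), the sketch below fits an MLP surrogate that maps outside-air conditions from a year of hourly weather data to a CFD-predicted supply air temperature; the file name, column names, and network size are assumptions.

```python
import time
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical file: one row per hour over a year, with outside-air conditions
# and the supply air temperature obtained from the corresponding CFD run.
df = pd.read_csv("weather_cfd_results.csv")
X = df[["outside_temp_C", "outside_rh_pct"]].to_numpy()   # assumed column names
y = df["supply_air_temp_C"].to_numpy()

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
ann.fit(scaler.transform(X), y)

t0 = time.perf_counter()
ann.predict(scaler.transform(X[:24]))                      # one day of predictions
print(f"24 ANN predictions in {time.perf_counter() - t0:.4f} s")
```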
  2. In this study, the pressure drop across air-cooled heat sinks (HSs) is predicted using an artificial neural network (ANN). A multilayer feed-forward ANN architecture with two hidden layers is developed; the backpropagation algorithm is used for training, and the accuracy of the network is evaluated by the root mean square error. The input data for training the neural network are prepared through three-dimensional simulation of the air inside the heat sink channels using a computational fluid dynamics (CFD) approach. The ANN-based model developed in this study predicts the pressure drop with high accuracy relative to the CFD-based data. The study suggests that an ANN-based model with a high level of accuracy overcomes a limitation of physics-based correlations, whose accuracy depends strongly on identifying and incorporating the key variables that govern the physics of a thermo-fluid phenomenon.
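A hedged sketch of the kind of model this abstract describes: a feed-forward ANN with two hidden layers trained on CFD-generated samples and scored by root mean square error. The data file, feature layout, and hidden-layer sizes are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical file: geometry/flow inputs in the leading columns, CFD pressure
# drop in the last column.
data = np.loadtxt("heatsink_cfd.csv", delimiter=",", skiprows=1)
X, dp = data[:, :-1], data[:, -1]
X_tr, X_te, y_tr, y_te = train_test_split(X, dp, test_size=0.2, random_state=1)

ann = MLPRegressor(hidden_layer_sizes=(20, 20),  # two hidden layers
                   solver="adam", max_iter=5000, random_state=1)
ann.fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, ann.predict(X_te)))
print("RMSE on held-out CFD data:", rmse)
```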
  3. For energy-assisted compression ignition (EACI) engine propulsion at high-altitude operating conditions using sustainable jet fuels with varying cetane numbers, it is essential to develop an efficient engine control system for robust and optimal operation. Control systems are typically trained using experimental data, which can be costly and time-consuming to generate due to the setup time of experiments, unforeseen delays and issues with manufacturing, mishaps and engine failures and the consequent repairs (which can take weeks), and errors in measurements. Computational fluid dynamics (CFD) simulations can overcome such burdens by complementing experiments with simulated data for control system training. Such simulations, however, can be computationally expensive. Existing data-driven machine learning (ML) models have shown promise for emulating the expensive CFD simulator, but encounter key limitations here due to the expensive nature of the training data and the range of differing combustion behaviors (e.g., misfires and partial or delayed ignition) observed at such broad operating conditions. We thus develop a novel physics-integrated emulator, called the Misfire-Integrated GP (MInt-GP), which integrates important auxiliary information on engine misfires within a Gaussian process surrogate model. With limited CFD training data, we show the MInt-GP model can yield reliable predictions of in-cylinder pressure evolution profiles, subsequent heat release profiles, and engine CA50 at a broad range of input conditions. We further demonstrate much better prediction capabilities of the MInt-GP across different combustion behaviors compared to existing data-driven ML models such as kriging and neural networks, while also observing up to 80 times computational speed-up over CFD, establishing its effectiveness as a tool to assist CFD for fast data generation in control system training.
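The MInt-GP itself is not reproduced here; as a rough point of reference, the sketch below builds a plain Gaussian-process surrogate of the kind the abstract compares against, with a binary misfire flag simply appended as an extra input feature. The data files, feature layout, and kernel choice are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical arrays: CFD-sampled operating conditions, a 0/1 misfire flag
# from the same runs, and the CA50 extracted from each simulated cycle.
X_ops = np.load("operating_conditions.npy")
misfire = np.load("misfire_flags.npy").reshape(-1, 1)
y_ca50 = np.load("ca50.npy")

X = np.hstack([X_ops, misfire])                       # misfire flag as an extra feature
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X, y_ca50)

mean, std = gp.predict(X[:5], return_std=True)        # predictions with uncertainty
print(mean, std)
```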
  4. Brehm, Christoph; Pandya, Shishir (Ed.)
    Computational fluid dynamics (CFD) and its uncertainty quantification are computationally expensive. We use Gaussian process (GP) methods to demonstrate that machine learning can build efficient and accurate surrogate models that replace CFD simulations at significantly reduced computational cost without compromising physical accuracy. We also demonstrate that both epistemic uncertainty (machine learning model uncertainty) and aleatory uncertainty (randomness in the inputs of CFD) can be accommodated when the machine learning model is used to reveal fluid dynamics. The demonstration is performed on simulations of Hagen-Poiseuille and Womersley flows, which involve spatial and spatial-temporal responses, respectively. Training points are generated from the analytical solutions with evenly discretized spatial or spatial-temporal variables, and GP surrogate models are then built using supervised machine learning regression. The error of the GP model is quantified by the estimated epistemic uncertainty, and the results are compared with those from GPU-accelerated volumetric lattice Boltzmann simulations. The results indicate that the surrogate models can produce accurate fluid dynamics (without CFD simulations) with quantified uncertainty when both epistemic and aleatory uncertainties exist.
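A self-contained sketch in the spirit of this abstract (parameter values are illustrative, not from the paper): a GP surrogate is trained on the analytical Hagen-Poiseuille profile u(r) = dP/(4·mu·L)·(R² − r²), its predictive standard deviation gives the epistemic uncertainty, and aleatory uncertainty is illustrated by propagating a random pressure drop through the profile.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

R, mu, L, dP = 1e-3, 1e-3, 0.1, 100.0              # pipe radius, viscosity, length, pressure drop
r_train = np.linspace(0.0, R, 15).reshape(-1, 1)   # evenly discretized radial positions
u_train = dP / (4 * mu * L) * (R**2 - r_train**2)  # analytical Hagen-Poiseuille velocity

gp = GaussianProcessRegressor(kernel=RBF(length_scale=R), normalize_y=True)
gp.fit(r_train, u_train.ravel())

r_test = np.linspace(0.0, R, 200).reshape(-1, 1)
u_mean, u_std = gp.predict(r_test, return_std=True)  # u_std: epistemic uncertainty

# Aleatory uncertainty: propagate a random pressure drop through the profile
dP_samples = np.random.default_rng(0).normal(100.0, 10.0, size=1000)
u_center = dP_samples / (4 * mu * L) * R**2           # centerline velocity samples
print(u_center.mean(), u_center.std())
```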
  5. In this work, an artificial neural network (ANN) aided vapor-liquid equilibrium (VLE) model is developed and coupled with a fully compressible computational fluid dynamics (CFD) solver to simulate the transcritical processes occurring in high-pressure liquid-fueled propulsion systems. The ANN is trained in Python using TensorFlow, optimized for inference using ONNX (Open Neural Network Exchange) Runtime, and coupled with a C++ based CFD solver. This plug-and-play methodology can be used to convert any multi-component CFD solver to simulate transcritical processes using only open-source packages, without the need for in-house VLE model development. The solver is then used to study high-pressure transcritical shock-droplet interaction in both two- and four-component systems and a turbulent temporal mixing layer (TML), where both qualitative and quantitative agreement (maximum relative error less than 5%) is shown with respect to results based on both direct evaluation and the state-of-the-art in situ adaptive tabulation (ISAT) method. The ANN method showed a 6 times speed-up over direct evaluation and a 2.2 times speed-up over the ISAT method for the two-component shock-droplet interaction case, and is faster than the ISAT method by 12 times for the four-component shock-droplet interaction. A 7 times speed-up over the ISAT method is observed for the TML case, while achieving a data compression factor of 2881. The ANN method also shows intrinsic load balancing, unlike traditional VLE solvers, and strong parallel scalability with the number of processors was observed for all three test cases. Code repository for the 0D VLE solvers and the C++ ANN interface: https://github.com/UMN-CRFEL/ANN_VLE.git.
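For illustration only, the sketch below shows the inference side of the pipeline this abstract describes: querying an exported ANN VLE model through ONNX Runtime from Python. The model file name and input layout are assumptions; the paper's actual coupling is done from a C++ CFD solver (see the linked repository).

```python
import numpy as np
import onnxruntime as ort

# Hypothetical exported model file; the real inference in the paper is driven
# from the C++ CFD solver.
sess = ort.InferenceSession("ann_vle.onnx")
inp = sess.get_inputs()[0]
print("expected input:", inp.name, inp.shape)

# One query state: pressure [Pa], temperature [K], species mole fractions
# (the ordering here is an assumption)
state = np.array([[6.0e6, 350.0, 0.7, 0.3]], dtype=np.float32)
outputs = sess.run(None, {inp.name: state})
print(outputs[0])
```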