Title: Regression Trees on Grassmann Manifold for Adapting Reduced-Order Models

Low-dimensional and computationally inexpensive reduced-order models (ROMs) have been widely used to capture the dominant behaviors of high-dimensional systems. An ROM can be obtained, using the well-known proper orthogonal decomposition (POD), by projecting the full-order model onto a subspace spanned by modal basis vectors that are learned from experimental, simulated, or observational data, i.e., training data. However, the optimal basis can change with the parameter settings. When an ROM constructed using the POD basis obtained from training data is applied to new parameter settings, the model often lacks robustness against the change of parameters in design, control, and other real-time operation problems. This paper proposes to use regression trees on the Grassmann manifold to learn the mapping between parameters and the POD bases that span the low-dimensional subspaces onto which full-order models are projected. Motivated by the observation that a subspace spanned by a POD basis can be viewed as a point on the Grassmann manifold, we propose to grow a tree by repeatedly splitting tree nodes so as to maximize the Riemannian distance between the two subspaces spanned by the predicted POD bases on the left and right daughter nodes. Five numerical examples are presented to comprehensively demonstrate the performance of the proposed method and to compare the tree-based method with the existing interpolation method for POD bases and with the use of a global POD basis. The results show that the proposed tree-based method is capable of establishing the mapping between parameters and POD bases, and can thus adapt ROMs to new parameter settings.

 
Award ID(s):
2143695
NSF-PAR ID:
10467108
Author(s) / Creator(s):
;
Publisher / Repository:
American Institute of Aeronautics and Astronautics
Date Published:
Journal Name:
AIAA Journal
Volume:
61
Issue:
3
ISSN:
0001-1452
Page Range / eLocation ID:
1318 to 1333
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Traditional linear subspace-based reduced-order models (LS-ROMs) can significantly accelerate simulations in which the solution space of the discretized system has a small dimension (i.e., a fast-decaying Kolmogorov 𝑛-width). However, LS-ROMs struggle to achieve speed-ups in problems whose solution space has a large dimension, such as highly nonlinear problems whose solutions have large gradients. This issue can be alleviated by combining nonlinear model reduction with operator learning. Over the past decade, many nonlinear manifold-based reduced-order models (NM-ROMs) have been proposed. In particular, NM-ROMs based on deep neural networks (DNNs) have received increasing interest. This work takes inspiration from adaptive basis methods and specifically focuses on developing an NM-ROM based on convolutional neural network autoencoders (CNNAEs) with iteration-dependent trainable kernels. Additionally, we investigate DNN-based and quadratic operator inference strategies between latent spaces. A strategy to perform vectorized implicit time integration is also proposed. We demonstrate that the proposed CNN-based NM-ROM, combined with DNN-based operator inference, generally outperforms commonly employed strategies (in terms of prediction accuracy) on a benchmark advection-dominated problem. The method also yields substantial gains in training speed per epoch, with a training time about one order of magnitude smaller than that of a state-of-the-art technique achieving the same level of accuracy.
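Quadratic operator inference between latent spaces, mentioned in the abstract above, can be posed as a linear least-squares problem. A minimal sketch, assuming a latent model of the form dz/dt = A z + H (z ⊗ z) and using synthetic data (all names and dimensions are illustrative):

```python
import numpy as np

def fit_quadratic_latent_model(Z, dZdt):
    """Least-squares operator inference: fit dz/dt ~ A z + H (z kron z)
    from latent snapshots Z (k x m) and their time derivatives dZdt."""
    k, m = Z.shape
    Z2 = np.einsum('im,jm->ijm', Z, Z).reshape(k * k, m)  # columns z kron z
    D = np.vstack([Z, Z2])                                # data matrix
    O = np.linalg.lstsq(D.T, dZdt.T, rcond=None)[0].T     # O = [A, H]
    return O[:, :k], O[:, k:]

# Synthetic check: data generated by a known quadratic latent model.
rng = np.random.default_rng(1)
k, m = 3, 200
A_true = rng.standard_normal((k, k))
H_true = rng.standard_normal((k, k * k))
Z = rng.standard_normal((k, m))
Z2 = np.einsum('im,jm->ijm', Z, Z).reshape(k * k, m)
dZdt = A_true @ Z + H_true @ Z2

A, H = fit_quadratic_latent_model(Z, dZdt)
# The Kronecker features are redundant (z_i z_j = z_j z_i), so A, H need
# not match A_true, H_true entrywise, but the fitted model is exact:
print(np.allclose(A @ Z + H @ Z2, dZdt))  # True
```

In practice the latent snapshots would come from the autoencoder and the derivatives from finite differences of the latent trajectories; the sketch only shows the regression step.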
  2. Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. The former finds a low-dimensional linear approximation of the data in finite dimensions, whereas the latter typically operates in an infinite-dimensional reproducing kernel Hilbert space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both situations, as well as in the robust PCA case, that amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by K-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA that is substantially faster. Further, we present a novel online version of KPCA using our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
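For reference, the classical principal subspace that the intrinsic Grassmann average is shown to recover can be computed via the SVD of the centered data; a minimal sketch with synthetic anisotropic Gaussian data (the data and dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, K = 500, 6, 2
# Anisotropic Gaussian data with two dominant directions.
X = rng.standard_normal((n, d)) @ np.diag([5.0, 3.0, 1.0, 0.5, 0.3, 0.1])
Xc = X - X.mean(axis=0)

# Principal K-dimensional subspace via SVD of the centered data matrix.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
U_pca = Vt[:K].T                      # d x K orthonormal basis

# Cross-check against the top eigenvectors of the sample covariance:
# the two bases span the same subspace (all principal angles ~ 0).
w, V = np.linalg.eigh(Xc.T @ Xc / (n - 1))
U_eig = V[:, np.argsort(w)[::-1][:K]]
s = np.linalg.svd(U_pca.T @ U_eig, compute_uv=False)
print(np.allclose(s, 1.0))            # True
```

The paper's contribution is computing this subspace as a Grassmann average of observation-spanned subspaces (and its kernelized and online variants); the sketch only shows the classical target of that average.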
  3. State estimation is key to both analysing physical mechanisms and enabling real-time control of fluid flows. A common estimation approach is to relate sensor measurements to a reduced state governed by a reduced-order model (ROM). (When desired, the full state can be recovered via the ROM.) Current methods in this category nearly always use a linear model to relate the sensor data to the reduced state, which often leads to restrictions on sensor locations and has inherent limitations in representing the generally nonlinear relationship between the measurements and reduced state. We propose an alternative methodology whereby a neural network architecture is used to learn this nonlinear relationship. A neural network is a natural choice for this estimation problem, as a physical interpretation of the reduced state–sensor measurement relationship is rarely obvious. The proposed estimation framework is agnostic to the ROM employed, and can be incorporated into any choice of ROMs derived on a linear subspace (e.g. proper orthogonal decomposition) or a nonlinear manifold. The proposed approach is demonstrated on a two-dimensional model problem of separated flow around a flat plate, and is found to outperform common linear estimation alternatives. 
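The linear estimation baseline described above, which relates sensor measurements to the reduced state, can be written as a least-squares problem. A minimal sketch, assuming a POD-type basis and point sensors (all names, dimensions, and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n, r, p = 200, 5, 12                  # full dim, reduced dim, sensor count
Phi, _ = np.linalg.qr(rng.standard_normal((n, r)))   # POD-like basis
sensors = rng.choice(n, size=p, replace=False)       # point-sensor locations

a_true = rng.standard_normal(r)
x = Phi @ a_true                      # a state lying in the POD subspace
y = x[sensors]                        # sensor measurements y = C x

# Linear estimate: a_hat = (C Phi)^+ y, then reconstruct x_hat = Phi a_hat.
a_hat = np.linalg.pinv(Phi[sensors, :]) @ y
x_hat = Phi @ a_hat
print(np.allclose(x_hat, x))          # True when x lies in span(Phi)
```

The paper's point is that real measurement-to-state relationships are generally nonlinear, motivating replacing the pseudoinverse map above with a learned neural network.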
  4. Designing and/or controlling complex systems in science and engineering relies on appropriate mathematical modeling of system dynamics. Classical differential-equation-based solutions in applied and computational mathematics are often computationally demanding. Recently, the connection between reduced-order models of high-dimensional differential equation systems and surrogate machine learning models has been explored. However, the focus of both existing reduced-order and machine learning models for complex systems has been how best to approximate the high-fidelity model of choice. Due to high complexity and often limited training data for deriving reduced-order or machine learning surrogate models, it is critical for the derived models to also have reliable uncertainty quantification. In this paper, we propose such a framework of Bayesian reduced-order models naturally equipped with uncertainty quantification, as it learns the distributions of the parameters of the reduced-order models instead of their point estimates. In particular, we develop learnable Bayesian proper orthogonal decomposition (BayPOD), which learns the distributions of both the POD projection bases and the mapping from the system input parameters to the projected scores/coefficients, so that the learned BayPOD can help predict high-dimensional system dynamics/fields as quantities of interest in different setups with reliable uncertainty estimates. The developed learnable BayPOD inherits the capability of embedding physics constraints when learning POD-based surrogate reduced-order models, a desirable feature when studying complex systems in science and engineering applications where the available training data are limited. Furthermore, the proposed BayPOD method is an end-to-end solution, which, unlike other surrogate-based methods, does not require separate POD and machine learning steps.
Results from a real-world case study of the pressure field around an airfoil illustrate the proposed framework.
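As a point of reference for what BayPOD makes Bayesian, the corresponding point-estimate surrogate (a POD basis plus a regression from system parameters to the projected coefficients) can be sketched as follows. The linear parameter-to-coefficient map and the synthetic data are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, r = 300, 40, 3                  # field dim, snapshots, POD modes
mu = rng.uniform(0.0, 1.0, size=(m, r))        # r input parameters per run
modes, _ = np.linalg.qr(rng.standard_normal((n, r)))
B = rng.standard_normal((r, r))       # true (linear) param-to-coeff map
X = modes @ (mu @ B).T                # snapshot matrix, n x m

# (1) POD basis from the snapshots.
U, sv, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]
A = (Phi.T @ X).T                     # projected coefficients, m x r

# (2) Point-estimate regression: parameters -> projected coefficients.
W = np.linalg.lstsq(mu, A, rcond=None)[0]

# Predict the field at new parameters.
mu_new = np.array([[0.3, 0.7, 0.5]])
x_pred = Phi @ (mu_new @ W).ravel()
x_true = modes @ (mu_new @ B).ravel()
print(np.allclose(x_pred, x_true))    # True for this exactly linear setup
```

BayPOD replaces the point estimates of both the basis and the parameter-to-coefficient map with learned distributions, which is what yields the uncertainty estimates the abstract emphasizes.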
  5. Summary

In this paper, we propose a new evolve‐then‐filter reduced order model (EF‐ROM). This is a regularized ROM (Reg‐ROM), which aims to add numerical stabilization to proper orthogonal decomposition (POD) ROMs for convection‐dominated flows. We also consider the Leray ROM (L‐ROM). These two Reg‐ROMs use explicit ROM spatial filtering to smooth (regularize) various terms in the ROMs. Two spatial filters are used: a POD projection onto a POD subspace (Proj) and a POD differential filter (DF). The four Reg‐ROM/filter combinations are tested in the numerical simulation of the three‐dimensional flow past a circular cylinder at a Reynolds number Re = 1000. Overall, the most accurate Reg‐ROM/filter combination is EF‐ROM‐DF. Furthermore, the spatial filter has a higher impact on the Reg‐ROM than the regularization used. Indeed, the DF generally yields better results than Proj for both the EF‐ROM and L‐ROM. Finally, the CPU times of the four Reg‐ROM/filter combinations are orders of magnitude lower than the CPU time of the DNS. Copyright © 2017 John Wiley & Sons, Ltd.
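A differential filter of the kind the DF regularization uses solves (I − δ²Δ)ū = u for the filtered field ū. A minimal 1D finite-difference sketch; the grid, boundary treatment, and filter radius δ are illustrative choices, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(5)
N, L, delta = 200, 1.0, 0.05
x = np.linspace(0.0, L, N)
h = x[1] - x[0]
u = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(N)  # noisy field

# Tridiagonal operator I - delta^2 * D2, with D2 the central
# second-difference approximation of the Laplacian (Dirichlet BCs).
main = (1.0 + 2.0 * delta**2 / h**2) * np.ones(N)
off = (-delta**2 / h**2) * np.ones(N - 1)
Afilt = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
u_bar = np.linalg.solve(Afilt, u)

# The filter attenuates every mode: Afilt is SPD with eigenvalues > 1.
print(np.linalg.norm(u_bar) < np.linalg.norm(u))  # True
```

In the ROM setting the same idea is applied in the POD coordinate system, smoothing the terms being regularized while preserving the large-scale structure.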

     