Title: Regression Trees on Grassmann Manifold for Adapting Reduced-Order Models
Low-dimensional and computationally inexpensive reduced-order models (ROMs) have been widely used to capture the dominant behaviors of high-dimensional systems. An ROM can be obtained, using the well-known proper orthogonal decomposition (POD), by projecting the full-order model onto a subspace spanned by basis modes learned from experimental, simulated, or observational data, i.e., training data. However, the optimal basis can change with the parameter settings. When an ROM constructed from the POD basis of the training data is applied to new parameter settings, it often lacks robustness against the change of parameters in design, control, and other real-time operation problems. This paper proposes regression trees on the Grassmann manifold to learn the mapping between parameters and the POD bases that span the low-dimensional subspaces onto which full-order models are projected. Motivated by the observation that a subspace spanned by a POD basis can be viewed as a point on the Grassmann manifold, we propose to grow a tree by repeatedly splitting a tree node so as to maximize the Riemannian distance between the two subspaces spanned by the predicted POD bases of the left and right daughter nodes. Five numerical examples comprehensively demonstrate the performance of the proposed method and compare the tree-based method to the existing interpolation method for POD bases and to the use of a global POD basis. The results show that the proposed tree-based method is capable of establishing the mapping between parameters and POD bases, and can thus adapt ROMs to new parameters.
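The two computations at the heart of this construction lend themselves to a short illustration. The NumPy sketch below is not the authors' code; the function names and the pooled-basis split score are illustrative assumptions. It extracts a POD basis from a snapshot matrix via the SVD and measures the geodesic distance between two such subspaces through their principal angles, the quantity the tree-splitting criterion maximizes.

```python
import numpy as np

def pod_basis(snapshots: np.ndarray, r: int) -> np.ndarray:
    """Leading r left singular vectors of a snapshot matrix: the POD modes."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def grassmann_distance(U1: np.ndarray, U2: np.ndarray) -> float:
    """Geodesic distance on the Grassmann manifold via principal angles."""
    # Singular values of U1^T U2 are the cosines of the principal angles.
    cosines = np.linalg.svd(U1.T @ U2, compute_uv=False)
    angles = np.arccos(np.clip(cosines, 0.0, 1.0))
    return float(np.linalg.norm(angles))

def split_score(left_snapshots, right_snapshots, r):
    """Hypothetical split score: the Riemannian distance between the subspaces
    spanned by POD bases pooled over each daughter node's training runs."""
    U_left = pod_basis(np.hstack(left_snapshots), r)
    U_right = pod_basis(np.hstack(right_snapshots), r)
    return grassmann_distance(U_left, U_right)
```

Under these assumptions, growing the tree amounts to scanning candidate parameter splits and keeping the one with the largest score; at prediction time, a new parameter point is routed to a leaf and assigned that leaf's POD basis.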
Award ID(s):
2143695
PAR ID:
10467108
Author(s) / Creator(s):
Publisher / Repository:
American Institute of Aeronautics and Astronautics
Date Published:
Journal Name:
AIAA Journal
Volume:
61
Issue:
3
ISSN:
0001-1452
Page Range / eLocation ID:
1318 to 1333
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Traditional linear subspace-based reduced order models (LS-ROMs) can be used to significantly accelerate simulations in which the solution space of the discretized system has a small dimension (with a fast-decaying Kolmogorov n-width). However, LS-ROMs struggle to achieve speed-ups in problems whose solution space has a large dimension, such as highly nonlinear problems whose solutions have large gradients. Such an issue can be alleviated by combining nonlinear model reduction with operator learning. Over the past decade, many nonlinear manifold-based reduced order models (NM-ROMs) have been proposed. In particular, NM-ROMs based on deep neural networks (DNNs) have received increasing interest. This work takes inspiration from adaptive basis methods and specifically focuses on developing an NM-ROM based on convolutional neural network autoencoders (CNNAEs) with iteration-dependent trainable kernels. Additionally, we investigate DNN-based and quadratic operator inference strategies between latent spaces. A strategy to perform vectorized implicit time integration is also proposed. We demonstrate that the proposed CNN-based NM-ROM, combined with DNN-based operator inference, generally performs better than commonly employed strategies (in terms of prediction accuracy) on a benchmark advection-dominated problem. The method also offers a substantial gain in training speed per epoch, with a training time roughly one order of magnitude smaller than that of a state-of-the-art technique achieving the same level of accuracy.
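As a rough illustration of the ingredients this abstract names, the PyTorch sketch below pairs a small 1-D convolutional autoencoder with a DNN latent-space map. It is a minimal sketch under assumed sizes (snapshots of length 256, latent dimension 8), not the paper's architecture: the iteration-dependent trainable kernels and the quadratic operator inference are not reproduced here.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 8):
        super().__init__()
        # Encoder: 256 -> 64 -> 16 spatial points, then flatten to the latent state.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5, stride=4, padding=2), nn.ELU(),
            nn.Conv1d(8, 16, kernel_size=5, stride=4, padding=2), nn.ELU(),
            nn.Flatten(),
            nn.Linear(16 * 16, latent_dim),
        )
        # Decoder mirrors the encoder back to full resolution.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16 * 16), nn.ELU(),
            nn.Unflatten(1, (16, 16)),
            nn.ConvTranspose1d(16, 8, kernel_size=4, stride=4), nn.ELU(),
            nn.ConvTranspose1d(8, 1, kernel_size=4, stride=4),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# A small DNN standing in for the latent-space operator inference step.
latent_step = nn.Sequential(nn.Linear(8, 32), nn.ELU(), nn.Linear(32, 8))

x = torch.randn(4, 1, 256)                # four stand-in snapshots
model = ConvAutoencoder()
x_hat = model(x)                          # reconstruction, shape (4, 1, 256)
z_next = latent_step(model.encoder(x))    # one latent-dynamics step
```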
  2. Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) are fundamental dimensionality-reduction methods in machine learning. The former finds a low-dimensional linear approximation of the data in finite dimensions; the latter does so in an often infinite-dimensional Reproducing Kernel Hilbert Space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both situations, as well as in the robust PCA case, that amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by K-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA that is substantially faster. Further, we present a novel online version of KPCA using our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
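A toy version of the central claim is easy to set up. The NumPy sketch below uses the simpler extrinsic (chordal) average of subspace projection matrices as a stand-in for the paper's intrinsic Grassmann average, and checks that the averaged subspace of K-tuple spans aligns with the leading principal components of Gaussian data; all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, K = 5, 300, 2                       # ambient dim, samples, tuple size
cov = np.diag([5.0, 2.0, 1.0, 0.5, 0.1])  # distinct variances -> clear PCs
X = rng.multivariate_normal(np.zeros(d), cov, size=n)
X -= X.mean(axis=0)                       # center, as PCA assumes

# Average the projection matrices of subspaces spanned by disjoint K-tuples.
num_tuples = n // K
P_mean = np.zeros((d, d))
for i in range(num_tuples):
    Q, _ = np.linalg.qr(X[i * K:(i + 1) * K].T)  # orthonormal basis of the span
    P_mean += Q @ Q.T / num_tuples

# Top-K eigenvectors of the averaged projector: the (extrinsic) average subspace.
_, eigvecs = np.linalg.eigh(P_mean)
avg_subspace = eigvecs[:, -K:]

# Compare with the leading PCA directions: the singular values below are the
# cosines of the principal angles between the two subspaces, close to 1.
U, _, _ = np.linalg.svd(X.T, full_matrices=False)
print(np.linalg.svd(avg_subspace.T @ U[:, :K], compute_uv=False))
```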
  3. There are two main strategies for improving the accuracy of projection-based reduced order models (ROMs): (i) improving the ROM, that is, adding new terms to the standard ROM; and (ii) improving the ROM basis, that is, constructing ROM bases that yield more accurate ROMs. In this paper, we use the latter. We propose two new Lagrangian inner products that we use together with Eulerian and Lagrangian data to construct two new Lagrangian ROMs, which we denote α-ROM and λ-ROM. We show that both Lagrangian ROMs are more accurate than the standard Eulerian ROMs, that is, ROMs that use the standard Eulerian inner product and data to construct the ROM basis. Specifically, for the quasi-geostrophic equations, we show that the new Lagrangian ROMs are more accurate than the standard Eulerian ROMs in approximating not only Lagrangian fields (e.g., the finite-time Lyapunov exponent (FTLE)) but also Eulerian fields (e.g., the streamfunction). In particular, the α-ROM can be orders of magnitude more accurate than the standard Eulerian ROMs. We emphasize that the new Lagrangian ROMs do not employ any closure modeling to model the effect of discarded modes (which is standard procedure for low-dimensional ROMs of complex nonlinear systems). Thus, the dramatic increase in the new Lagrangian ROMs' accuracy is entirely due to the novel Lagrangian inner products used to build the Lagrangian ROM basis.
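The abstract's key move, changing the inner product used to build the basis, can be illustrated generically. In the NumPy sketch below the weight matrix M is a placeholder, since the abstract does not specify the α- and λ-inner products; it computes POD modes orthonormal with respect to an arbitrary symmetric positive-definite inner product via a Cholesky change of variables.

```python
import numpy as np

def weighted_pod(snapshots: np.ndarray, M: np.ndarray, r: int) -> np.ndarray:
    """POD modes orthonormal in the M-inner product: Phi^T M Phi = I."""
    L = np.linalg.cholesky(M)                  # M = L L^T, assumed SPD
    U, _, _ = np.linalg.svd(L.T @ snapshots, full_matrices=False)
    return np.linalg.solve(L.T, U[:, :r])      # map back: Phi = L^{-T} U_r

# Sanity check with a stand-in diagonal weight (e.g., quadrature weights).
d, n, r = 50, 20, 5
S = np.random.default_rng(1).standard_normal((d, n))
M = np.diag(np.linspace(0.5, 2.0, d))
Phi = weighted_pod(S, M, r)
assert np.allclose(Phi.T @ M @ Phi, np.eye(r), atol=1e-8)
```

With M equal to the identity this reduces to standard Eulerian POD, which is the precise sense in which the basis, not the ROM equations, is being changed.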
  4. State estimation is key to both analysing physical mechanisms and enabling real-time control of fluid flows. A common estimation approach is to relate sensor measurements to a reduced state governed by a reduced-order model (ROM). (When desired, the full state can be recovered via the ROM.) Current methods in this category nearly always use a linear model to relate the sensor data to the reduced state, which often leads to restrictions on sensor locations and has inherent limitations in representing the generally nonlinear relationship between the measurements and reduced state. We propose an alternative methodology whereby a neural network architecture is used to learn this nonlinear relationship. A neural network is a natural choice for this estimation problem, as a physical interpretation of the reduced state–sensor measurement relationship is rarely obvious. The proposed estimation framework is agnostic to the ROM employed, and can be incorporated into any choice of ROMs derived on a linear subspace (e.g. proper orthogonal decomposition) or a nonlinear manifold. The proposed approach is demonstrated on a two-dimensional model problem of separated flow around a flat plate, and is found to outperform common linear estimation alternatives.
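A minimal version of such an estimator is a few lines of PyTorch. In the sketch below, the sensor count, reduced dimension, network width, and training data are all illustrative assumptions; the point is only the shape of the learning problem: a nonlinear map from sensor measurements to reduced-state (e.g., POD) coefficients.

```python
import torch
import torch.nn as nn

m, r = 10, 4                                   # sensors, reduced-state dim
net = nn.Sequential(nn.Linear(m, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, r))

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
sensors = torch.randn(256, m)                  # stand-in sensor measurements
a_true = torch.randn(256, r)                   # stand-in reduced states

for _ in range(100):                           # fit sensors -> reduced state
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(sensors), a_true)
    loss.backward()
    opt.step()

# Full state recovered via the ROM basis Phi (d x r): x_hat = Phi @ a_hat.
```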