Title: A new approach to model reduction of nonlinear control systems using smooth orthogonal decomposition
Summary

A new approach to model order reduction of nonlinear control systems aims at developing persistent reduced order models (ROMs) that are robust to changes in the system's energy level. A multivariate analysis method called smooth orthogonal decomposition (SOD) is used to identify the dynamically relevant modal structures of the control system, and the identified SOD subspaces are used to develop persistent ROMs. The performance of the resulting SOD-based ROM is compared with that of a proper orthogonal decomposition (POD)-based ROM by evaluating their robustness to changes in the system's energy level. Results show that SOD-based ROMs remain valid over a wider range of the nonlinear control system's energy than POD-based models. In addition, SOD-based ROMs compute considerably faster than POD-based ROMs of the same order. For the considered dynamic system, SOD provides a more effective reduction in dimension and complexity than POD.
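The paper builds its ROMs by projecting the dynamics onto SOD (or, for comparison, POD) subspaces identified from response data. As a minimal sketch, assuming the standard formulations (eigen-decomposition of the snapshot covariance for POD; a generalized eigenproblem between the snapshot and velocity covariances for SOD), the two bases can be extracted from a snapshot matrix as follows; the names `X`, `dt`, and `r` are illustrative and not taken from the paper.

```python
# Hedged sketch: POD and SOD subspace identification from snapshot data.
# X is an (n_samples x n_states) snapshot matrix sampled at a fixed step dt.
import numpy as np
from scipy.linalg import eigh

def pod_basis(X, r):
    """POD: dominant eigenvectors of the snapshot covariance matrix."""
    Sigma_x = X.T @ X / X.shape[0]
    vals, vecs = np.linalg.eigh(Sigma_x)
    order = np.argsort(vals)[::-1]            # sort by captured energy
    return vecs[:, order[:r]]

def sod_basis(X, dt, r):
    """SOD: generalized eigenproblem between the covariance of the snapshots
    and the covariance of their time derivatives; large eigenvalues flag
    smooth, dynamically dominant structures."""
    V = np.gradient(X, dt, axis=0)            # finite-difference velocities
    Sigma_x = X.T @ X / X.shape[0]
    Sigma_v = V.T @ V / V.shape[0]            # assumed positive definite
    vals, vecs = eigh(Sigma_x, Sigma_v)       # Sigma_x psi = lambda Sigma_v psi
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:r]]
```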

 
NSF-PAR ID:
10453362
Author(s) / Creator(s):
 ;  
Publisher / Repository:
Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name:
International Journal of Robust and Nonlinear Control
Volume:
28
Issue:
15
ISSN:
1049-8923
Page Range / eLocation ID:
p. 4367-4381
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Summary

    In this paper, we propose a new evolve-then-filter reduced order model (EF-ROM). This is a regularized ROM (Reg-ROM), which aims to add numerical stabilization to proper orthogonal decomposition (POD) ROMs for convection-dominated flows. We also consider the Leray ROM (L-ROM). These two Reg-ROMs use explicit ROM spatial filtering to smooth (regularize) various terms in the ROMs. Two spatial filters are used: a POD projection onto a POD subspace (Proj) and a POD differential filter (DF). The four Reg-ROM/filter combinations are tested in the numerical simulation of the three-dimensional flow past a circular cylinder at a Reynolds number Re = 1000. Overall, the most accurate Reg-ROM/filter combination is EF-ROM-DF. Furthermore, the spatial filter has a higher impact on the Reg-ROM than the regularization used. Indeed, the DF generally yields better results than Proj for both the EF-ROM and the L-ROM. Finally, the CPU times of the four Reg-ROM/filter combinations are orders of magnitude lower than the CPU time of the DNS. Copyright © 2017 John Wiley & Sons, Ltd.
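    For context, the two explicit ROM spatial filters named above act on the vector of ROM (POD) coefficients. A minimal sketch follows, assuming a ROM stiffness (POD Laplacian) matrix `S_rom` and filter radius `delta`; the exact scaling used in the paper may differ.

```python
# Hedged sketch of the two explicit ROM spatial filters (Proj and DF),
# acting on a vector a of ROM coefficients. S_rom and delta are assumed inputs.
import numpy as np

def pod_projection_filter(a, r1):
    """Proj: project onto the first r1 POD modes; with an orthonormal POD
    basis this amounts to zeroing the trailing coefficients."""
    a_bar = np.zeros_like(a)
    a_bar[:r1] = a[:r1]
    return a_bar

def pod_differential_filter(a, S_rom, delta):
    """DF: solve (I + delta^2 * S_rom) a_bar = a, which damps the
    high-frequency POD content."""
    n = a.shape[0]
    return np.linalg.solve(np.eye(n) + delta**2 * S_rom, a)
```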

     
  2. Low-dimensional and computationally less-expensive reduced-order models (ROMs) have been widely used to capture the dominant behaviors of high-dimensional systems. An ROM can be obtained, using the well-known proper orthogonal decomposition (POD), by projecting the full-order model onto a subspace spanned by modal basis modes that are learned from experimental, simulated, or observational data, i.e., training data. However, the optimal basis can change with the parameter settings. When an ROM constructed using the POD basis obtained from training data is applied to new parameter settings, the model often lacks robustness against the change of parameters in design, control, and other real-time operation problems. This paper proposes to use regression trees on the Grassmann manifold to learn the mapping between parameters and the POD bases that span the low-dimensional subspaces onto which full-order models are projected. Motivated by the observation that a subspace spanned by a POD basis can be viewed as a point on the Grassmann manifold, we propose to grow a tree by repeatedly splitting tree nodes to maximize the Riemannian distance between the two subspaces spanned by the predicted POD bases on the left and right daughter nodes. Five numerical examples are presented to comprehensively demonstrate the performance of the proposed method and to compare the proposed tree-based method with the existing interpolation method for the POD basis and with the use of a global POD basis. The results show that the proposed tree-based method is capable of establishing the mapping between parameters and POD bases, and thus of adapting ROMs to new parameters.
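    The split criterion described above relies on a distance between subspaces. A minimal sketch of the Riemannian (geodesic) distance on the Grassmann manifold, computed from principal angles, is shown below; the variable names are illustrative.

```python
# Hedged sketch: geodesic distance between two r-dimensional subspaces of R^n,
# the quantity a tree split would maximize between daughter nodes.
import numpy as np

def grassmann_distance(U1, U2):
    """U1, U2: (n x r) matrices with orthonormal columns spanning the
    subspaces defined by two POD bases."""
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)  # cosines of principal angles
    theta = np.arccos(np.clip(s, -1.0, 1.0))        # principal angles
    return np.linalg.norm(theta)                    # geodesic distance on Grassmann(n, r)
```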

     
  3. Traditional linear subspace-based reduced order models (LS-ROMs) can be used to significantly accelerate simulations in which the solution space of the discretized system has a small dimension (with a fast decaying Kolmogorov n-width). However, LS-ROMs struggle to achieve speed-ups in problems whose solution space has a large dimension, such as highly nonlinear problems whose solutions have large gradients. Such an issue can be alleviated by combining nonlinear model reduction with operator learning. Over the past decade, many nonlinear manifold-based reduced order models (NM-ROMs) have been proposed. In particular, NM-ROMs based on deep neural networks (DNNs) have received increasing interest. This work takes inspiration from adaptive basis methods and specifically focuses on developing an NM-ROM based on Convolutional Neural Network-based autoencoders (CNNAE) with iteration-dependent trainable kernels. Additionally, we investigate DNN-based and quadratic operator inference strategies between latent spaces. A strategy to perform vectorized implicit time integration is also proposed. We demonstrate that the proposed CNN-based NM-ROM, combined with DNN-based operator inference, generally performs better than commonly employed strategies (in terms of prediction accuracy) on a benchmark advection-dominated problem. The method also presents a substantial gain in training speed per epoch, with a training time about one order of magnitude smaller than that of a state-of-the-art technique achieving the same level of accuracy.
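    In NM-ROMs of this kind, the decoder of a convolutional autoencoder supplies the nonlinear trial manifold onto which the dynamics are restricted. The PyTorch sketch below is only illustrative: the channel counts, kernel sizes, activations, and latent dimension are placeholders and do not reproduce the paper's architecture (which additionally uses iteration-dependent trainable kernels).

```python
# Illustrative 1D convolutional autoencoder; the decoder maps a low-dimensional
# latent vector back to the full discretized state.
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, n_grid=256, latent_dim=4):
        super().__init__()
        self.n_grid = n_grid
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5, stride=2, padding=2), nn.ELU(),
            nn.Conv1d(8, 16, kernel_size=5, stride=2, padding=2), nn.ELU(),
            nn.Flatten(),
            nn.Linear(16 * (n_grid // 4), latent_dim),
        )
        self.decoder_fc = nn.Linear(latent_dim, 16 * (n_grid // 4))
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(16, 8, kernel_size=4, stride=2, padding=1), nn.ELU(),
            nn.ConvTranspose1d(8, 1, kernel_size=4, stride=2, padding=1),
        )

    def decode(self, z):
        h = self.decoder_fc(z).view(-1, 16, self.n_grid // 4)
        return self.decoder(h)                  # latent -> full state

    def forward(self, x):
        # x: (batch, 1, n_grid) snapshot tensor
        return self.decode(self.encoder(x))     # reconstruct snapshots
```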
  4.
    Developing accurate, efficient, and robust closure models is essential in the construction of reduced order models (ROMs) for realistic nonlinear systems, which generally require drastic ROM mode truncations. We propose a deep residual neural network (ResNet) closure learning framework for ROMs of nonlinear systems. The novel ResNet-ROM framework consists of two steps: (i) In the first step, we use ROM projection to filter the given nonlinear system and construct a spatially filtered ROM. This filtered ROM is low-dimensional, but is not closed. (ii) In the second step, we use ResNet to close the filtered ROM, i.e., to model the interaction between the resolved and unresolved ROM modes. We emphasize that in the new ResNet-ROM framework, data is used only to complement classical physical modeling (i.e., only in the closure modeling component), not to completely replace it. We also note that the new ResNet-ROM is built on general ideas of spatial filtering and deep learning and is independent of (restrictive) phenomenological arguments, e.g., of eddy viscosity type. The numerical experiments for the 1D Burgers equation show that the ResNet-ROM is significantly more accurate than the standard projection ROM. The new ResNet-ROM is also more accurate and significantly more efficient than other modern ROM closure models. 
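    As a rough sketch of the closure step, assume a generic filtered-ROM right-hand side `f_rom` acting on the resolved coefficients `a`; a small residual network then supplies the correction standing in for the truncated modes. The network shape and names are placeholders, not the authors' ResNet-ROM implementation.

```python
# Hedged sketch: ROM closure modeled by a small residual network.
import torch
import torch.nn as nn

class ResNetClosure(nn.Module):
    """Maps resolved ROM coefficients a to a closure term approximating the
    effect of the unresolved modes."""
    def __init__(self, r, width=32, n_blocks=2):
        super().__init__()
        self.inp = nn.Linear(r, width)
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(width, width), nn.Tanh(), nn.Linear(width, width))
            for _ in range(n_blocks)
        )
        self.out = nn.Linear(width, r)

    def forward(self, a):
        h = torch.tanh(self.inp(a))
        for block in self.blocks:
            h = h + block(h)          # residual (skip) connections
        return self.out(h)

def closed_rom_rhs(a, f_rom, closure):
    """Right-hand side of the closed ROM: physics-based projection term
    plus the learned closure correction."""
    return f_rom(a) + closure(a)
```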