-
More than three decades ago, Boyd and Balakrishnan established a regularity result for the two-norm of a transfer function at maximizers. Their result extends easily to the statement that the maximum eigenvalue of a univariate real analytic Hermitian matrix family is twice continuously differentiable, with Lipschitz second derivative, at all local maximizers, a property that is useful in several applications that we describe. We also investigate whether this smoothness property extends to max functions more generally. We show that the pointwise maximum of a finite set of $q$-times continuously differentiable univariate functions must have zero derivative at a maximizer for $q=1$, but arbitrarily close to the maximizer, the derivative may not be defined, even when $q=3$ and the maximizer is isolated.
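A small numerical sketch (my own illustration, not taken from the paper) of the first result: for a 2x2 real analytic symmetric family whose eigenvalues coalesce at t = 0, which is a local maximizer of the largest eigenvalue, centered finite differences suggest a well-defined, bounded second derivative at the maximizer. The specific family A(t) below is an assumed example.

```python
# Illustrative sketch: largest eigenvalue of a real analytic symmetric family
# A(t) whose eigenvalues coalesce at t = 0, a local maximizer of lambda_max.
# Here lambda_max(t) = -t^2 + |t|^3, so its second derivative is -2 + 6|t|:
# continuous and Lipschitz at t = 0, but not differentiable there.
import numpy as np

def A(t):
    return np.array([[-t**2, t**3],
                     [t**3, -t**2]])

def lam_max(t):
    # eigvalsh returns eigenvalues in ascending order
    return np.linalg.eigvalsh(A(t))[-1]

h = 1e-4
for t in (0.0, 0.01, -0.01):
    d2 = (lam_max(t + h) - 2.0 * lam_max(t) + lam_max(t - h)) / h**2
    print(f"t = {t:+.2f}: second difference of lambda_max ~ {d2:.4f}")
# The second differences stay near -2 around t = 0, consistent with lambda_max
# being twice continuously differentiable at the local maximizer.
```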
-
This review discusses Operator Inference, a nonintrusive reduced modeling approach that incorporates physical governing equations by defining a structured polynomial form for the reduced model, and then learns the corresponding reduced operators from simulated training data. The polynomial model form of Operator Inference is sufficiently expressive to cover a wide range of nonlinear dynamics found in fluid mechanics and other fields of science and engineering, while still providing efficient reduced model computations. The learning steps of Operator Inference are rooted in classical projection-based model reduction; thus, some of the rich theory of model reduction can be applied to models learned with Operator Inference. This connection to projection-based model reduction theory offers a pathway toward deriving error estimates and gaining insights to improve predictions. Furthermore, through formulations of Operator Inference that preserve Hamiltonian and other structures, important physical properties such as energy conservation can be guaranteed in the predictions of the reduced model beyond the training horizon. This review illustrates key computational steps of Operator Inference through a large-scale combustion example.
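A minimal sketch of the regression step described above, assuming a quadratic reduced model of the form dq/dt ~ c + A q + H (q kron q). The arrays Q and dQ are placeholders for reduced states and their time derivatives; a full workflow would first project high-dimensional simulation snapshots onto a low-dimensional basis to obtain them.

```python
# Operator Inference regression sketch: fit constant, linear, and quadratic
# reduced operators to snapshot data by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
r, k = 4, 200                      # reduced dimension, number of snapshots
Q  = rng.standard_normal((r, k))   # reduced state snapshots (placeholder data)
dQ = rng.standard_normal((r, k))   # corresponding time derivatives (placeholder)

# Data matrix with constant, linear, and quadratic (Kronecker) features.
ones = np.ones((1, k))
quad = np.vstack([np.kron(Q[:, j], Q[:, j]) for j in range(k)]).T   # (r^2, k)
D    = np.vstack([ones, Q, quad]).T                                 # (k, 1 + r + r^2)

# Solve min || D O^T - dQ^T || for the stacked operators; in practice a
# regularization term is often added to this least-squares problem.
O, *_ = np.linalg.lstsq(D, dQ.T, rcond=None)
c = O[:1].T          # (r, 1)   constant term
A = O[1:1 + r].T     # (r, r)   linear operator
H = O[1 + r:].T      # (r, r^2) quadratic operator
print(A.shape, H.shape, c.shape)
```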
-
This work introduces a data-driven control approach for stabilizing high-dimensional dynamical systems from scarce data. The proposed context-aware controller inference approach is based on the observation that controllers need to act locally only on the unstable dynamics to stabilize systems. This means it is sufficient to learn the unstable dynamics alone, which are typically confined to much lower dimensional spaces than the high-dimensional state spaces of the full system dynamics, so few data samples suffice to identify them. Numerical experiments demonstrate that context-aware controller inference learns stabilizing controllers from orders of magnitude fewer data samples than traditional data-driven control techniques and variants of reinforcement learning. The experiments further show that the low data requirements of context-aware controller inference are especially beneficial in data-scarce engineering problems with complex physics, for which learning complete system dynamics is often intractable in terms of data and training costs.
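A self-contained toy sketch of the underlying observation, not the paper's algorithm (the system, the assumed-known unstable coordinates, and the gain computation are all illustrative assumptions): when the unstable dynamics of a large linear system are confined to a low-dimensional block, a handful of samples restricted to that block suffices to identify it and to compute a feedback that stabilizes the full system.

```python
# Toy illustration: identify only the low-dimensional unstable dynamics from a
# few samples and stabilize them; the full closed loop inherits stability.
import numpy as np

rng = np.random.default_rng(1)
n, r = 200, 2                        # full state dimension, unstable dimension

# Full discrete-time system: small unstable block (top-left) + stable remainder.
A = np.zeros((n, n))
A[:r, :r] = np.array([[1.1, 0.2], [0.0, 1.05]])       # unstable block
A[r:, r:] = np.diag(rng.uniform(0.05, 0.5, n - r))    # stable remainder
B = np.zeros((n, 1)); B[0, 0] = 1.0; B[1, 0] = 0.5

# A few input/state samples restricted to the (here assumed known) unstable
# coordinates; far fewer than identifying all n-dimensional dynamics would need.
k = 6
X  = rng.standard_normal((r, k))
U  = rng.standard_normal((1, k))
Xp = A[:r, :r] @ X + B[:r] @ U

# Identify the reduced unstable dynamics (Ar, Br) by least squares.
Theta, *_ = np.linalg.lstsq(np.vstack([X, U]).T, Xp.T, rcond=None)
Ar, Br = Theta[:r].T, Theta[r:].T

# Compute a stabilizing gain for the identified r-dimensional block via a
# simple Riccati value iteration (Q = I, R = 1).
P = np.eye(r)
for _ in range(200):
    K = np.linalg.solve(1.0 + Br.T @ P @ Br, Br.T @ P @ Ar)
    P = Ar.T @ P @ (Ar - Br @ K) + np.eye(r)

# Apply the feedback only on the unstable coordinates of the full system.
K_full = np.hstack([K, np.zeros((1, n - r))])
rho = max(abs(np.linalg.eigvals(A - B @ K_full)))
print(f"closed-loop spectral radius ~ {rho:.3f}")   # expected < 1
```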
