“Active structures” are physical structures that incorporate real-time monitoring and control; examples include active vibration damping and blast mitigation systems. Evaluating physics-based models in real time is generally not feasible for systems with high-rate dynamics, which require microsecond response times, but data-driven machine-learning models can potentially offer a solution. This paper compares the cost and performance of two FPGA-based implementations of real-time, continuously trained models for forecasting time-series signals with non-stationarities: one using High-Level Synthesis (HLS) and the other a programmable overlay architecture. The proposed model accepts a univariate vibration signal and seeks to forecast future samples to inform high-rate controllers. The forecasting method performs two concurrent neural inference operations. One inference forecasts the state of the signal f samples into the future as a function of the most recent h samples, while the other predicts the current sample from the h samples beginning h + f − 1 samples in the past. The first inference produces the forecast itself, while the second lets the system compute the model’s loss and perform an immediate model update before the next sample period.
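The dual-inference scheme above can be sketched in plain Python. This is a minimal illustration only: a single linear predictor stands in for the paper's neural network (and for its FPGA implementation), and the learning rate, window sizes, and function names are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
h, f = 16, 4                 # history length, forecast horizon
w = rng.normal(0, 0.1, h)    # linear predictor weights (stand-in for the network)
lr = 1e-2

buf = np.zeros(h + f)        # the most recent h + f samples, oldest first

def on_sample(x_new):
    """Called once per sample period: emit a forecast, then update the model."""
    global w, buf
    buf = np.roll(buf, -1)
    buf[-1] = x_new
    # Inference 1: forecast f samples ahead from the most recent h samples.
    y_future = float(w @ buf[-h:])
    # Inference 2: predict the *current* sample from the h samples that
    # began h + f - 1 samples in the past; its error is known immediately.
    y_now = float(w @ buf[:h])
    err = y_now - x_new
    # Immediate gradient update before the next sample period.
    w -= lr * err * buf[:h]
    return y_future
```

Because both inferences share one set of weights, the error of the delayed prediction serves as a training signal for the forward forecast without waiting f periods for ground truth.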
Global forecasts in reservoir computers
A reservoir computer is a machine learning model that can be used to predict the future states of time-dependent processes, e.g., dynamical systems. In practice, data in the form of an input signal are fed into the reservoir, and the trained reservoir is then used to predict the future state of this signal. We develop a new method for predicting not only the future dynamics of the input signal but also the future dynamics starting from an arbitrary initial condition of a system. The systems we consider are the Lorenz, Rössler, and Thomas systems restricted to their attractors. This method, which creates a global forecast, still uses only a single input signal to train the reservoir but breaks the signal into many smaller windowed signals. We examine how well this windowed method forecasts the dynamics of a system starting from an arbitrary point on its attractor and compare this to the standard method without windows. We find that the standard method has almost no ability to forecast anything but the original input signal, while the windowed method can capture the dynamics starting from most points on an attractor with significant accuracy.
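The windowed idea can be sketched with a small echo state network, one common realization of a reservoir computer. This is a hedged illustration, not the paper's setup: a scalar quasiperiodic signal stands in for a Lorenz coordinate, and the reservoir size, spectral radius, window length, and ridge parameter are all assumed values. The key move from the paper is that the reservoir is restarted from rest at the head of each window rather than driven once end-to-end, so the readout learns to work from many initial conditions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                                      # reservoir size (assumed)
Win = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale spectral radius below 1

def drive(r, u):
    return np.tanh(W @ r + Win * u)

# One long scalar training signal (a smooth stand-in for a chaotic coordinate).
t = np.arange(5000) * 0.02
u = np.sin(t) * np.cos(0.31 * t)

# Windowed training: restart the reservoir at the head of each window.
win, states, targets = 250, [], []
for s in range(0, len(u) - win, win):
    r = np.zeros(N)
    for k in range(s, s + win - 1):
        r = drive(r, u[k])
        if k - s > 50:                       # discard the post-restart transient
            states.append(r)
            targets.append(u[k + 1])
R, y = np.array(states), np.array(targets)
Wout = np.linalg.solve(R.T @ R + 1e-6 * np.eye(N), R.T @ y)   # ridge readout

def forecast(seed, steps):
    """Warm up on a short seed window, then run closed-loop."""
    r = np.zeros(N)
    for v in seed:
        r = drive(r, v)
    out = []
    for _ in range(steps):
        x = Wout @ r
        out.append(x)
        r = drive(r, x)
    return np.array(out)
```

A global forecast then needs only a short seed window from anywhere on the attractor, e.g. `forecast(u[1000:1100], 10)`, whereas a reservoir trained on the unbroken signal is tied to the one trajectory it saw.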
- Award ID(s): 2205837
- PAR ID: 10524534
- Publisher / Repository: Chaos: An Interdisciplinary Journal of Nonlinear Science
- Date Published:
- Journal Name: Chaos: An Interdisciplinary Journal of Nonlinear Science
- Volume: 34
- Issue: 2
- ISSN: 1054-1500
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of unknown chaotic dynamical systems. Short-term predictions of the state evolution and long-term predictions of the statistical patterns of the dynamics (“climate”) can be produced by employing a feedback loop, whereby the model is trained to predict forward one time step, then the model output is used as input for multiple time steps. In the absence of mitigating techniques, however, this technique can result in artificially rapid error growth. In this article, we systematically examine the technique of adding noise to the ML model input during training to promote stability and improve prediction accuracy. Furthermore, we introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training. Our case study uses reservoir computing, a machine-learning method using recurrent neural networks, to predict the spatiotemporal chaotic Kuramoto-Sivashinsky equation. We find that reservoir computers trained with noise or with LMNT produce climate predictions that appear to be indefinitely stable and have a climate very similar to the true system, while reservoir computers trained without regularization are unstable. Compared with other regularization techniques that yield stability in some cases, we find that both short-term and climate predictions from reservoir computers trained with noise or with LMNT are substantially more accurate. Finally, we show that the deterministic aspect of our LMNT regularization facilitates fast hyperparameter tuning when compared to training with noise.
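The noise-injection part of this idea is simple to show in code. Below is a hedged sketch only: an echo state network on a smooth scalar signal stands in for the paper's reservoir computer on the Kuramoto-Sivashinsky equation, LMNT itself (the deterministic linearized approximation) is not implemented, and the reservoir size, noise level, and ridge parameter are assumed values. The noise perturbs the model's training input so the readout learns to pull slightly-off states back toward the true trajectory, which is what stabilizes the later feedback loop.

```python
import numpy as np

rng = np.random.default_rng(2)
N, rho, sigma = 300, 0.9, 1e-3          # reservoir size, spectral radius, noise level
Win = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= rho / max(abs(np.linalg.eigvals(W)))

t = np.arange(6000) * 0.02
u = np.sin(t) + 0.5 * np.sin(0.7 * t)   # smooth stand-in for a chaotic training signal

def train_readout(noise):
    """One open-loop pass; optionally add Gaussian noise to the model input."""
    r, states = np.zeros(N), []
    for k in range(len(u) - 1):
        x = u[k] + (rng.normal(0.0, noise) if noise > 0 else 0.0)
        r = np.tanh(W @ r + Win * x)
        states.append(r.copy())
    R = np.array(states[100:])           # drop the initial transient
    y = u[101:]
    return np.linalg.solve(R.T @ R + 1e-8 * np.eye(N), R.T @ y)

Wout_noisy = train_readout(sigma)        # noise-regularized readout

def onestep_mse(Wout):
    """Open-loop one-step prediction error on the noise-free signal."""
    r, errs = np.zeros(N), []
    for k in range(len(u) - 1):
        r = np.tanh(W @ r + Win * u[k])
        if k > 100:
            errs.append((Wout @ r - u[k + 1]) ** 2)
    return float(np.mean(errs))
```

The noise enters only the input fed to the model, never the regression targets, so at small amplitudes it acts as a regularizer rather than corrupting the learned dynamics.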
-
We articulate the design imperatives for machine learning based digital twins for nonlinear dynamical systems, which can be used to monitor the “health” of the system and anticipate future collapse. The fundamental requirement for digital twins of nonlinear dynamical systems is dynamical evolution: the digital twin must be able to evolve its dynamical state at the present time to the next time step without further state input—a requirement that reservoir computing naturally meets. We conduct extensive tests using prototypical systems from optics, ecology, and climate, where the respective specific examples are a chaotic CO2 laser system, a model of phytoplankton subject to seasonality, and the Lorenz-96 climate network. We demonstrate that, with a single or parallel reservoir computer, the digital twins are capable of a variety of challenging forecasting and monitoring tasks. Our digital twin has the following capabilities: (1) extrapolating the dynamics of the target system to predict how it may respond to a changing dynamical environment, e.g., a driving signal that it has never experienced before, (2) making continual forecasting and monitoring with sparse real-time updates under non-stationary external driving, (3) inferring hidden variables in the target system and accurately reproducing/predicting their dynamical evolution, (4) adapting to external driving with different waveforms, and (5) extrapolating the global bifurcation behaviors to network systems of different sizes. These features make our digital twins appealing in applications, such as monitoring the health of critical systems and forecasting their potential collapse induced by environmental changes or perturbations. Such systems can be an infrastructure, an ecosystem, or a regional climate system.
-
Abstract A simple and efficient Bayesian machine learning (BML) training algorithm, which exploits only a 20‐year short observational time series and an approximate prior model, is developed to predict the Niño 3 sea surface temperature (SST) index. The BML forecast significantly outperforms model‐based ensemble predictions and standard machine learning forecasts. Even with a simple feedforward neural network (NN), the BML forecast is skillful for 9.5 months. Remarkably, the BML forecast overcomes the spring predictability barrier to a large extent: the forecast starting from spring remains skillful for nearly 10 months. The BML algorithm can also effectively utilize multiscale features: the BML forecast of SST using SST, thermocline, and wind burst improves on the BML forecast using just SST by at least 2 months. Finally, the BML algorithm also reduces the forecast uncertainty of NNs and is robust to input perturbations.