Title: A Data-Driven Hybrid Automaton Framework to Modeling Complex Dynamical Systems
In this paper, a computationally efficient data-driven hybrid automaton model is proposed to capture the behavior of unknown complex dynamical systems using multiple neural networks. The sampled data of the system are divided by valid partitions into groups corresponding to their topologies, based on which transition guards are defined. Then, a collection of computationally efficient small-scale neural networks is trained as the local dynamical description for the corresponding topologies. After the system is modeled as a neural-network-based hybrid automaton, a set-valued reachability analysis with low computational cost is provided based on interval analysis and a split-and-combine process. Finally, a numerical example of a limit cycle is presented to illustrate that the developed model can significantly reduce the computational cost of reachable set computation without sacrificing modeling precision.
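As an illustrative sketch only (not the paper's implementation), the interval-analysis step of the reachability computation can be pictured as pushing an axis-aligned box through a small ReLU network, splitting each weight matrix into its positive and negative parts so the resulting bounds remain a sound over-approximation. All function names here are hypothetical:

```python
import numpy as np

def interval_relu_layer(lo, hi, W, b):
    # Propagate the box [lo, hi] through an affine layer followed by ReLU.
    # Splitting W into positive/negative parts keeps the bounds sound:
    # positive weights pair lo with lo and hi with hi; negative weights swap them.
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = Wp @ lo + Wn @ hi + b
    new_hi = Wp @ hi + Wn @ lo + b
    return np.maximum(new_lo, 0.0), np.maximum(new_hi, 0.0)

def reach_step(lo, hi, weights, biases):
    # Over-approximate the image of [lo, hi] under a small ReLU network,
    # i.e., one step of set-valued reachability for one local model.
    for W, b in zip(weights, biases):
        lo, hi = interval_relu_layer(lo, hi, W, b)
    return lo, hi
```

Because interval bounds can grow conservative for wide boxes, a split-and-combine scheme of the kind described above would subdivide the input box, propagate each piece, and take the union of the results.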
Award ID(s):
2223035 2143351
PAR ID:
10437761
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
2023 IEEE International Conference on Industrial Technology (ICIT)
Page Range / eLocation ID:
1 to 6
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. In this paper, a data-driven neural hybrid system modeling framework based on a Maximum Entropy partitioning approach is proposed for complex dynamical systems such as human motion dynamics. The sampled data collected from the system are partitioned into segmented data sets using the Maximum Entropy approach, and the mode transition logic is then defined. A collection of small-scale neural networks is then trained as the local dynamical description for the corresponding partitions. Given the resulting neural hybrid system model, a set-valued reachability analysis with low computational cost is provided based on interval analysis and a split-and-combine process to demonstrate the benefits of the approach in computationally expensive tasks. Finally, numerical examples of a limit cycle and of human behavior modeling are provided to demonstrate the effectiveness and efficiency of the developed methods.
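One common reading of Maximum Entropy partitioning, sketched here purely for illustration (the abstract does not specify the exact construction), is equal-frequency binning: among all partitions of the samples into k cells, equal cell occupancy maximizes the empirical entropy of the cell distribution. The helper names below are hypothetical:

```python
import numpy as np

def max_entropy_partition(samples, k):
    # Quantile (equal-frequency) bin edges over a 1-D sample set: equal
    # occupancy maximizes -sum(p_i * log p_i) over the k cells.
    qs = np.linspace(0.0, 1.0, k + 1)
    return np.quantile(np.sort(samples), qs)

def assign_modes(samples, edges):
    # Map each sample to a partition index (discrete mode) in [0, k-1];
    # these indices would label the local neural-network models.
    return np.clip(np.searchsorted(edges, samples, side="right") - 1,
                   0, len(edges) - 2)
```

Mode transition logic would then be read off from which partition consecutive samples fall into along each trajectory.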
  2. Abstract This article proposes a neural network hybrid modeling framework for dynamics learning that promotes an interpretable, computationally efficient approach to dynamics learning and system identification. First, a low-level model is trained to learn the system dynamics, using multiple simple neural networks to approximate the local dynamics generated from data-driven partitions. Then, based on the low-level model, a high-level model is trained that abstracts the low-level neural hybrid system model into a transition system, enabling computational tree logic (CTL) verification and promoting the model's ability to handle human interaction as well as verification efficiency. 
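As a minimal sketch of what CTL model checking over such an abstracted transition system involves (an assumption about the setup, not this article's code), the formula EF φ ("some state satisfying φ is reachable") reduces to a graph search over the abstract modes:

```python
from collections import deque

def ef(transitions, init, target):
    # CTL "EF target": is some state satisfying `target` reachable from `init`?
    # `transitions` maps each abstract mode to an iterable of successor modes,
    # as produced by abstracting a low-level hybrid model.
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if target(s):
            return True
        for t in transitions.get(s, ()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return False
```

Richer CTL operators (AG, EU, ...) are handled by similar fixed-point computations over the same finite transition system, which is what makes the abstraction step valuable for verification efficiency.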
  3. Deformable image registration (DIR), which aims to find spatial correspondence between images, is one of the most critical problems in the domain of medical image analysis. In this paper, we present a novel, generic, and accurate diffeomorphic image registration framework that utilizes neural ordinary differential equations (NODEs). We model each voxel as a moving particle and consider the set of all voxels in a 3D image as a high-dimensional dynamical system whose trajectory determines the targeted deformation field. Our method leverages deep neural networks for their expressive power in modeling dynamical systems, and simultaneously optimizes for a dynamical system between the image pairs and the corresponding transformation. Our formulation allows various constraints to be imposed along the transformation to maintain desired regularities. Our experimental results show that our method outperforms the benchmarks under various metrics. Additionally, we demonstrate the feasibility of expanding our framework to register multiple image sets using a unified form of transformation, which could serve a wider range of applications. 
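The voxels-as-particles idea can be sketched as integrating each point along a velocity field; in the NODE formulation the field would be a neural network and an adaptive ODE solver would replace the forward-Euler loop below. This is an illustrative sketch under those assumptions, not the paper's implementation:

```python
import numpy as np

def integrate_deformation(points, velocity, t0=0.0, t1=1.0, steps=50):
    # Treat each voxel as a particle and integrate dx/dt = v(x, t) with
    # forward Euler.  `points` is an (n, d) array; `velocity` maps
    # (positions, time) -> velocities of the same shape.
    dt = (t1 - t0) / steps
    x = points.astype(float).copy()
    for i in range(steps):
        x += dt * velocity(x, t0 + i * dt)
    return x  # deformed positions; the map points -> x is the deformation
```

Composing the flow of a smooth velocity field in this way is what yields a diffeomorphic (smooth, invertible) deformation, and regularity constraints can be imposed on the field rather than on the final map.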
  4. An adjoint sensitivity-based approach to determining the gradient and Hessian of cost functions for system identification of dynamical systems is presented. The motivation is a computationally efficient approach, relative to the direct differentiation (DD) technique, that also overcomes the step-size selection challenges of finite difference (FD) approaches. An optimization framework is used to determine the parameters of a dynamical system that minimize a sum of scalar cost functions evaluated at the discrete measurement instants. The discrete-time measurements result in discontinuities in the Lagrange multipliers. Two approaches, labeled the Adjoint and the Hybrid, are developed for calculating the gradient and Hessian for gradient-based optimization algorithms. The proposed approach is illustrated on the Lorenz 63 model, where part of the initial conditions and model parameters are estimated using synthetic data. Examples of identifying model parameters of light curves of Type Ia supernovae and of a two-tank dynamic model using publicly available data are also included. 
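To make the adjoint idea concrete, here is a minimal sketch for a toy scalar system x_{k+1} = a·x_k with cost J = ½·Σ(x_k − y_k)²; one forward pass stores the trajectory and one backward pass accumulates dJ/da, so the cost is independent of the number of parameters, unlike finite differences. This toy model and its function names are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

def simulate(a, x0, N):
    # Forward pass: x_{k+1} = a * x_k for k = 0..N-1.
    xs = [x0]
    for _ in range(N):
        xs.append(a * xs[-1])
    return np.array(xs)

def cost(a, x0, y):
    # J = 0.5 * sum_k (x_k - y_k)^2 over the measurement instants.
    xs = simulate(a, x0, len(y) - 1)
    return 0.5 * np.sum((xs - y) ** 2)

def adjoint_gradient(a, x0, y):
    # Backward (adjoint) pass: lam_k carries dJ/dx_k; each measurement
    # residual enters as a jump in the multiplier, mirroring the
    # discontinuities noted in the abstract.
    N = len(y) - 1
    xs = simulate(a, x0, N)
    lam = xs[N] - y[N]                        # terminal adjoint
    grad = 0.0
    for k in range(N, 0, -1):
        grad += lam * xs[k - 1]               # dx_k/da = x_{k-1}
        lam = a * lam + (xs[k - 1] - y[k - 1])  # backward recursion + jump
    return grad
```

A quick check against a central finite difference confirms the recursion; the Hessian would follow from a second-order adjoint of the same structure.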
  5. Cai, Ming Bo (Ed.)
    Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, thus making it crucial for context-dependent computation. Here, we use a top-down modeling approach to examine network-level mechanisms of working memory, an enigmatic issue and central topic of study in neuroscience. We optimize thousands of recurrent rate-based neural networks on a working memory task and then perform dynamical systems analysis on the ensuing optimized networks, wherein we find that four distinct dynamical mechanisms can emerge. In particular, we show the prevalence of a mechanism in which memories are encoded along slow stable manifolds in the network state space, leading to a phasic neuronal activation profile during memory periods. In contrast to mechanisms in which memories are directly encoded at stable attractors, these networks naturally forget stimuli over time. Despite this seeming functional disadvantage, they are more efficient in terms of how they leverage their attractor landscape and paradoxically, are considerably more robust to noise. Our results provide new hypotheses regarding how working memory function may be encoded within the dynamics of neural circuits. 