
Award ID contains: 1924548


  1. Abstract

    We prove, under mild conditions, the convergence of a Riemannian gradient descent method for a hyperbolic neural network regression model, in both the batch and stochastic gradient descent settings. We also discuss a Riemannian version of the Adam algorithm. We present numerical simulations of these algorithms on several benchmarks.

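The Riemannian gradient descent in this abstract can be sketched in the Poincaré ball model of hyperbolic space, where the Riemannian gradient is the Euclidean gradient rescaled by the inverse squared conformal factor. This is a minimal illustrative sketch, not the paper's model: the projection-based retraction and the toy squared-distance objective are assumptions chosen for brevity.

```python
import numpy as np

def riemannian_grad_poincare(x, egrad):
    # On the Poincaré ball the metric is conformal to the Euclidean one
    # with factor lambda(x) = 2 / (1 - ||x||^2); the Riemannian gradient
    # rescales the Euclidean gradient by 1 / lambda(x)^2.
    scale = (1.0 - np.dot(x, x)) ** 2 / 4.0
    return scale * egrad

def rgd_step(x, egrad, lr=0.5, eps=1e-5):
    # One Riemannian gradient descent step with a simple retraction:
    # move against the Riemannian gradient, then project back into the ball.
    x_new = x - lr * riemannian_grad_poincare(x, egrad)
    norm = np.linalg.norm(x_new)
    if norm >= 1.0:
        x_new = x_new / norm * (1.0 - eps)
    return x_new

# Toy objective: Euclidean squared distance to a fixed target in the ball.
target = np.array([0.3, 0.4])
x = np.zeros(2)
for _ in range(200):
    egrad = 2.0 * (x - target)  # Euclidean gradient of ||x - target||^2
    x = rgd_step(x, egrad)
```

A stochastic variant would replace `egrad` with a minibatch gradient estimate; the Riemannian Adam mentioned in the abstract would additionally maintain moment estimates transported along the iterates.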
  2. Abstract

     In this paper, we present a convergence analysis of a robust stochastic structure-preserving Lagrangian numerical scheme for computing the effective diffusivity of time-dependent chaotic flows, which are modeled by stochastic differential equations (SDEs). Our numerical scheme is based on a splitting method for the corresponding SDEs, in which the deterministic subproblem is discretized using a structure-preserving scheme while the random subproblem is discretized using the Euler-Maruyama scheme. We obtain a sharp, uniform-in-time convergence analysis for the proposed scheme, which allows us to accurately compute long-time solutions of the SDEs and hence the effective diffusivity of time-dependent chaotic flows. Finally, we present numerical results demonstrating the accuracy and efficiency of the proposed method in computing the effective diffusivity of the time-dependent Arnold-Beltrami-Childress (ABC) flow and the Kolmogorov flow in three-dimensional space.
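The splitting idea in this abstract can be sketched for the (time-independent) ABC flow. Below is an illustrative sketch under stated assumptions, not the authors' scheme: the deterministic substep uses a sequential Lie splitting (each velocity component of the ABC flow is independent of its own coordinate, so every sub-update is a volume-preserving shear), the random substep is Euler-Maruyama, and `effective_diffusivity` is a naive Monte Carlo estimate via mean-squared displacement; the parameters are arbitrary.

```python
import numpy as np

def abc_splitting_step(P, dt, sigma, rng, A=1.0, B=1.0, C=1.0):
    """One step of a splitting scheme for dX = v_ABC(X) dt + sigma dW."""
    X, Y, Z = P[:, 0].copy(), P[:, 1].copy(), P[:, 2].copy()
    # Deterministic substep: update components sequentially. Each update
    # is a shear (the i-th velocity component does not depend on the i-th
    # coordinate), so the composed deterministic map preserves volume.
    X += dt * (A * np.sin(Z) + C * np.cos(Y))
    Y += dt * (B * np.sin(X) + A * np.cos(Z))
    Z += dt * (C * np.sin(Y) + B * np.cos(X))
    # Random substep: Euler-Maruyama for the Brownian forcing.
    P = np.stack([X, Y, Z], axis=1)
    return P + sigma * np.sqrt(dt) * rng.standard_normal(P.shape)

def effective_diffusivity(n_paths=2000, n_steps=2000, dt=0.01,
                          sigma=0.5, seed=0):
    # Monte Carlo estimate: D_eff ~ <|X_T - X_0|^2> / (2 d T) with d = 3.
    rng = np.random.default_rng(seed)
    P0 = rng.uniform(0.0, 2.0 * np.pi, size=(n_paths, 3))
    P = P0.copy()
    for _ in range(n_steps):
        P = abc_splitting_step(P, dt, sigma, rng)
    T = n_steps * dt
    return float(np.mean(np.sum((P - P0) ** 2, axis=1)) / (6.0 * T))
```

The volume preservation of the deterministic substep is what makes long-time statistics trustworthy here: a non-structure-preserving integrator can introduce spurious drift or dissipation that contaminates the mean-squared-displacement estimate.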