Search for: All records

Creators/Authors contains: "Somefun, Oluwasegun A"


  1. This paper unifies commonly used accelerated stochastic gradient methods (Polyak’s Heavy Ball, Nesterov’s Accelerated Gradient, and Adaptive Moment Estimation (Adam)) as specific cases of a general lowpass-regularized learning framework, the Automatic Stochastic Gradient Method (AutoSGM). For AutoSGM, we derive an optimal iteration-dependent learning-rate function and realize a practical approximation of it; Adam is also an approximation of this optimal approach, one that replaces the iteration-dependent learning rate with a constant. Empirical results on deep neural networks show that AutoSGM, equipped with this iteration-dependent learning-rate algorithm, learns quickly, is robust to the initial choice of the learning rate, and can tune an initial constant learning rate in applications where a good constant approximation is unknown.
    Free, publicly-accessible full text available April 14, 2025
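The core idea the abstract describes — treating momentum-style accelerated SGD as a lowpass filter applied to the stochastic gradient, with an iteration-dependent learning rate — can be sketched in a few lines. This is not the authors' AutoSGM implementation: the smoothing constant `beta`, the assumed `1/sqrt(t)` decay for the learning rate, and all names below are illustrative assumptions only.

```python
import numpy as np

def lowpass_sgd_sketch(grad_fn, w0, steps=300, lr0=0.1, beta=0.9):
    """Sketch of a lowpass-regularized gradient update.

    Illustrative only (not the paper's AutoSGM): `beta` and the
    `lr0 / sqrt(t)` schedule are assumed values for this example.
    """
    w = np.asarray(w0, dtype=float)
    m = np.zeros_like(w)  # lowpass-filter state (smoothed gradient)
    for t in range(1, steps + 1):
        g = grad_fn(w)
        # First-order lowpass filter on the raw gradient:
        # beta = 0 recovers plain SGD; beta > 0 gives a
        # momentum-like exponential moving average, as in Heavy Ball / Adam.
        m = beta * m + (1.0 - beta) * g
        # Iteration-dependent learning rate (assumed decay schedule);
        # holding this constant instead mimics Adam's constant step size.
        lr_t = lr0 / np.sqrt(t)
        w = w - lr_t * m
    return w

# Usage: minimize the quadratic f(w) = (w - 3)^2, whose gradient is 2(w - 3).
w_star = lowpass_sgd_sketch(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

With `beta = 0` and a constant `lr_t`, the loop reduces to vanilla SGD; the different classical methods correspond to different choices of the filter and step-size schedule, which is the unification the paper formalizes.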