Title: Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum
Award ID(s): 1846690, 1952339, 2208361, 2143915
PAR ID: 10412408
Author(s) / Creator(s):
Date Published:
Journal Name: Journal of Scientific Computing
Volume: 95
Issue: 1
ISSN: 0885-7474
Format(s): Medium: X
Sponsoring Org: National Science Foundation
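
This record contains only bibliographic metadata; the paper's own abstract is not included. Purely as a rough illustration of the kind of iteration the title names — gradient descent for sparse recovery accelerated with nonlinear conjugate gradient (NCG) momentum — the sketch below applies proximal-gradient (soft-thresholding) steps along a Polak-Ribière (PR+) momentum direction to an ℓ1-regularized least-squares model. The model, step rule, and parameters are all assumptions for illustration, not the algorithm analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sparse_recovery_ncg(A, b, lam=0.1, iters=300):
    """Sketch: minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal
    gradient steps taken along a Polak-Ribiere (PR+) momentum direction.
    Illustrative only; not the method from the cited paper."""
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)              # gradient of the smooth term
    d = -g
    for _ in range(iters):
        if g @ d >= 0.0:               # safeguard: restart if d is not descent
            d = -g
        Ad = A @ d
        t = -(g @ d) / (Ad @ Ad + 1e-16)        # exact line search, smooth term
        x = soft_threshold(x + t * d, t * lam)  # prox step handles the l1 term
        g_new = A.T @ (A @ x - b)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-16))  # PR+ formula
        d = -g_new + beta * d          # nonlinear CG momentum update
        g = g_new
    return x

# Usage: recover a 3-sparse vector from 80 noisy random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[[3, 50, 120]] = [1.5, -2.0, 0.8]
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = sparse_recovery_ncg(A, b, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.1))   # expect roughly {3, 50, 120}
```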
More Like this
  1. Gradient sampling (GS) methods for the minimization of objective functions that may be nonconvex and/or nonsmooth are proposed, analyzed, and tested. One of the most computationally expensive components of contemporary GS methods is the need to solve a convex quadratic subproblem in each iteration. By contrast, the methods proposed in this paper allow inexact solutions of these subproblems, which, as proved in the paper, can be incorporated without loss of theoretical convergence guarantees. Numerical experiments show that exploiting inexact subproblem solutions consistently reduces the computational effort required by a GS method. Additionally, a strategy is proposed for aggregating gradient information after a subproblem is solved (potentially inexactly), as is done in bundle methods for nonsmooth optimization. It is proved that the aggregation scheme can also be introduced without loss of theoretical convergence guarantees, and numerical experiments show that it likewise reduces the computational effort required by a GS method. (A rough sketch of a GS iteration follows below.)
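
As a rough illustration of the abstract above (not the authors' implementation), the following Python sketch performs one gradient-sampling step: it samples gradients in an eps-ball around the iterate, solves the min-norm convex quadratic subproblem *inexactly* with a few Frank-Wolfe iterations — one possible inexactness mechanism, not necessarily the one analyzed in the paper — and backtracks along the resulting direction. All names and parameter values are illustrative.

```python
import numpy as np

def min_norm_in_hull(G, qp_iters=20):
    """Approximate the minimum-norm element of conv{columns of G} via
    Frank-Wolfe on min ||G @ lam||^2 over the simplex; truncating the
    iteration yields an inexact subproblem solution."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)               # start at the barycenter
    for t in range(qp_iters):
        grad = 2.0 * G.T @ (G @ lam)        # gradient w.r.t. lam
        i = np.argmin(grad)                 # best simplex vertex e_i
        gamma = 2.0 / (t + 2.0)             # standard FW step size
        lam *= 1.0 - gamma
        lam[i] += gamma
    return G @ lam

def gs_step(f, grad_f, x, eps=1e-2, num_samples=None, rng=None):
    """One gradient-sampling iteration: sample gradients in an eps-ball
    around x, negate the (approximate) min-norm convex combination to
    get a search direction, then backtrack."""
    rng = rng or np.random.default_rng()
    n = x.size
    num_samples = num_samples or 2 * n
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(num_samples, n))
    G = np.column_stack([grad_f(x)] + [grad_f(p) for p in pts])
    g = min_norm_in_hull(G)                 # inexact QP solve
    d = -g
    t, fx = 1.0, f(x)
    while f(x + t * d) > fx - 1e-4 * t * (g @ g) and t > 1e-12:
        t *= 0.5                            # Armijo backtracking
    return x + t * d

# Tiny nonsmooth demo: f(x) = ||x||_1 + ||x - a||^2.
a = np.array([1.0, -2.0])
f = lambda x: np.abs(x).sum() + (x - a) @ (x - a)
grad_f = lambda x: np.sign(x) + 2.0 * (x - a)   # gradient a.e.
x = np.array([3.0, 3.0])
for _ in range(200):
    x = gs_step(f, grad_f, x, rng=np.random.default_rng(0))
print(x)   # close to the per-coordinate minimizer [0.5, -1.5]
```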