Title: Block-Activated Algorithms For Multicomponent Fully Nonsmooth Minimization
Under consideration are multicomponent minimization problems involving a separable nonsmooth convex function penalizing the components individually, and nonsmooth convex coupling terms penalizing linear mixtures of the components. We investigate the application of block-activated proximal algorithms for solving such problems, i.e., algorithms which, at each iteration, need to use only a block of the underlying functions, as opposed to all of them as in standard methods. For smooth coupling functions, several block-activated algorithms exist and they are well understood. By contrast, in the fully nonsmooth case, few block-activated methods are available and little effort has been devoted to assessing them. Our goal is to shed more light on the implementation, the features, and the behavior of these algorithms, compare their merits, and provide machine learning and image recovery experiments illustrating their performance.
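As a rough illustration of the block-activation idea described above (a minimal sketch, not a reproduction of the specific algorithms assessed in the paper), the following Python snippet runs a randomized block-coordinate Douglas-Rachford iteration in a product space on a small fully nonsmooth problem: each iteration evaluates the proximity operators of only a randomly drawn block of the functions. The test functions, step parameters, and uniform block-selection rule are illustrative assumptions.

```python
import numpy as np

# Toy instance of the problem class: minimize f1(x) + f2(x) + f3(x) over x,
# where every fi is convex and handled only through its proximity operator:
#   f1(x) = ||x||_1               (prox = soft thresholding)
#   f2(x) = indicator of [-1,1]^n (prox = projection onto the box)
#   f3(x) = (1/2)||x - a||^2      (prox in closed form)

def prox_l1(v, gamma):
    """Proximity operator of gamma * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)

def prox_box(v, gamma):
    """Proximity operator of the indicator of [-1, 1]^n (projection)."""
    return np.clip(v, -1.0, 1.0)

def prox_quad(v, gamma, a):
    """Proximity operator of gamma * (1/2)||. - a||^2."""
    return (v + gamma * a) / (1.0 + gamma)

def block_activated_dr(proxes, n, gamma=1.0, lam=1.0, block_size=1,
                       n_iter=5000, seed=None):
    """Randomized block-coordinate Douglas-Rachford in a product space.

    Every iteration recomputes the (cheap) average of the auxiliary variables,
    but applies the proximity operators of only a randomly selected block of
    the functions; the other auxiliary variables are left untouched.
    """
    rng = np.random.default_rng(seed)
    m = len(proxes)
    z = np.zeros((m, n))                 # one auxiliary variable per function
    for _ in range(n_iter):
        x = z.mean(axis=0)               # consensus (projection onto the diagonal)
        block = rng.choice(m, size=block_size, replace=False)
        for i in block:                  # only this block is activated
            z[i] += lam * (proxes[i](2.0 * x - z[i], gamma) - x)
    return z.mean(axis=0)

if __name__ == "__main__":
    a = np.array([2.0, -0.3, 0.7, -2.5, 0.1])
    proxes = [prox_l1, prox_box, lambda v, g: prox_quad(v, g, a)]
    x_hat = block_activated_dr(proxes, n=a.size, block_size=1, seed=0)
    print("estimate:", np.round(x_hat, 3))
```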
Award ID(s):
1715671
PAR ID:
10335315
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing
Volume:
2022
Page Range / eLocation ID:
5428 to 5432
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We investigate a modular convex Nash equilibrium problem involving nonsmooth functions acting on linear mixtures of strategies, as well as smooth coupling functions. An asynchronous block-iterative decomposition method is proposed to solve it. 
  2. The paper proposes and develops a novel inexact gradient method (IGD) for minimizing smooth functions with Lipschitzian gradients. We show that the sequence of gradients generated by IGD converges to zero. The convergence of iterates to stationary points is guaranteed under the Kurdyka-Łojasiewicz property of the objective function, with convergence rates depending on the KL exponent. The newly developed IGD is applied to designing two novel gradient-based methods of nonsmooth convex optimization: the inexact proximal point method (GIPPM) and the inexact augmented Lagrangian method (GIALM) for convex programs with linear equality constraints. These two methods inherit global convergence properties from IGD and are confirmed by numerical experiments to have practical advantages over some well-known algorithms of nonsmooth convex optimization.
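For context, here is a minimal Python sketch of gradient descent driven by inexact gradients. The relative-error noise model (error bounded by a fixed fraction of the true gradient norm), the test function, and the step size are assumptions for illustration and do not reproduce the IGD rule analyzed in the paper.

```python
import numpy as np

def inexact_gradient_descent(grad, x0, step=0.01, rel_err=0.2,
                             n_iter=1000, seed=None):
    """Gradient descent driven by perturbed (inexact) gradients.

    The method only sees grad(x) plus a noise vector whose norm is at most
    rel_err * ||grad(x)||, an assumed relative-error model that need not
    coincide with the inexactness criterion used in the paper.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(x)
        noise = rng.standard_normal(x.shape)
        scale = rel_err * np.linalg.norm(g) / max(np.linalg.norm(noise), 1e-12)
        x = x - step * (g + scale * noise)   # step along the inexact gradient
    return x

if __name__ == "__main__":
    # Smooth test function: f(x) = (1/2)||A x - b||^2, grad f(x) = A^T (A x - b).
    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    grad = lambda x: A.T @ (A @ x - b)
    x_hat = inexact_gradient_descent(grad, np.zeros(5), seed=2)
    print("gradient norm at the output:", np.linalg.norm(grad(x_hat)))
```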
  3. Subgradient methods comprise a fundamental class of nonsmooth optimization algorithms. Classical results show that certain subgradient methods converge sublinearly for general Lipschitz convex functions and converge linearly for convex functions that grow sharply away from solutions. Recent work has moreover extended these results to certain nonconvex problems. In this work, we seek to improve the complexity of these algorithms by asking the following question. Is it possible to design a superlinearly convergent subgradient method? We provide a positive answer to this question for a broad class of sharp semismooth functions. Funding: The research of D. Davis was supported by the Division of Mathematical Sciences [Grant 2047637] and the Alfred P. Sloan Foundation.
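The superlinear scheme of the paper is not reproduced here. As a baseline illustrating the classical behavior mentioned in the abstract, the sketch below runs the Polyak subgradient step (which assumes the optimal value is known) on a sharp convex function, the setting in which linear convergence is the textbook result.

```python
import numpy as np

def polyak_subgradient(f, subgrad, x0, f_star, n_iter=200):
    """Subgradient method with the Polyak step size (f(x) - f_star) / ||g||^2.

    For sharp convex functions (growth at least linear away from the solution
    set), this classical step choice converges linearly; it is shown here only
    as a baseline, not as the superlinear method of the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = subgrad(x)
        gap = f(x) - f_star
        if gap <= 0.0 or not np.any(g):
            break                          # already optimal
        x = x - (gap / np.dot(g, g)) * g   # Polyak step
    return x

if __name__ == "__main__":
    # Sharp test function: f(x) = ||x||_1, minimized at 0 with f_star = 0.
    f = lambda x: np.sum(np.abs(x))
    subgrad = lambda x: np.sign(x)         # a subgradient of the l1 norm
    x_hat = polyak_subgradient(f, subgrad, np.array([3.0, -1.5, 0.25]), 0.0)
    print("final value:", f(x_hat))
```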
  4. Structured convex optimization problems in image recovery typically involve a mix of smooth and nonsmooth functions. The common practice is to activate the smooth functions via their gradient and the nonsmooth ones via their proximity operator. We show that, although intuitively natural, this approach is not necessarily the most efficient numerically and that, in particular, activating all the functions proximally may be advantageous. To make this viewpoint viable computationally, we derive a number of new examples of proximity operators of smooth convex functions arising in applications.
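To make the idea of activating smooth functions proximally concrete, here is one standard example of a smooth convex function whose proximity operator is available in closed form; it is an illustrative instance, not one of the new operators derived in the paper.

```python
import numpy as np

def prox_least_squares(v, gamma, A, b):
    """Proximity operator of gamma * (1/2)||A x - b||^2 at the point v.

    The minimizer of (1/2)||A x - b||^2 + (1/(2*gamma))||x - v||^2 solves the
    linear system (I + gamma * A^T A) x = v + gamma * A^T b, so this smooth
    function can be activated proximally rather than through its gradient.
    """
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * A.T @ A, v + gamma * A.T @ b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((8, 3)), rng.standard_normal(8)
    v = rng.standard_normal(3)
    p = prox_least_squares(v, gamma=0.5, A=A, b=b)
    # Optimality check: the gradient of the prox objective should vanish at p.
    residual = A.T @ (A @ p - b) + (p - v) / 0.5
    print("optimality residual norm:", np.linalg.norm(residual))
```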