Title: The Douglas-Rachford algorithm converges only weakly
We show that the weak convergence of the Douglas-Rachford algorithm for finding a zero of the sum of two maximally monotone operators cannot be improved to strong convergence. Likewise, we show that strong convergence can fail for the method of partial inverses.
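For context, a minimal sketch of the standard Douglas-Rachford recursion is given below, applied to a toy convex-feasibility problem in which the two maximally monotone operators are normal cones and their resolvents reduce to metric projections; the sets, starting point, and function names are illustrative and not taken from the paper. Note that weak and strong convergence coincide in finite dimensions, so the distinction the paper addresses is an infinite-dimensional phenomenon and is not visible in a finite-dimensional toy like this one.

    import numpy as np

    # Standard Douglas-Rachford recursion for finding a zero of A + B:
    #   y_{n+1} = y_n + J_B(2 J_A(y_n) - y_n) - J_A(y_n),   x_n = J_A(y_n),
    # where J_A, J_B are the resolvents of the two maximally monotone operators.
    # Toy instance: A and B are normal cones of two closed convex sets, so the
    # resolvents are just metric projections (all names here are illustrative).

    def proj_ball(x, radius=1.0):
        # projection onto the Euclidean ball = resolvent of its normal cone
        nrm = np.linalg.norm(x)
        return x if nrm <= radius else x * (radius / nrm)

    def proj_halfspace(x, a, b):
        # projection onto the half-space {z : <a, z> <= b}
        viol = np.dot(a, x) - b
        return x if viol <= 0 else x - viol * a / np.dot(a, a)

    def douglas_rachford(y, steps=200):
        a, b = np.array([1.0, 1.0]), 1.0
        for _ in range(steps):
            x = proj_ball(y)                                  # x_n = J_A(y_n)
            y = y + proj_halfspace(2 * x - y, a, b) - x       # governing update
        return proj_ball(y)                                   # point in the intersection

    print(douglas_rachford(np.array([3.0, -2.0])))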
Award ID(s): 1715671
PAR ID: 10162528
Author(s) / Creator(s):
Date Published:
Journal Name: SIAM Journal on Control and Optimization
Volume: 58
ISSN: 0363-0129
Page Range / eLocation ID: 1118-1120
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. In previous work of the authors, we investigated the Born and inverse Born series for a scalar wave equation with linear and nonlinear terms, the nonlinearity being cubic of Kerr type (DeFilippis et al 2023 Inverse Problems 39 125015). We reported conditions which guarantee convergence of the inverse Born series, enabling recovery of the coefficients of the linear and nonlinear terms. In this work, we show that if the coefficient of the linear term is known, an arbitrarily strong Kerr nonlinearity can be reconstructed, for sufficiently small data. Additionally, we show that similar convergence results hold for general polynomial nonlinearities. Our results are illustrated with numerical examples.
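The entry above inverts a forward (Born) series to recover linear and nonlinear coefficients. A minimal sketch of the underlying series-inversion idea, reduced to a single scalar unknown, follows; the coefficients K1, K2, K3 and the contrast value are made up for illustration, and the code is not the paper's reconstruction method.

    import numpy as np

    # Toy "inverse Born series": the forward map sends the unknown coefficient eta
    # to data d = K1*eta + K2*eta**2 + K3*eta**3, a scalar stand-in for the
    # multilinear Born operators. The inverse series is the reverted power series
    # and, as in the paper's setting, is only expected to converge for small data.

    def invert_series(K, order):
        """Coefficients b with eta = sum_m b[m] * d**m, given d = sum_k K[k-1] * eta**k."""
        fwd = np.zeros(order + 1)
        fwd[1:len(K) + 1] = K                         # forward series in powers of eta
        powers = [np.zeros(order + 1)]                # fwd**0, fwd**1, ..., fwd**order
        powers[0][0] = 1.0
        for _ in range(order):
            powers.append(np.convolve(powers[-1], fwd)[:order + 1])
        b = np.zeros(order + 1)
        b[1] = 1.0 / fwd[1]
        for n in range(2, order + 1):                 # match coefficients of eta**n
            b[n] = -sum(b[m] * powers[m][n] for m in range(1, n)) / fwd[1] ** n
        return b

    K = [1.0, 0.5, 2.0]                # hypothetical K1, K2, K3 (K3 plays the Kerr role)
    b = invert_series(K, order=8)
    eta_true = 0.05                    # a "sufficiently small" contrast
    d = sum(k * eta_true ** (i + 1) for i, k in enumerate(K))
    eta_rec = sum(b[m] * d ** m for m in range(1, len(b)))
    print(eta_true, eta_rec)           # the truncated inverse series recovers eta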
  2. Banach's fixed point theorem for contraction maps has been widely used to analyze the convergence of iterative methods in non-convex problems. It is a common experience, however, that iterative maps fail to be globally contracting under the natural metric in their domain, making the applicability of Banach's theorem limited. We explore how generally we can apply Banach's fixed point theorem to establish the convergence of iterative methods when pairing it with carefully designed metrics. Our first result is a strong converse of Banach's theorem, showing that it is a universal analysis tool for establishing global convergence of iterative methods to unique fixed points, and for bounding their convergence rate. In other words, we show that, whenever an iterative map globally converges to a unique fixed point, there exists a metric under which the iterative map is contracting and which can be used to bound the number of iterations until convergence. We illustrate our approach in the widely used power method, providing a new way of bounding its convergence rate through contraction arguments. We next consider the computational complexity of Banach's fixed point theorem. Making the proof of our converse theorem constructive, we show that computing a fixed point whose existence is guaranteed by Banach's fixed point theorem is CLS-complete. We thus provide the first natural complete problem for the class CLS, which was defined in [Daskalakis, Papadimitriou 2011] to capture the complexity of problems such as P-matrix LCP, computing KKT-points, and finding mixed Nash equilibria in congestion and network coordination games. 
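As a companion to the entry above, the sketch below shows the textbook quantitative content of Banach's theorem that the abstract relies on: once a contraction modulus q < 1 is available in some metric, it yields both convergence to the unique fixed point and an a priori bound on the number of iterations needed for a given accuracy. The map (cosine on [cos 1, 1]) and the tolerance are illustrative choices, not anything from the paper.

    import math

    # Banach's fixed point theorem: if |F(x) - F(y)| <= q * |x - y| with q < 1 on a
    # complete set that F maps into itself, then x_{k+1} = F(x_k) converges to the
    # unique fixed point x*, and the a priori estimate
    #   |x_k - x*| <= q**k / (1 - q) * |x_1 - x_0|
    # turns the contraction modulus q into an explicit iteration count.

    def banach_iterate(F, x0, q, tol=1e-10):
        x1 = F(x0)
        d0 = abs(x1 - x0)
        k = 0 if d0 == 0 else math.ceil(math.log(tol * (1 - q) / d0) / math.log(q))
        x = x0
        for _ in range(max(k, 0)):
            x = F(x)
        return x, k

    # cos maps [cos 1, 1] into itself and |cos'| <= sin(1) < 1 there.
    x_star, k = banach_iterate(math.cos, x0=1.0, q=math.sin(1.0))
    print(x_star, k)   # fixed point of cos (about 0.739) and the guaranteed iteration count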
  3. Berry, Jonathan; Shmoys, David; Cowen, Lenore; Naumann, Uwe (Eds.)
    Continuous DR-submodular functions are a class of functions that satisfy the Diminishing Returns (DR) property, which implies that they are concave along non-negative directions. Existing works have studied monotone continuous DR-submodular maximization subject to a convex constraint and have proposed efficient algorithms with approximation guarantees. However, in many applications, e.g., computing the stability number of a graph and mean-field inference for probabilistic log-submodular models, the DR-submodular function has the additional property of being strongly concave along non-negative directions, a property that can be utilized to obtain faster convergence rates. In this paper, we first introduce and characterize the class of strongly DR-submodular functions and show how such a property implies strong concavity along non-negative directions. Then, we study L-smooth monotone strongly DR-submodular functions that have bounded curvature, and we show how to exploit such additional structure to obtain algorithms with improved approximation guarantees and faster convergence rates for the maximization problem. In particular, we propose the SDRFW algorithm, which matches the provably optimal, curvature-dependent approximation ratio after a number of iterations governed by L and μ, where c ∈ [0,1] and μ ≥ 0 denote the curvature and the strong DR-submodularity parameter, respectively. Furthermore, we study the Projected Gradient Ascent (PGA) method for this problem and provide a refined analysis of the algorithm with an improved approximation ratio (compared to ½ in prior works) and a linear convergence rate. Given that both algorithms require knowledge of the smoothness parameter L, we provide a novel characterization of L for DR-submodular functions showing that in many cases, computing L can be formulated as a convex optimization problem, i.e., a geometric program, that can be solved efficiently. Experimental results illustrate and validate the efficiency and effectiveness of our algorithms.
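To make the setting of the entry above concrete, here is a minimal sketch of projected gradient ascent (the PGA variant discussed there) on a toy monotone DR-submodular objective f(x) = Σ_i w_i log(1 + x_i), which is concave and hence DR-submodular, over a box with a budget constraint. The objective, constraint, step size 1/L, and iteration count are illustrative; this is not the paper's SDRFW algorithm or its refined PGA analysis.

    import numpy as np

    # Projected Gradient Ascent (PGA) for monotone continuous DR-submodular
    # maximization, sketched on f(x) = sum_i w_i * log(1 + x_i) over the set
    # {0 <= x <= 1, sum(x) <= budget}. The step size 1/L uses the smoothness
    # constant of f; all numbers below are illustrative.

    rng = np.random.default_rng(0)
    n, budget = 5, 2.0
    w = rng.uniform(1.0, 2.0, size=n)

    f = lambda x: np.sum(w * np.log1p(x))
    grad = lambda x: w / (1.0 + x)
    L = np.max(w)                                 # |d^2 f / dx_i^2| <= w_i for x >= 0

    def project(x):
        """Euclidean projection onto {0 <= x <= 1, sum(x) <= budget} by bisection."""
        y = np.clip(x, 0.0, 1.0)
        if np.sum(y) <= budget:
            return y
        lo, hi = 0.0, np.max(x)                   # bisect on the dual variable
        for _ in range(60):
            lam = 0.5 * (lo + hi)
            if np.sum(np.clip(x - lam, 0.0, 1.0)) > budget:
                lo = lam
            else:
                hi = lam
        return np.clip(x - hi, 0.0, 1.0)

    x = np.zeros(n)
    for _ in range(500):                          # PGA step: x <- Proj(x + (1/L) grad f(x))
        x = project(x + grad(x) / L)
    print(x, f(x))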
  4. Gradient‐type iterative methods for solving Hermitian eigenvalue problems can be accelerated by using preconditioning and deflation techniques. A preconditioned steepest descent iteration with implicit deflation (PSD‐id) is one such method. The convergence behavior of the PSD‐id has recently been investigated based on the pioneering work of Samokish on the preconditioned steepest descent method (PSD). The resulting non‐asymptotic estimates indicate a superlinear convergence of the PSD‐id under strong assumptions on the initial guess. The present paper utilizes an alternative convergence analysis of the PSD by Neymeyr under much weaker assumptions. We embed Neymeyr's approach into the analysis of the PSD‐id using a restricted formulation of the PSD‐id. More importantly, we extend the new convergence analysis of the PSD‐id to a practically preferred block version of the PSD‐id, or BPSD‐id, and show the cluster robustness of the BPSD‐id. Numerical examples are provided to validate the theoretical estimates.
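The sketch below illustrates the single-vector preconditioned steepest descent (PSD) step that the analysis in the entry above builds on: precondition the eigenresidual and take the Rayleigh-Ritz minimizer over the two-dimensional space span{x, T r}. The test matrix, Jacobi preconditioner, and stopping tolerance are illustrative assumptions, and the deflated and block variants (PSD‐id, BPSD‐id) analyzed in the paper are not implemented here.

    import numpy as np
    from scipy.linalg import eigh

    # Preconditioned steepest descent (PSD) for the smallest eigenvalue of a
    # Hermitian matrix A: precondition the residual r = A x - rho(x) x with an
    # approximate inverse T, then take the Rayleigh-Ritz minimizer over
    # span{x, T r}. This is only the basic step the paper's analysis starts from.

    rng = np.random.default_rng(1)
    n = 200
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2 + n * np.eye(n)            # Hermitian (real symmetric) test matrix
    T = np.diag(1.0 / np.diag(A))                # Jacobi (diagonal) preconditioner

    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(100):
        rho = x @ A @ x                          # Rayleigh quotient (||x|| = 1)
        r = A @ x - rho * x                      # eigenresidual
        if np.linalg.norm(r) < 1e-8:
            break
        S = np.column_stack([x, T @ r])          # search space span{x, T r}
        Ah, Mh = S.T @ A @ S, S.T @ S            # 2x2 Rayleigh-Ritz problem
        _, vecs = eigh(Ah, Mh)
        x = S @ vecs[:, 0]                       # Ritz vector for the smallest Ritz value
        x /= np.linalg.norm(x)

    print(rho, np.min(np.linalg.eigvalsh(A)))    # PSD estimate vs. exact smallest eigenvalue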