-
We analyze and test a simple-to-implement two-step iteration for the incompressible Navier-Stokes equations that consists of first applying the Picard iteration and then applying the Newton iteration to the Picard output. We prove that this composition of Picard and Newton converges quadratically, and our analysis (which covers both the unique-solution and non-unique-solution cases) also suggests that this solver has a larger convergence basin than the usual Newton iteration because of the improved stability properties of Picard-Newton over Newton. Numerical tests show that Picard-Newton converges more reliably than the Picard and Newton iterations for higher Reynolds numbers and worse initial conditions. We also consider enhancing the Picard step with Anderson acceleration (AA), and find that the resulting AAPicard-Newton iteration has even better convergence properties on several benchmark test problems.
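The two-step structure described above — one Picard step followed by one Newton step on the Picard output — can be illustrated on a scalar toy fixed-point problem rather than the paper's Navier-Stokes setting. The problem u = cos(u) and all function names below are illustrative, not taken from the paper; this is a minimal sketch of the composition, not the authors' solver.

```python
import math

def f(u):
    # Residual of the toy fixed-point problem u = cos(u)
    return u - math.cos(u)

def fprime(u):
    return 1 + math.sin(u)

def picard_newton(u, tol=1e-12, max_it=50):
    """One outer iteration = a Picard step, then a Newton step on its output."""
    for k in range(max_it):
        v = math.cos(u)               # Picard step: u <- g(u)
        u = v - f(v) / fprime(v)      # Newton step applied to the Picard output
        if abs(f(u)) < tol:
            return u, k + 1
    return u, max_it

root, its = picard_newton(1.0)
```

Even on this scalar problem the composition reaches tight tolerance in a handful of outer iterations, consistent with the quadratic rate claimed for the full method.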
-
The purpose of this paper is to develop a practical strategy to accelerate Newton’s method in the vicinity of singular points. We present an adaptive safeguarding scheme with a tunable parameter, which we call adaptive γ-safeguarding, that one can use in tandem with Anderson acceleration to improve the performance of Newton’s method when solving problems at or near singular points. The key features of adaptive γ-safeguarding are that it converges locally for singular problems, and it can detect nonsingular problems automatically, in which case the Newton-Anderson iterates are scaled towards a standard Newton step. The result is a flexible algorithm that performs well for singular and nonsingular problems, and can recover convergence from both standard Newton and Newton-Anderson with the right parameter choice. This leads to faster local convergence compared to both Newton’s method and Newton-Anderson without safeguarding, with effectively no additional computational cost. We demonstrate three strategies one can use when implementing Newton-Anderson and γ-safeguarded Newton-Anderson to solve parameter-dependent problems near singular points. For our benchmark problems, we take two parameter-dependent incompressible flow systems: flow in a channel and Rayleigh–Bénard convection.
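The Newton-Anderson iteration underlying this paper can be sketched in its simplest (depth-1) form on a scalar singular problem. Note this sketch deliberately omits the paper's adaptive γ-safeguarding, which would additionally cap the mixing coefficient gamma; the test problem f(x) = x², where the Jacobian vanishes at the root and plain Newton converges only linearly, is an illustrative choice, not one of the paper's benchmarks.

```python
def newton_anderson(f, fprime, x, tol=1e-10, max_it=100):
    """Depth-1 Newton-Anderson: combine the last two Newton updates.

    Plain version without the adaptive gamma-safeguarding of the paper,
    which would additionally limit gamma near nonsingular roots.
    """
    x_prev, w_prev = None, None
    for _ in range(max_it):
        if abs(f(x)) < tol:
            return x
        w = -f(x) / fprime(x)                 # Newton update direction
        if w_prev is None:
            x_prev, w_prev, x = x, w, x + w   # first step: plain Newton
            continue
        dw = w - w_prev
        # Least-squares mixing coefficient (scalar case)
        gamma = w * dw / (dw * dw) if dw != 0 else 0.0
        x_new = x + w - gamma * ((x - x_prev) + dw)
        x_prev, w_prev, x = x, w, x_new
    return x

# Singular problem f(x) = x**2: Newton alone halves the error each step,
# while the Anderson-combined step converges much faster.
root = newton_anderson(lambda x: x * x, lambda x: 2 * x, 1.0)
```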
-
The incremental Picard Yosida (IPY) method has recently been developed as an iteration for nonlinear saddle point problems that is as effective as Picard but more efficient. By combining ideas from algebraic splitting of linear saddle point solvers with incremental Picard-type iterations and grad-div stabilization, IPY improves on the standard Picard method by allowing for easier linear solves at each iteration—but without creating more total nonlinear iterations compared to Picard. This paper extends the IPY methodology by studying it together with Anderson acceleration (AA). We prove that IPY for Navier–Stokes and regularized Bingham fits the recently developed analysis framework for AA, which implies that AA improves the linear convergence rate of IPY by scaling the rate with the gain of the AA optimization problem. Numerical tests illustrate a significant improvement in convergence behavior of IPY methods from AA, for both Navier–Stokes and regularized Bingham.
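The mechanism by which AA improves a linearly convergent Picard-type iteration can be seen on a scalar fixed-point problem. The sketch below is depth-1 Anderson acceleration of a generic iteration x ← g(x); the contraction g(x) = exp(−x) is an illustrative stand-in, not the IPY operator, and the variable names are assumptions for this example only.

```python
import math

def anderson1(g, x, tol=1e-12, max_it=100):
    """Depth-1 Anderson acceleration of the fixed-point iteration x <- g(x)."""
    gx = g(x)
    f = gx - x                       # fixed-point residual
    x_prev, f_prev, gx_prev = x, f, gx
    x = gx                           # first step: plain fixed-point update
    for k in range(max_it):
        gx = g(x)
        f = gx - x
        if abs(f) < tol:
            return x, k + 1
        df = f - f_prev
        # Solves the one-dimensional AA least-squares problem
        gamma = f * df / (df * df) if df != 0 else 0.0
        x_new = gx - gamma * (gx - gx_prev)
        x_prev, f_prev, gx_prev, x = x, f, gx, x_new
    return x, max_it

# Plain iteration of g(x) = exp(-x) contracts with rate ~0.57 and needs
# roughly 50 steps for 1e-12 accuracy; AA(1) needs an order of magnitude fewer.
x, its = anderson1(lambda t: math.exp(-t), 1.0)
```

In the scalar case depth-1 AA reduces to the secant method on the residual, which makes the rate improvement (the "gain" of the AA optimization problem) easy to see.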