On the Asymptotic Linear Convergence Speed of Anderson Acceleration, Nesterov Acceleration, and Nonlinear GMRES

07/04/2020
by Hans De Sterck, et al.

We consider nonlinear convergence acceleration methods for the fixed-point iteration x_{k+1} = q(x_k), including Anderson acceleration (AA), nonlinear GMRES (NGMRES), and Nesterov-type acceleration (corresponding to AA with window size one). We focus on fixed-point methods that converge asymptotically linearly with convergence factor ρ < 1 and that solve an underlying fully smooth and non-convex optimization problem. It is often observed that AA and NGMRES substantially improve the asymptotic convergence behavior of the fixed-point iteration, but this improvement has not been quantified theoretically. We investigate this problem under simplified conditions. First, we consider stationary versions of AA and NGMRES, and determine coefficients that result in optimal asymptotic convergence factors, given knowledge of the spectrum of q'(x) at the fixed point x^*. This allows us to understand and quantify the asymptotic convergence improvement that can be provided by nonlinear convergence acceleration, viewing x_{k+1} = q(x_k) as a nonlinear preconditioner for AA and NGMRES. Second, for the case of infinite window size, we consider linear asymptotic convergence bounds for GMRES applied to the fixed-point iteration linearized about x^*. Since AA and NGMRES are equivalent to GMRES in the linear case, one may expect the GMRES convergence factors to be relevant for AA and NGMRES as x_k → x^*. Our results are illustrated numerically for a class of test problems from canonical tensor decomposition, comparing steepest descent and alternating least squares (ALS) as the fixed-point iterations that are accelerated by AA and NGMRES. Our numerical tests show that both approaches allow us to estimate the asymptotic convergence speed for nonstationary AA and NGMRES with finite window size.
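As a concrete reference point for the methods discussed above, the following Python sketch implements Anderson acceleration AA(m) in its standard least-squares (difference) formulation applied to a generic fixed-point map q. It is a minimal illustrative sketch, not code from the paper; the function name and the toy linear contraction at the end are assumptions for demonstration (the paper's test problems come from canonical tensor decomposition). Window size m = 1 corresponds to the Nesterov-type variant, and a large window relates AA to GMRES in the linear case.

```python
import numpy as np

def anderson_acceleration(q, x0, m=2, max_iter=200, tol=1e-12):
    """AA(m) for the fixed-point iteration x_{k+1} = q(x_k) (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    f = q(x) - x                     # fixed-point residual f_k = q(x_k) - x_k
    dX, dF = [], []                  # histories of successive differences of x and f
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        if dX:
            X = np.column_stack(dX[-m:])
            F = np.column_stack(dF[-m:])
            # least-squares coefficients gamma minimizing ||f - F gamma||_2
            gamma, *_ = np.linalg.lstsq(F, f, rcond=None)
            x_new = x + f - (X + F) @ gamma    # accelerated update
        else:
            x_new = x + f                      # plain fixed-point step q(x)
        f_new = q(x_new) - x_new
        dX.append(x_new - x)
        dF.append(f_new - f)
        x, f = x_new, f_new
    return x

# Toy usage (assumed example): a linear contraction x -> A x + b, so q'(x^*) = A
# and the plain iteration converges linearly with factor rho = 0.9.
A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
b = np.array([1.0, 2.0])
x_fix = anderson_acceleration(lambda x: A @ x + b, np.zeros(2), m=2)
```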


