Convergence of Constrained Anderson Acceleration

10/29/2020
by Mathieu Barré et al.

We prove non-asymptotic linear convergence rates for the constrained Anderson acceleration extrapolation scheme. These guarantees follow from new upper bounds on the constrained Chebyshev problem, which consists in minimizing the maximum absolute value of a polynomial over a bounded real interval, subject to an ℓ_1 constraint on its coefficient vector. Constrained Anderson acceleration has a numerical cost comparable to that of the original scheme.
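To make the scheme concrete, here is a minimal sketch of Anderson acceleration with an ℓ_1 bound on the mixing weights, in the spirit of the constrained scheme described above. This is an illustrative implementation under assumptions, not the authors' exact method: the function name `constrained_anderson` and parameters `m` (memory), `c` (ℓ_1 bound), and the use of SciPy's SLSQP solver for the small constrained least-squares subproblem are all choices made here for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def constrained_anderson(g, x0, m=5, c=2.0, iters=100, tol=1e-10):
    """Sketch of constrained Anderson acceleration for a fixed-point map g.

    At each step, mixing weights alpha solve (approximately)
        min ||R @ alpha||_2   s.t.   sum(alpha) = 1,  ||alpha||_1 <= c,
    where the rows of R are the residuals r_i = g(x_i) - x_i of the
    last m iterates. The l1 bound on alpha is the "constrained" part.
    """
    xs = [np.asarray(x0, dtype=float)]
    gs = []
    for _ in range(iters):
        gs.append(g(xs[-1]))
        X = np.array(xs[-m:])          # last (up to) m iterates, shape (k, n)
        G = np.array(gs[-m:])          # their images under g
        R = G - X                      # residuals r_i = g(x_i) - x_i
        k = R.shape[0]
        cons = [{"type": "eq",   "fun": lambda a: np.sum(a) - 1.0},
                {"type": "ineq", "fun": lambda a: c - np.sum(np.abs(a))}]
        obj = lambda a: np.linalg.norm(R.T @ a)
        res = minimize(obj, np.ones(k) / k, constraints=cons, method="SLSQP")
        x_new = G.T @ res.x            # extrapolated iterate
        if np.linalg.norm(g(x_new) - x_new) < tol:
            return x_new
        xs.append(x_new)
    return xs[-1]
```

For example, applying this to the contractive affine map `g(x) = M @ x + b` with spectral radius of `M` below one recovers the fixed point `(I - M)^{-1} b`. The constrained subproblem here is solved with a general-purpose solver for clarity; the point of the abstract above is that enforcing the ℓ_1 bound keeps the per-iteration cost comparable to unconstrained Anderson acceleration.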



Related research

- Anderson Acceleration of Proximal Gradient Methods (10/18/2019)
- Minimax Rates of Estimation for Sparse PCA in High Dimensions (02/03/2012)
- A Continuized View on Nesterov Acceleration (02/11/2021)
- Absolute root separation (07/02/2019)
- Stochastic L-BFGS: Improved Convergence Rates and Practical Acceleration Strategies (04/01/2017)
- Convergence acceleration of alternating series (02/23/2017)
- Anderson acceleration for contractive and noncontractive operators (09/10/2019)
