Alternating cyclic extrapolation methods for optimization algorithms

04/11/2021 · by Nicolas Lepage-Saucier, et al.

This article introduces new acceleration methods for fixed-point iterations. Speed and stability are achieved by alternating the number of mappings used to compute step lengths and by reusing those step lengths over several iterations through cycling. A new type of step length is also proposed, with good properties for nonlinear mappings. The methods require no problem-specific adaptation and are especially efficient for high-dimensional problems. Computation requires few objective-function evaluations, no matrix inversion, and little extra memory. A convergence analysis is followed by seven applications, including gradient-descent acceleration for unconstrained optimization. Performance is on par with, or better than, alternative methods. The algorithm is available as a stand-alone Julia package and may be downloaded at https://github.com/NicolasL-S/ACX.
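The abstract's core object, accelerating a fixed-point iteration x ← g(x) by extrapolating along a computed step length, can be illustrated with a classical squared-extrapolation step in the SQUAREM family. This is a generic sketch for intuition only, not the paper's alternating cyclic (ACX) scheme; the function names and tolerances are illustrative assumptions.

```python
import numpy as np

def accelerated_fixed_point(g, x0, tol=1e-10, max_iter=1000):
    """Accelerate the fixed-point iteration x <- g(x) using a
    SQUAREM-style squared-extrapolation step length.
    Illustrative only; this is NOT the ACX method from the paper."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x1 = g(x)                 # first mapping
        x2 = g(x1)                # second mapping
        r = x1 - x                # first residual
        v = (x2 - x1) - r         # second-order difference (curvature term)
        if np.linalg.norm(v) < 1e-15:
            x = x2                # mapping is (locally) affine: accept x2
        else:
            # Steffensen-type step length from the two residuals
            alpha = -np.linalg.norm(r) / np.linalg.norm(v)
            # Extrapolated iterate: x - 2*alpha*r + alpha^2 * v
            x = x - 2.0 * alpha * r + alpha**2 * v
        if np.linalg.norm(g(x) - x) < tol:
            return x
    return x

# Example: the contraction g(x) = cos(x) has fixed point ~0.7390851
root = accelerated_fixed_point(np.cos, np.array([1.0]))
```

Each extrapolated step costs two evaluations of the mapping and only vector operations, with no matrix inversion and O(n) extra memory, which mirrors the cost profile the abstract claims for the proposed methods.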


