Acceleration Methods

01/23/2021
by Alexandre d'Aspremont, et al.

This monograph covers recent advances in a range of acceleration techniques frequently used in convex optimization. We first use quadratic optimization problems to introduce two key families of methods, momentum and nested optimization schemes, which coincide in the quadratic case to form the Chebyshev method, whose complexity is analyzed using Chebyshev polynomials. We then discuss momentum methods in detail, starting with the seminal work of Nesterov (1983), and structure their convergence proofs around a few master templates, such as that of optimized gradient methods, which has the key benefit of showing how momentum methods maximize convergence rates. We further cover proximal acceleration techniques, at the heart of the Catalyst and Accelerated Hybrid Proximal Extragradient frameworks, which follow similar algorithmic patterns. Common acceleration techniques rely directly on knowledge of some regularity parameters of the problem at hand; we conclude by discussing restart schemes, a set of simple techniques for reaching nearly optimal convergence rates while adapting to unobserved regularity parameters.
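As a minimal sketch of the momentum pattern the abstract describes, the following implements Nesterov-style accelerated gradient descent and applies it to a small strongly convex quadratic. The specific quadratic, constants, and iteration counts are illustrative assumptions, not taken from the monograph:

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu=0.0, iters=100):
    """Nesterov-style accelerated gradient descent (a momentum method).

    grad -- callable returning the gradient at a point
    L    -- Lipschitz constant of the gradient (assumed known)
    mu   -- strong-convexity parameter (0 recovers the 1983-style scheme)
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L              # gradient step at the extrapolated point
        if mu > 0:                            # constant momentum for strongly convex f
            beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
        else:                                 # Nesterov's t-sequence for plain convexity
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
            beta, t = (t - 1) / t_next, t_next
        y = x_next + beta * (x_next - x)      # momentum (extrapolation) step
        x = x_next
    return x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x, minimized at A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
lo, hi = np.linalg.eigvalsh(A)                # mu = smallest, L = largest eigenvalue
x_star = nesterov_agd(lambda v: A @ v - b, np.zeros(2), L=hi, mu=lo, iters=200)
```

Note that this sketch assumes the regularity parameters `L` and `mu` are known in advance; the restart schemes discussed in the monograph's final part address precisely the setting where they are not.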


