Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice

12/15/2017
by Hongzhou Lin, et al.

We introduce a generic scheme for accelerating gradient-based optimization methods in the sense of Nesterov. The approach, called Catalyst, builds upon the inexact accelerated proximal point algorithm for minimizing a convex objective function, and consists of approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. One of the keys to achieving acceleration in theory and in practice is to solve these sub-problems with appropriate accuracy, by using the right stopping criterion and the right warm-start strategy. In this paper, we give practical guidelines for using Catalyst and present a comprehensive theoretical analysis of its global complexity. We show that Catalyst applies to a large class of algorithms, including gradient descent, block coordinate descent, incremental algorithms such as SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these methods, we provide acceleration and explicit support for non-strongly convex objectives. We conclude with extensive experiments showing that acceleration is useful in practice, especially for ill-conditioned problems.
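The outer loop described in the abstract is compact enough to sketch directly. Below is a minimal Python/NumPy illustration of the inexact accelerated proximal point scheme that Catalyst builds on, not the paper's full algorithm: the names (catalyst, grad_f), the fixed inner iteration budget n_inner, and the step size lr are illustrative assumptions standing in for the paper's accuracy-based stopping criteria and the inner method's own tuning, with plain gradient descent playing the role of the wrapped method.

```python
import numpy as np

def catalyst(grad_f, x0, kappa, mu, n_outer=50, n_inner=100, lr=0.01):
    """Sketch of the Catalyst outer loop for a mu-strongly convex f.

    grad_f : gradient oracle of the objective f (assumed interface)
    kappa  : weight of the proximal regularization added to each sub-problem
    lr     : inner step size; must suit the smoothness of f plus kappa
    """
    q = mu / (mu + kappa)      # inverse condition number of the sub-problems
    alpha = np.sqrt(q)         # Nesterov's extrapolation sequence
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(n_outer):
        # Sub-problem: h(x) = f(x) + (kappa / 2) * ||x - y||^2, solved
        # inexactly here with a fixed gradient-descent budget; the paper
        # instead stops once a chosen accuracy criterion is met.
        x = x_prev.copy()      # warm start at the previous outer iterate
        for _ in range(n_inner):
            x = x - lr * (grad_f(x) + kappa * (x - y))
        # Nesterov's recursion: alpha_next^2 = (1 - alpha_next) * alpha^2
        #                                      + q * alpha_next
        alpha_next = 0.5 * (q - alpha**2
                            + np.sqrt((q - alpha**2) ** 2 + 4 * alpha**2))
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_next)
        y = x + beta * (x - x_prev)    # extrapolation step
        x_prev, alpha = x, alpha_next
    return x_prev
```

With alpha initialized at sqrt(q), the recursion keeps beta constant at (1 - sqrt(q)) / (1 + sqrt(q)), the familiar momentum coefficient for strongly convex problems; for non-strongly convex objectives one takes, roughly, mu = 0 in the recursion above. The warm start shown (the previous outer iterate) is only one of the strategies the paper compares.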


Related research

11/19/2020 — Anderson acceleration of coordinate descent
Acceleration of first order methods is mainly obtained via inertial tech...

07/26/2019 — Incremental Methods for Weakly Convex Optimization
We consider incremental algorithms for solving weakly convex optimizatio...

05/14/2013 — Optimization with First-Order Surrogate Functions
In this paper, we study optimization methods consisting of iteratively m...

01/04/2021 — First-Order Methods for Convex Optimization
First-order methods for solving convex optimization problems have been a...

06/01/2023 — Improving Energy Conserving Descent for Machine Learning: Theory and Practice
We develop the theory of Energy Conserving Descent (ECD) and introduce E...

02/19/2018 — On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization
Conventional wisdom in deep learning states that increasing depth improv...

04/17/2020 — First-Order Methods for Optimal Experimental Design Problems with Bound Constraints
We consider a class of convex optimization problems over the simplex of ...
