Locally Accelerated Conditional Gradients

06/19/2019
by Alejandro Carderera, et al.

Conditional gradient methods form a class of projection-free first-order algorithms for solving smooth convex optimization problems. Apart from eschewing projections, these methods are attractive because of their simplicity, numerical performance, and the sparsity of the solutions they produce. However, they do not achieve optimal convergence rates. We present the Locally Accelerated Conditional Gradients algorithm, which relaxes the projection-freeness requirement to projections onto (typically low-dimensional) simplices only, and mixes accelerated steps with conditional gradient steps to achieve local acceleration. We derive asymptotically optimal convergence rates for this algorithm. Our experimental results demonstrate the practicality of our approach; in particular, the speedup is achieved both in wall-clock time and in per-iteration progress compared to standard conditional gradient methods and a Catalyst-accelerated Away-Step Frank-Wolfe algorithm.
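To make the projection-free idea concrete, here is a minimal sketch of the vanilla conditional gradient (Frank-Wolfe) loop that the paper builds on. This is not the paper's LaCG algorithm (which additionally mixes in accelerated steps over low-dimensional simplices); it only illustrates the baseline mechanism: each iteration calls a linear minimization oracle (LMO) instead of a projection, and feasibility is preserved by taking convex combinations. The step-size rule `2/(t+2)` and the simplex example are standard choices, not taken from this paper.

```python
import numpy as np

def frank_wolfe(grad, x0, lmo, num_iters=200):
    """Vanilla conditional gradient (Frank-Wolfe) method.

    grad: gradient oracle of the smooth convex objective f.
    lmo:  linear minimization oracle, returns argmin_{v in C} <g, v>.
    Projection-free: each step calls only the LMO, never a projection.
    """
    x = x0.copy()
    for t in range(num_iters):
        g = grad(x)
        v = lmo(g)                        # Frank-Wolfe vertex
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Example: minimize f(x) = 0.5 * ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.2, 0.7])
grad = lambda x: x - b
# LMO over the simplex: the vertex (standard basis vector) whose
# coordinate has the smallest gradient entry.
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
x_star = frank_wolfe(grad, np.array([1.0, 0.0, 0.0]), lmo)
```

Because the iterate is always a convex combination of simplex vertices after few LMO calls, the solution is naturally sparse, which is the sparsity property the abstract refers to; the known drawback is the sublinear O(1/t) rate that motivates the local acceleration studied in the paper.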


