Parameter-free Locally Accelerated Conditional Gradients

02/12/2021
by   Alejandro Carderera, et al.

Projection-free conditional gradient (CG) methods are the algorithms of choice for constrained optimization setups in which projections are computationally prohibitive but linear optimization over the constraint set remains computationally feasible. Unlike for projection-based methods, globally accelerated convergence rates are in general unattainable for CG. However, recent work on Locally accelerated CG (LaCG) has demonstrated that local acceleration for CG is possible in many settings of interest. The main downside of LaCG is that it requires knowledge of the smoothness and strong convexity parameters of the objective function. We remove this limitation by introducing a novel, Parameter-Free Locally accelerated CG (PF-LaCG) algorithm, for which we provide rigorous convergence guarantees. Our theoretical results are complemented by numerical experiments, which demonstrate local acceleration and showcase the practical improvements of PF-LaCG over non-accelerated algorithms, both in iteration count and in wall-clock time.
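To illustrate the projection-free idea the abstract relies on, the following is a minimal sketch of the classical conditional gradient (Frank-Wolfe) method, not of PF-LaCG itself. The example problem (a quadratic over the probability simplex) and all names are illustrative assumptions; the key point is that each step calls only a linear minimization oracle, never a projection.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_iters=200):
    """Classical conditional gradient (Frank-Wolfe) sketch over the
    probability simplex. The linear minimization oracle (LMO) returns
    the vertex e_i minimizing <grad(x), v>, so no projection onto the
    feasible set is ever computed."""
    x = x0.copy()
    for t in range(num_iters):
        g = grad(x)
        i = int(np.argmin(g))           # LMO over the simplex: best vertex
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)         # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Illustrative problem: min ||x - b||^2 over the simplex, with b chosen
# inside the simplex so the optimum is x* = b.
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b),
                             np.array([1.0, 0.0, 0.0]))
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained for free; LaCG-style methods couple such CG steps with a locally accelerated method once the iterates identify the optimal face.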


Related research

- 06/19/2019: Locally Accelerated Conditional Gradients
- 05/31/2018: On Acceleration with Noise-Corrupted Gradients
- 02/28/2018: Parametrized Accelerated Methods Free of Condition Number
- 01/07/2021: Accelerated, Optimal, and Parallel: Some Results on Model-Based Stochastic Optimization
- 11/26/2022: Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties
- 04/20/2023: On the Effects of Data Heterogeneity on the Convergence Rates of Distributed Linear System Solvers
- 09/29/2020: Projection-Free Adaptive Gradients for Large-Scale Optimization
