Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions

05/28/2021
by Alejandro Carderera, et al.

Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step size strategy γ_t = 2/(t+2), obtaining an 𝒪(1/t) convergence rate for this class of functions in terms of both the primal gap and the Frank-Wolfe gap, where t is the iteration count. This avoids the second-order information or the estimates of local smoothness parameters required by previous work. We also show improved convergence rates for several common cases, e.g., when the feasible region under consideration is uniformly convex or polyhedral.
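The open-loop step size from the abstract drops into the standard Frank-Wolfe template unchanged. Below is a minimal sketch of vanilla Frank-Wolfe with γ_t = 2/(t+2); the probability-simplex linear minimization oracle and the toy quadratic objective are illustrative assumptions, not the paper's generalized self-concordant setting:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=2000):
    """Vanilla Frank-Wolfe over the probability simplex using the
    open-loop step size gamma_t = 2 / (t + 2)."""
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        # Linear minimization oracle over the simplex: the best
        # vertex is the coordinate with the smallest gradient entry.
        i = int(np.argmin(g))
        v = np.zeros_like(x)
        v[i] = 1.0
        fw_gap = float(g @ (x - v))   # Frank-Wolfe (dual) gap
        gamma = 2.0 / (t + 2)         # open-loop step size from the abstract
        x = x + gamma * (v - x)
    return x, fw_gap

# Toy problem: project b onto the simplex, f(x) = 0.5 * ||x - b||^2,
# so grad f(x) = x - b and the minimizer is b itself (b lies in the simplex).
b = np.array([0.2, 0.5, 0.3])
x_star, fw_gap = frank_wolfe_simplex(lambda x: x - b,
                                     np.array([1.0, 0.0, 0.0]))
```

No line search or smoothness estimate is needed: the step size depends only on the iteration counter, which is exactly what makes the strategy "open-loop."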


