Self-concordant analysis of Frank-Wolfe algorithms

02/11/2020
by Pavel Dvurechensky, et al.

Projection-free optimization via variants of the Frank-Wolfe (FW) method has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than a projection, and sparsity of the iterates is preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, so the existing FW methods carry no theoretical guarantees. We use the theory of SC functions to derive a new adaptive step size for FW methods and prove a global convergence rate of O(1/k), where k is the iteration counter. If the problem admits a local linear minimization oracle, we are the first to propose a FW method with a linear convergence rate without assuming either strong convexity or a Lipschitz continuous gradient.
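To make the idea concrete, here is a minimal sketch of a Frank-Wolfe loop with an adaptive step size driven by the local (Hessian) norm, applied to a toy Poisson-type loss over the unit simplex. The objective, the data `A`, `y`, and the exact form of the step-size rule `gap / (e * (gap + e))` are illustrative assumptions for this sketch, not the paper's precise algorithm.

```python
import numpy as np

# Toy Poisson-type loss f(x) = sum_i (a_i @ x - y_i * log(a_i @ x)),
# minimized over the unit simplex, where the LMO returns a vertex.
rng = np.random.default_rng(0)
m, n = 40, 10
A = rng.uniform(0.5, 1.5, (m, n))  # positive entries keep A @ x > 0 on the simplex
y = rng.uniform(0.5, 1.5, m)

def f(x):
    z = A @ x
    return float(np.sum(z - y * np.log(z)))

def grad(x):
    z = A @ x
    return A.T @ (1.0 - y / z)

def hess_quad(x, d):
    # d^T H(x) d for this loss: sum_i y_i * (a_i @ d)^2 / (a_i @ x)^2
    z, w = A @ x, A @ d
    return float(np.sum(y * (w / z) ** 2))

x = np.full(n, 1.0 / n)
for k in range(200):
    g = grad(x)
    s = np.zeros(n)
    s[np.argmin(g)] = 1.0                    # LMO over the simplex: a vertex
    d = s - x
    gap = float(-g @ d)                      # Frank-Wolfe duality gap
    if gap < 1e-8:
        break
    e = np.sqrt(hess_quad(x, d))             # local norm ||d||_x of the direction
    alpha = min(1.0, gap / (e * (gap + e)))  # adaptive SC step (assumed form)
    x = x + alpha * d                        # convex combination stays feasible
```

Because `alpha` is clipped to [0, 1] and the update is a convex combination, every iterate stays on the simplex; the damped step shrinks automatically whenever the local curvature `e` is large, which is the mechanism that replaces a global Lipschitz constant.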


