
Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions
Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step size strategy γ_t = 2/(t+2), obtaining a 𝒪(1/t) convergence rate for this class of functions in terms of both the primal gap and the Frank-Wolfe gap, where t is the iteration count. This avoids the use of second-order information or the need to estimate local smoothness parameters, as required by previous work. We also show improved convergence rates for several common cases, e.g., when the feasible region under consideration is uniformly convex or polyhedral.
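To make the method concrete, here is a minimal sketch of the open-loop Frank-Wolfe iteration with γ_t = 2/(t+2) described in the abstract. The objective (a logistic loss, which is generalized self-concordant) and the feasible region (the probability simplex, whose linear minimization oracle picks a vertex) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=200):
    """Open-loop Frank-Wolfe: x_{t+1} = x_t + gamma_t * (v_t - x_t)."""
    x = x0
    for t in range(iters):
        g = grad(x)
        v = lmo(g)                 # v_t = argmin_{v in C} <g, v>
        gamma = 2.0 / (t + 2.0)    # open-loop step size from the abstract
        x = x + gamma * (v - x)    # convex combination: stays feasible
    return x

# Illustrative problem: logistic loss f(x) = sum_i log(1 + exp(-b_i <a_i, x>))
# minimized over the probability simplex (both are our assumptions for the demo).
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 5))
b = np.sign(rng.normal(size=30))

def f(x):
    return np.sum(np.log1p(np.exp(-b * (A @ x))))

def grad(x):
    z = b * (A @ x)
    return A.T @ (-b / (1.0 + np.exp(z)))  # gradient of the logistic loss

def lmo(g):
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0          # simplex LMO: cheapest vertex of the simplex
    return v

x0 = np.ones(5) / 5.0
x = frank_wolfe(grad, lmo, x0)
```

Note that the iterate is always a convex combination of simplex vertices, so no projection step is needed; this cheap feasibility maintenance is the usual appeal of Frank-Wolfe over projected gradient methods.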