Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions

06/15/2020
by Tesi Xiao, et al.

We analyze stochastic conditional gradient-type methods for constrained optimization problems arising in over-parametrized machine learning. We show that one can leverage the interpolation-like conditions satisfied by such models to obtain improved complexities for conditional gradient-type methods. For this class of problems, when the objective function is convex, we show that the conditional gradient method requires O(ϵ^-2) calls to the stochastic gradient oracle to find an ϵ-optimal solution. Furthermore, by including a gradient sliding step, the number of calls reduces to O(ϵ^-1.5). We also establish similar improved results in the zeroth-order setting, where only noisy function evaluations are available. Notably, these results are achieved without any variance reduction techniques, demonstrating the improved performance of vanilla versions of conditional gradient methods for over-parametrized machine learning problems.
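As a rough illustration of the "vanilla" method referred to above, the sketch below runs a plain stochastic Frank-Wolfe (conditional gradient) loop over an ℓ1-ball constraint on an interpolating least-squares problem, where the data is fit exactly at the optimum. This is a minimal sketch, not the paper's exact algorithm or analysis: the helper names (stoch_grad, lmo_l1_ball), the 2/(t+2) step size, the batch size, and the synthetic data are illustrative assumptions.

```python
import numpy as np


def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle for the l1 ball of the given radius:
    argmin_{||v||_1 <= radius} <grad, v> is a signed coordinate vertex."""
    v = np.zeros_like(grad)
    i = int(np.argmax(np.abs(grad)))
    v[i] = -radius * np.sign(grad[i])
    return v


def stochastic_frank_wolfe(stoch_grad, x0, n_iters=2000, batch_size=32, radius=1.0):
    """Vanilla stochastic conditional gradient loop (no variance reduction).

    stoch_grad(x, batch_size) must return an unbiased minibatch estimate of
    the gradient at x; lmo_l1_ball plays the role of the linear minimization
    oracle over the feasible set.
    """
    x = np.array(x0, dtype=float)
    for t in range(n_iters):
        g = stoch_grad(x, batch_size)       # one call to the stochastic gradient oracle
        v = lmo_l1_ball(g, radius)          # linear minimization oracle step
        gamma = 2.0 / (t + 2.0)             # standard open-loop step size (assumption)
        x = (1.0 - gamma) * x + gamma * v   # convex combination keeps x feasible
    return x


# Interpolating least-squares example: b = A @ x_true holds exactly, so the
# model fits every data point with zero residual (an interpolation-like setting).
rng = np.random.default_rng(0)
A = rng.normal(size=(500, 20))
x_true = np.zeros(20)
x_true[:3] = [0.5, -0.3, 0.2]               # ||x_true||_1 = 1, so x_true is feasible
b = A @ x_true


def stoch_grad(x, batch_size):
    idx = rng.integers(0, A.shape[0], size=batch_size)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch_size


x_hat = stochastic_frank_wolfe(stoch_grad, np.zeros(20))
print("estimation error:", np.linalg.norm(x_hat - x_true))
```

In the zeroth-order setting mentioned in the abstract, the stoch_grad call would instead be replaced by a gradient estimate built from noisy function evaluations (e.g., a randomized finite-difference estimator); the outer conditional gradient loop stays the same.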


Related research

08/15/2023 - Projection-Free Methods for Stochastic Simple Bilevel Optimization with Convex Lower-level Problem
In this paper, we study a class of stochastic bilevel optimization probl...

09/07/2021 - COCO Denoiser: Using Co-Coercivity for Variance Reduction in Stochastic Convex Optimization
First-order methods for stochastic optimization have undeniable relevanc...

04/22/2021 - A Dimension-Insensitive Algorithm for Stochastic Zeroth-Order Optimization
This paper concerns a convex, stochastic zeroth-order optimization (S-ZO...

09/28/2020 - Escaping Saddle-Points Faster under Interpolation-like Conditions
In this paper, we show that under over-parametrization several standard ...

06/22/2022 - Projection-free Constrained Stochastic Nonconvex Optimization with State-dependent Markov Data
We study a projection-free conditional gradient-type algorithm for const...

09/18/2021 - An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm for First-order and Zeroth-order Optimization
The conditional gradient algorithm (also known as the Frank-Wolfe algori...

04/10/2023 - First-order methods for Stochastic Variational Inequality problems with Function Constraints
The monotone Variational Inequality (VI) is an important problem in mach...
