Projection Efficient Subgradient Method and Optimal Nonsmooth Frank-Wolfe Method

We consider the classical setting of optimizing a nonsmooth Lipschitz continuous convex function over a convex constraint set, when having access to a (stochastic) first-order oracle (FO) for the function and a projection oracle (PO) for the constraint set. It is well known that, to achieve ϵ-suboptimality in high dimensions, Θ(ϵ^-2) FO calls are necessary. This is achieved by the projected subgradient method (PGD). However, PGD also entails O(ϵ^-2) PO calls, which may be computationally costlier than FO calls (e.g., for nuclear norm constraints). Improving this PO call complexity of PGD is largely unexplored, despite the fundamental nature of the problem and an extensive literature. We present the first such improvement. It requires only the mild assumption that the objective function, when extended to a slightly larger neighborhood of the constraint set, remains Lipschitz and accessible via FO. In particular, we introduce the MOPES method, which carefully combines Moreau-Yosida smoothing with accelerated first-order schemes. MOPES is guaranteed to find a feasible ϵ-suboptimal solution using only O(ϵ^-1) PO calls and the optimal O(ϵ^-2) FO calls. Further, if instead of a PO we only have a linear minimization oracle (LMO, à la Frank-Wolfe) for the constraint set, an extension of our method, MOLES, finds a feasible ϵ-suboptimal solution using O(ϵ^-2) LMO calls and FO calls; both match known lower bounds, resolving a question left open since White (1993). Our experiments confirm that these methods achieve significant speedups over the state of the art on a problem with costly PO and LMO calls.
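To make the decoupling of PO and FO calls concrete, here is a minimal Python sketch of the underlying principle, not the paper's exact MOPES algorithm (which additionally uses acceleration and careful step-size schedules): the constraint is replaced by a Moreau-Yosida penalty (1/2λ)·dist(x, X)², whose gradient costs one PO call per outer iteration, while several cheap FO (subgradient) steps are taken in between. The function names `subgrad_f` and `project`, and all step-size choices, are illustrative assumptions.

```python
import numpy as np

def smoothed_subgradient_method(subgrad_f, project, x0, lam=0.1,
                                outer_steps=200, inner_steps=10, lr=0.01):
    """Sketch of a Moreau-Yosida-smoothed subgradient scheme (not exact MOPES).

    Minimizes f(x) + (1/(2*lam)) * dist(x, X)^2, where the penalty's gradient
    (x - project(x)) / lam needs one PO call, reused across `inner_steps`
    FO calls. This gives roughly one PO call per `inner_steps` FO calls.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_steps):
        y = project(x)                        # one PO call per outer iteration
        for _ in range(inner_steps):          # several FO calls per PO call
            g = subgrad_f(x) + (x - y) / lam  # subgradient of penalized objective
            x = x - lr * g
    return project(x)                         # final PO call ensures feasibility

# Illustrative usage: minimize ||A x - b||_1 over the Euclidean unit ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
f_sub = lambda x: A.T @ np.sign(A @ x - b)            # FO: a subgradient of f
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x)) # PO: project onto the ball
x_hat = smoothed_subgradient_method(f_sub, proj_ball, np.zeros(5))
```

Holding the projection `y` fixed across the inner FO steps is what trades PO calls for FO calls; MOPES makes this trade with accelerated outer updates to reach the O(ϵ^-1) PO / O(ϵ^-2) FO guarantee stated above.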

