Efficient Projection-Free Online Methods with Stochastic Recursive Gradient

by Jiahao Xie et al.

This paper focuses on projection-free methods for solving smooth Online Convex Optimization (OCO) problems. Existing projection-free methods either achieve suboptimal regret bounds or have high per-iteration computational costs. To fill this gap, two efficient projection-free online methods called ORGFW and MORGFW are proposed for solving stochastic and adversarial OCO problems, respectively. By employing a recursive gradient estimator, our methods achieve optimal regret bounds (up to a logarithmic factor) while maintaining low per-iteration computational costs. Experimental results demonstrate the efficiency of the proposed methods compared with state-of-the-art alternatives.
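The core ingredients the abstract names, a recursive gradient estimator paired with a projection-free (Frank-Wolfe style) update that calls a linear minimization oracle instead of a projection, can be sketched as below. This is an illustrative sketch only: the simplex domain, the step-size and mixing-weight schedules, and the function names (`lmo_simplex`, `orgfw_sketch`) are assumptions for demonstration, not the paper's exact ORGFW algorithm.

```python
import numpy as np

def lmo_simplex(g):
    """Linear minimization oracle over the probability simplex:
    argmin_{v in simplex} <g, v> is the vertex at the most
    negative coordinate of g. No projection is ever computed."""
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

def recursive_grad(grad_fn, x_new, x_old, d_old, rho):
    """Recursive gradient estimator (SARAH/STORM-style):
    d_t = grad(x_t) + (1 - rho) * (d_{t-1} - grad(x_{t-1}))."""
    return grad_fn(x_new) + (1.0 - rho) * (d_old - grad_fn(x_old))

def orgfw_sketch(grad_fns, x0, T):
    """Projection-free online loop: each round costs one LMO call,
    with the recursive estimator standing in for an exact gradient."""
    x = x0.copy()
    d = grad_fns[0](x)
    for t in range(1, T):
        eta = 2.0 / (t + 2)   # step size (illustrative schedule)
        rho = 1.0 / (t + 1)   # mixing weight (illustrative schedule)
        v = lmo_simplex(d)    # projection-free descent direction
        x_new = x + eta * (v - x)   # convex combination stays feasible
        d = recursive_grad(grad_fns[t], x_new, x, d, rho)
        x = x_new
    return x
```

Because each iterate is a convex combination of the previous iterate and a simplex vertex, feasibility is preserved without any projection; the per-round cost is a single LMO call plus two gradient evaluations for the recursive estimator.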



