Projection-free Online Learning over Strongly Convex Sets

10/16/2020
by   Yuanyu Wan, et al.

To efficiently solve online problems with complicated constraints, projection-free algorithms such as online Frank-Wolfe (OFW) and its variants have received significant interest recently. However, in the general case, existing projection-free algorithms only achieve a regret bound of O(T^(3/4)), which is worse than the O(√T) regret of projection-based algorithms, where T is the number of decision rounds. In this paper, we study the special case of online learning over strongly convex sets, for which we first prove that OFW enjoys a better regret bound of O(T^(2/3)) for general convex losses. The key idea is to refine the decaying step-size of the original OFW via a simple line-search rule. Furthermore, for strongly convex losses, we propose a strongly convex variant of OFW by redefining the surrogate loss function used in OFW. We show that this variant achieves a regret bound of O(T^(2/3)) over general convex sets and a better regret bound of O(√T) over strongly convex sets.
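To make the step-size refinement concrete, below is a minimal sketch of one OFW update with an exact line search on a quadratic surrogate of the kind used in the standard OFW analysis. The names lin_opt, eta, and grads, as well as the exact form of the surrogate, are illustrative assumptions and may differ from the paper's construction.

    # Sketch: one online Frank-Wolfe (OFW) step with an exact line search
    # on a quadratic surrogate. All names here are illustrative, not the
    # paper's notation.
    import numpy as np

    def ofw_step(x_t, x_1, grads, eta, lin_opt):
        """One OFW update.

        x_t     : current decision (np.ndarray)
        x_1     : initial decision, anchor of the surrogate
        grads   : list of observed gradients g_1, ..., g_t
        eta     : surrogate scaling parameter
        lin_opt : callable c -> argmin_{v in K} <c, v> (linear oracle over K)
        """
        # Gradient of the assumed quadratic surrogate
        #   F_t(x) = eta * sum_s <g_s, x> + ||x - x_1||^2
        grad_F = eta * np.sum(grads, axis=0) + 2.0 * (x_t - x_1)

        # Linear optimization over the feasible set (no projection needed)
        v_t = lin_opt(grad_F)
        d = v_t - x_t

        # Exact line search over [0, 1]; closed form since F_t is quadratic:
        #   sigma* = clip( -<grad_F, d> / (2 ||d||^2), 0, 1 )
        denom = 2.0 * np.dot(d, d)
        sigma = 0.0 if denom == 0.0 else float(np.clip(-np.dot(grad_F, d) / denom, 0.0, 1.0))

        return x_t + sigma * d

For example, over a Euclidean ball of radius r (a strongly convex set), the linear oracle is simply lin_opt = lambda c: -r * c / np.linalg.norm(c) (assuming c is nonzero), so each round costs one gradient evaluation and one oracle call, with no projection.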


Related research:
- 03/20/2021  Projection-free Distributed Online Learning with Strongly Convex Losses
- 10/05/2015  On the Online Frank-Wolfe Algorithms for Convex and Non-convex Optimizations
- 04/11/2022  Online Frank-Wolfe with Unknown Delays
- 04/15/2020  Online Multiserver Convex Chasing and Optimization
- 02/09/2017  Coordinated Online Learning With Applications to Learning User Preferences
- 05/30/2023  On Riemannian Projection-free Online Learning
- 05/08/2019  SAdam: A Variant of Adam for Strongly Convex Functions
