Fast Rates for Online Gradient Descent Without Strong Convexity via Hoffman's Bound

02/13/2018
by Dan Garber, et al.

Hoffman's classical result bounds the distance of a point from a convex and compact polytope in terms of the magnitude by which the point violates the defining constraints. Several recent results showed that Hoffman's bound can be used to derive strongly-convex-like rates for first-order methods for convex optimization of curved, though not strongly convex, functions over polyhedral sets. In this work, we use this classical result for the first time to obtain faster rates for online convex optimization over polyhedral sets with curved convex, though not strongly convex, loss functions. In particular, we show that under several reasonable assumptions on the data, the standard Online Gradient Descent (OGD) algorithm guarantees logarithmic regret. To the best of our knowledge, the only previous algorithm to achieve logarithmic regret in the considered setting is the Online Newton Step algorithm, which requires memory quadratic in the dimension and the solution of a linear system on each iteration, greatly limiting its applicability to large-scale problems. We also show that in the corresponding stochastic convex optimization setting, Stochastic Gradient Descent achieves a convergence rate of 1/t, matching the strongly convex case.
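For readers unfamiliar with the algorithm the abstract refers to, the following is a minimal sketch of projected Online Gradient Descent with the 1/t step-size schedule behind logarithmic-regret guarantees. The box feasible set, the quadratic toy losses, and the curvature parameter alpha are illustrative assumptions of this sketch, not the paper's exact setup (the paper considers general polytopes).

import numpy as np

def project_box(y, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^d.

    In the paper's setting the feasible set is a general polytope and the
    projection would be a small QP; a box is used here only to keep the
    sketch self-contained (an illustrative assumption).
    """
    return np.clip(y, lo, hi)

def ogd(grad_fn, d, T, alpha=1.0):
    """Projected Online Gradient Descent with step sizes eta_t = 1/(alpha*t).

    This O(1/t) schedule is the one that yields O(log T) regret for strongly
    convex losses; the paper shows it also applies to curved but not strongly
    convex losses over polytopes, via Hoffman's bound. `alpha` stands in for
    the relevant curvature constant and is an assumption of this sketch.
    """
    x = np.zeros(d)
    for t in range(1, T + 1):
        g = grad_fn(x, t)                     # gradient of the loss revealed at round t
        x = project_box(x - g / (alpha * t))  # gradient step, then projection
    return x

# Toy usage: losses f_t(x) = 0.5 * ||x - z_t||^2 with random targets z_t,
# whose gradient at x is x - z_t.
rng = np.random.default_rng(0)
targets = rng.uniform(0.0, 1.0, size=(100, 5))
x_final = ogd(lambda x, t: x - targets[t - 1], d=5, T=100)
print(x_final)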


Related research

03/21/2021  Online Strongly Convex Optimization with Unknown Delays
We investigate the problem of online convex optimization with unknown de...

01/20/2013  A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization
Linear optimization is many times algorithmically simpler than non-linea...

02/25/2010  Less Regret via Online Conditioning
We analyze and evaluate an online gradient descent algorithm with adapti...

06/17/2017  Variants of RMSProp and Adagrad with Logarithmic Regret Bounds
Adaptive gradient methods have become recently very popular, in particul...

09/14/2015  Dropping Convexity for Faster Semi-definite Optimization
We study the minimization of a convex function f(X) over the set of n × n...

02/08/2020  Curvature of Feasible Sets in Offline and Online Optimization
It is known that the curvature of the feasible set in convex optimizatio...

06/15/2017  Second-Order Kernel Online Convex Optimization with Adaptive Sketching
Kernel online convex optimization (KOCO) is a framework combining the ex...
