Adaptive Online Learning with Varying Norms

02/10/2020
by Ashok Cutkosky, et al.

Given any increasing sequence of norms ‖·‖_0, …, ‖·‖_{T−1}, we provide an online convex optimization algorithm that outputs points w_t in some domain W in response to convex losses ℓ_t : W → ℝ and guarantees regret R_T(u) = ∑_{t=1}^{T} ℓ_t(w_t) − ℓ_t(u) ≤ Õ(‖u‖_{T−1} √(∑_{t=1}^{T} ‖g_t‖_{t−1,⋆}^2)), where g_t is a subgradient of ℓ_t at w_t and ‖·‖_{t,⋆} denotes the dual norm of ‖·‖_t. Our method does not require tuning to the value of u and allows for arbitrary convex W. We apply this result to obtain new "full-matrix"-style regret bounds. Along the way, we provide a new examination of the full-matrix AdaGrad algorithm, suggesting a better learning rate value that improves significantly upon prior analysis. We use our new techniques to tune AdaGrad on the fly, realizing our improved bound in a concrete algorithm.
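
The abstract refers to the full-matrix AdaGrad update, which preconditions each gradient step by the inverse square root of the accumulated outer products of past subgradients. As background, here is a minimal NumPy sketch of the standard full-matrix AdaGrad update; it is not the paper's tuned variant, and `grad_fn`, `eta`, and `eps` are illustrative placeholders rather than quantities taken from the paper.

```python
import numpy as np

def full_matrix_adagrad(grad_fn, w0, T, eta=1.0, eps=1e-8):
    """Sketch of the standard full-matrix AdaGrad update (assumed textbook form,
    not the paper's improved tuning): accumulate G_t = sum_s g_s g_s^T and step
    along G_t^{-1/2} g_t."""
    w = np.asarray(w0, dtype=float)
    d = w.shape[0]
    G = np.zeros((d, d))               # running sum of outer products g_s g_s^T
    for t in range(T):
        g = grad_fn(t, w)              # subgradient of ℓ_t at w_t (user-supplied)
        G += np.outer(g, g)
        # G^{-1/2} via eigendecomposition; eps guards near-zero eigenvalues
        vals, vecs = np.linalg.eigh(G)
        G_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T
        w = w - eta * G_inv_sqrt @ g   # preconditioned gradient step
    return w
```

For instance, `full_matrix_adagrad(lambda t, w: 2 * (w - np.ones(3)), np.zeros(3), T=100)` runs the update on a fixed quadratic loss; the paper's contribution concerns choosing the step size (and, more generally, the varying norms) so that the regret bound above is achieved without such manual tuning.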

Related research

05/29/2019 · Matrix-Free Preconditioning in Online Learning
We provide an online convex optimization algorithm with regret that inte...

03/07/2017 · Online Convex Optimization with Unconstrained Domains and Losses
We propose an online convex optimization algorithm (RescaledExp) that ac...

02/27/2019 · Lipschitz Adaptivity with Multiple Learning Rates in Online Learning
We aim to design adaptive online learning algorithms that take advantage...

01/25/2019 · Surrogate Losses for Online Learning of Stepsizes in Stochastic Non-Convex Optimization
Stochastic Gradient Descent (SGD) has played a central role in machine l...

12/29/2021 · Isotuning With Applications To Scale-Free Online Learning
We extend and combine several tools of the literature to design fast, ad...

04/13/2017 · ZigZag: A new approach to adaptive online learning
We develop a novel family of algorithms for the online learning setting ...

05/23/2018 · Efficient online algorithms for fast-rate regret bounds under sparsity
We consider the online convex optimization problem. In the setting of ar...
