Online to Offline Conversions, Universality and Adaptive Minibatch Sizes

05/30/2017
by   Kfir Y. Levy, et al.

We present an approach to convex optimization that relies on a novel scheme for converting online adaptive algorithms into offline methods. In the offline optimization setting, our derived methods obtain favourable adaptive guarantees that depend on the harmonic sum of the queried gradients. We further show that our methods implicitly adapt to the objective's structure: in the smooth case, fast convergence rates are ensured without any prior knowledge of the smoothness parameter, while guarantees are still maintained in the non-smooth setting. Our approach extends naturally to the stochastic setting, yielding a lazy version of SGD (stochastic gradient descent) in which minibatch sizes are chosen adaptively according to the magnitude of the gradients, thus providing a principled approach to choosing minibatch sizes.
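To make the "adaptive minibatch" idea concrete, here is a minimal sketch of one plausible reading of the abstract: keep enlarging the minibatch while the averaged gradient estimate is small relative to the sampling noise, and take a step as soon as the gradient signal dominates. The stopping criterion, step size, and cap below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def adaptive_minibatch_sgd(grad_fn, x0, lr=0.1, threshold=1.0,
                           max_batch=64, n_steps=50, rng=None):
    """Sketch of 'lazy' SGD with gradient-magnitude-driven batch sizes.

    grad_fn(x, rng) returns one stochastic gradient estimate at x.
    The batch-growing rule is a heuristic stand-in for the paper's
    adaptive criterion: small gradients trigger larger minibatches.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        grads = [grad_fn(x, rng)]
        # Enlarge the minibatch until the averaged gradient is large
        # relative to the noise scale (~ 1/sqrt(batch size)), or the cap hits.
        while len(grads) < max_batch:
            avg = np.mean(grads, axis=0)
            if np.linalg.norm(avg) * np.sqrt(len(grads)) >= threshold:
                break
            grads.append(grad_fn(x, rng))
        x = x - lr * np.mean(grads, axis=0)
    return x

# Example: minimize f(x) = ||x||^2 / 2 with noisy gradients.
# Early on the gradients are large, so batches stay small; near the
# optimum the gradients shrink and the batch size grows automatically.
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
x_final = adaptive_minibatch_sgd(noisy_grad, [2.0, -2.0])
```

Note the design point this illustrates: batch size is a *consequence* of the observed gradient magnitudes rather than a hyperparameter fixed in advance.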


Related research

UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization (10/30/2019)
We propose a novel adaptive, accelerated algorithm for the stochastic co...

Anytime Online-to-Batch Conversions, Optimism, and Acceleration (03/03/2019)
A standard way to obtain convergence guarantees in stochastic convex opt...

Adaptive Gradient Methods for Constrained Convex Optimization (07/17/2020)
We provide new adaptive first-order methods for constrained convex optim...

Learning Under Delayed Feedback: Implicitly Adapting to Gradient Delays (06/23/2021)
We consider stochastic convex optimization problems, where several machi...

Beyond Uniform Smoothness: A Stopped Analysis of Adaptive SGD (02/13/2023)
This work considers the problem of finding a first-order stationary poin...

Adaptive Strategies in Non-convex Optimization (06/17/2023)
An algorithm is said to be adaptive to a certain parameter (of the probl...

Making SGD Parameter-Free (05/04/2022)
We develop an algorithm for parameter-free stochastic convex optimizatio...
