Stochastic Polyak Stepsize with a Moving Target

06/22/2021
by   Robert M. Gower, et al.
We propose a new stochastic gradient method that uses recorded past loss values to reduce variance. The method can be interpreted as a new stochastic variant of the Polyak stepsize that converges globally without assuming interpolation. It introduces auxiliary variables, one per data point, that track the loss of that data point. We provide a global convergence theory by showing that the method can be interpreted as a special variant of online SGD. Since the method stores only a single scalar per data point, it opens up new applications for variance reduction where memory is the bottleneck.
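To make the idea concrete, here is a minimal NumPy sketch of a per-sample Polyak-type step on a small least-squares problem. The classical stochastic Polyak stepsize sets gamma = (f_i(x) - f_i*) / ||grad f_i(x)||^2; in the sketch below the unknown optimal value f_i* is replaced by a stored scalar target per data point, updated with a simple moving average toward the observed loss. The target-update rule, the problem, and all parameter choices are illustrative assumptions, not the exact algorithm from the paper.

    import numpy as np

    # Minimal sketch (not the paper's exact algorithm): SGD on least squares with
    # a per-sample Polyak-type stepsize. Each data point i keeps one stored scalar
    # targets[i] that stands in for the unknown optimal loss f_i*; here the target
    # is nudged toward the observed loss with a moving average (an illustrative
    # choice, made only for this sketch).

    rng = np.random.default_rng(0)
    n, d = 100, 10
    A = rng.standard_normal((n, d))
    b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

    w = np.zeros(d)
    targets = np.zeros(n)      # one scalar of extra memory per data point
    beta = 0.9                 # target smoothing factor (illustrative)
    eps = 1e-12                # guards against division by a zero gradient norm

    for step in range(5000):
        i = rng.integers(n)
        residual = A[i] @ w - b[i]
        loss_i = 0.5 * residual ** 2       # f_i(w)
        grad_i = residual * A[i]           # gradient of f_i at w

        # Polyak-type stepsize with the stored target in place of f_i*.
        gamma = max(loss_i - targets[i], 0.0) / (grad_i @ grad_i + eps)
        w -= gamma * grad_i

        # Move the stored target toward the current loss (illustrative rule).
        targets[i] = beta * targets[i] + (1 - beta) * loss_i

The only extra state is the targets array, one float per sample, which is what distinguishes this kind of scheme from gradient-table variance reduction methods such as SAGA, which store a full gradient per data point.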

Related research

Variance Reduced Stochastic Gradient Descent with Neighbors (06/11/2015)
Stochastic Gradient Descent (SGD) is a workhorse in machine learning, ye...

Sketched Newton-Raphson (06/22/2020)
We propose a new globally convergent stochastic second order method. Our...

SVAG: Unified Convergence Results for SAG-SAGA Interpolation with Stochastic Variance Adjusted Gradient Descent (03/21/2019)
We analyze SVAG, a variance reduced stochastic gradient method with SAG ...

Exponentially convergent stochastic k-PCA without variance reduction (04/03/2019)
We present Matrix Krasulina, an algorithm for online k-PCA, by generaliz...

signProx: One-Bit Proximal Algorithm for Nonconvex Stochastic Optimization (07/20/2018)
Stochastic gradient descent (SGD) is one of the most widely used optimiz...

Function Value Learning: Adaptive Learning Rates Based on the Polyak Stepsize and Function Splitting in ERM (07/26/2023)
Here we develop variants of SGD (stochastic gradient descent) with an ad...

Variance reduction for effective energies of random lattices in the Thomas-Fermi-von Weizsäcker model (06/28/2019)
In the computation of the material properties of random alloys, the meth...