
Towards Noise-adaptive, Problem-adaptive Stochastic Gradient Descent

by Sharan Vaswani, et al.

We design step-size schemes that make stochastic gradient descent (SGD) adaptive to (i) the noise σ^2 in the stochastic gradients and (ii) problem-dependent constants. When minimizing smooth, strongly-convex functions with condition number κ, we first prove that T iterations of SGD with Nesterov acceleration and exponentially decreasing step-sizes can achieve a near-optimal Õ(exp(-T/√κ) + σ^2/T) convergence rate. Under a relaxed assumption on the noise, with the same step-size scheme and knowledge of the smoothness, we prove that SGD can achieve an Õ(exp(-T/κ) + σ^2/T) rate. In order to be adaptive to the smoothness, we use a stochastic line-search (SLS) and show (via upper and lower bounds) that SGD converges at the desired rate, but only to a neighbourhood of the solution. Next, we use SGD with an offline estimate of the smoothness and prove convergence to the minimizer. However, its convergence is slowed down in proportion to the estimation error, and we prove a lower bound justifying this slowdown. Compared to other step-size schemes, we empirically demonstrate the effectiveness of exponential step-sizes coupled with a novel variant of SLS.
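To make the scheme concrete, here is a minimal Python sketch of SGD with exponentially decreasing step-sizes, optionally combined with a backtracking stochastic line-search on the sampled loss. The decay factor α = (1/T)^(1/T), the Armijo constants, and all names (sgd_exp_sls, stochastic_armijo, sample, eta0) are illustrative assumptions; the way the line-search is damped by the exponential schedule is one plausible reading of the abstract, not necessarily the authors' exact variant.

import numpy as np

def stochastic_armijo(loss_fn, g, x, eta_max, c=0.5, beta=0.8):
    # Backtrack until the sampled-loss Armijo condition
    # loss(x - eta*g) <= loss(x) - c * eta * ||g||^2 holds.
    eta = eta_max
    fx = loss_fn(x)
    g_norm2 = float(np.dot(g, g))
    while loss_fn(x - eta * g) > fx - c * eta * g_norm2 and eta > 1e-10:
        eta *= beta
    return eta

def sgd_exp_sls(sample, x0, T, eta0=1.0):
    # Exponential decay factor: eta0 * alpha**t shrinks the step-size
    # from eta0 to roughly eta0 / T over T iterations.
    alpha = (1.0 / T) ** (1.0 / T)
    x = np.asarray(x0, dtype=float)
    for t in range(T):
        loss_fn, grad_fn = sample()          # draw one stochastic function f_i
        g = grad_fn(x)
        eta = stochastic_armijo(loss_fn, g, x, eta_max=eta0)
        x = x - (eta * alpha ** t) * g       # line-search step with exponential damping
    return x

# Toy usage: least-squares with single-sample (noisy) gradients.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
b = A @ np.ones(5) + 0.1 * rng.normal(size=100)

def sample():
    i = rng.integers(len(b))
    a_i, b_i = A[i], b[i]
    return (lambda x: 0.5 * (a_i @ x - b_i) ** 2,
            lambda x: (a_i @ x - b_i) * a_i)

x_hat = sgd_exp_sls(sample, x0=np.zeros(5), T=1000)

With knowledge of the true smoothness L, one would set eta0 = 1/L and skip the line-search; the accelerated variant discussed in the abstract would add a Nesterov momentum sequence on top of the same step-size schedule.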




Related research

Error Lower Bounds of Constant Step-size Stochastic Gradient Descent

Stochastic Gradient Descent (SGD) plays a central role in modern machine...

Barzilai-Borwein Step Size for Stochastic Gradient Descent

One of the major issues in stochastic gradient descent (SGD) methods is ...

Linear Convergence of Adaptive Stochastic Gradient Descent

We prove that the norm version of the adaptive stochastic gradient metho...

Black Box Lie Group Preconditioners for SGD

A matrix free and a low rank approximation preconditioner are proposed t...

On the Double Descent of Random Features Models Trained with SGD

We study generalization properties of random features (RF) regression in...

Random Shuffling Beats SGD after Finite Epochs

A long-standing problem in the theory of stochastic gradient descent (SG...

Escaping Saddles with Stochastic Gradients

We analyze the variance of stochastic gradients along negative curvature...