Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping

05/21/2020
by Eduard Gorbunov, et al.

In this paper, we propose clipped-SSTM, a new accelerated stochastic first-order method for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. The method combines a special variant of accelerated Stochastic Gradient Descent (SGD) with clipping of the stochastic gradients. We extend the method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without a light-tails assumption on the noise.
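The core ingredient the abstract describes is gradient clipping: each stochastic gradient is rescaled so its norm does not exceed a clipping level before the step is taken, which tames heavy-tailed noise. Below is a minimal sketch of clipping combined with plain SGD on a toy quadratic with Student-t (infinite-variance) noise; the function names, step size, and clipping level are illustrative choices, not the paper's clipped-SSTM algorithm or its tuned parameters.

```python
import numpy as np

def clip(g, lam):
    # Rescale g so its Euclidean norm is at most lam; the direction is preserved.
    norm = np.linalg.norm(g)
    return g if norm <= lam else g * (lam / norm)

def clipped_sgd(grad_oracle, x0, lr, lam, n_steps, rng):
    # Plain SGD where each stochastic gradient is clipped before the update.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - lr * clip(grad_oracle(x, rng), lam)
    return x

# Toy problem: f(x) = 0.5 * ||x||^2 with additive Student-t (df=2) noise,
# which has infinite variance -- the heavy-tailed regime the paper targets.
rng = np.random.default_rng(0)
oracle = lambda x, rng: x + rng.standard_t(df=2, size=x.shape)
x_final = clipped_sgd(oracle, x0=[10.0, -10.0], lr=0.05, lam=1.0,
                      n_steps=2000, rng=rng)
```

Without clipping, a single heavy-tailed noise realization can throw the iterate arbitrarily far; with clipping, every step has norm at most `lr * lam`, so the iterates contract toward the optimum and then stay in a small noise ball around it.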


Related research

06/10/2021 · Near-Optimal High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise
Thanks to their practical efficiency and random nature of the data, stoc...

07/25/2023 · High Probability Analysis for Non-Convex Stochastic Optimization with Clipping
Gradient clipping is a commonly used technique to stabilize the training...

08/25/2021 · Heavy-tailed Streaming Statistical Estimation
We consider the task of heavy-tailed statistical estimation given stream...

08/17/2022 · High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise
In this work we study high probability bounds for stochastic subgradient...

05/24/2021 · Robust learning with anytime-guaranteed feedback
Under data distributions which may be heavy-tailed, many stochastic grad...

06/28/2021 · High-probability Bounds for Non-Convex Stochastic Optimization with Heavy Tails
We consider non-convex stochastic optimization using first-order algorit...

07/05/2019 · Algorithms of Robust Stochastic Optimization Based on Mirror Descent Method
We propose an approach to construction of robust non-Euclidean iterative...
