Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping
In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and we derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method combines a special variant of accelerated Stochastic Gradient Descent (SGD) with clipping of stochastic gradients. We extend the method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without a light-tails assumption on the noise.
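The core idea described in the abstract is an accelerated SGD iteration in which each stochastic gradient is clipped to a bounded norm before being used. The following is a minimal Python sketch of such a clipped accelerated step in the spirit of clipped-SSTM; the step-size schedule, the fixed clipping level, and all names (`clip`, `clipped_accelerated_sgd`, `stoch_grad`) are illustrative assumptions, not the paper's exact algorithm or tuning.

```python
import numpy as np


def clip(g, lam):
    # Scale g down so its norm is at most lam (hypothetical helper).
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g


def clipped_accelerated_sgd(stoch_grad, x0, L, n_iters, clip_level, a=1.0):
    # Accelerated SGD loop (similar-triangles style) with clipped stochastic gradients.
    # stoch_grad(x): stochastic gradient estimate at x; L: smoothness constant;
    # clip_level: fixed clipping threshold (the paper chooses the clipping level
    # as part of its analysis; a fixed level is used here only for illustration).
    y = np.array(x0, dtype=float)
    z = y.copy()
    A = 0.0
    for k in range(n_iters):
        alpha = (k + 2) / (2.0 * a * L)      # growing step size of the accelerated scheme
        A_next = A + alpha
        x = (A * y + alpha * z) / A_next     # extrapolation point
        g = clip(stoch_grad(x), clip_level)  # clipped stochastic gradient
        z = z - alpha * g                    # gradient ("mirror-descent") step
        y = (A * y + alpha * z) / A_next     # averaging ("primal") step
        A = A_next
    return y


# Usage sketch: a quadratic objective with heavy-tailed (Student-t) gradient noise.
rng = np.random.default_rng(0)
H = np.diag([1.0, 10.0])
noisy_grad = lambda x: H @ x + rng.standard_t(df=2, size=2)
x_hat = clipped_accelerated_sgd(noisy_grad, x0=[5.0, 5.0], L=10.0,
                                n_iters=10_000, clip_level=5.0)
print(x_hat)
```

Note that only the gradient step sees the clipped gradient, while the output iterate is a weighted average of past iterates, which is what gives the scheme its acceleration; this is a sketch under the stated assumptions, not the paper's exact method.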