Directional Analysis of Stochastic Gradient Descent via von Mises-Fisher Distributions in Deep Learning

09/29/2018
by Cheolhyoung Lee, et al.

Although stochastic gradient descent (SGD) is a driving force behind the recent success of deep learning, our understanding of its dynamics in high-dimensional parameter spaces is limited. In recent years, some researchers have used the stochasticity of minibatch gradients, or the signal-to-noise ratio, to better characterize the learning dynamics of SGD. Inspired by this line of work, we analyze SGD from a geometric perspective by inspecting the stochasticity of the norms and directions of minibatch gradients. We model the directional concentration of minibatch gradients with the von Mises-Fisher (VMF) distribution, and show that the directional uniformity of minibatch gradients increases over the course of SGD. We empirically verify this result on deep convolutional networks and observe that the gradient stochasticity correlates more strongly with the proposed directional uniformity than with the stochasticity of the gradient norm, suggesting that the directional statistics of minibatch gradients are a major factor behind SGD.
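The central quantity here is the concentration parameter kappa of the von Mises-Fisher distribution, whose density on the unit sphere is proportional to exp(kappa * mu^T x): a large kappa means gradient directions cluster tightly around the mean direction mu, while kappa -> 0 recovers the uniform distribution on the sphere. As a minimal sketch (not the authors' code), the snippet below estimates kappa from a collection of minibatch gradients using the widely used closed-form approximation of Banerjee et al. (2005), kappa ≈ r(d - r^2)/(1 - r^2), where r is the norm of the mean of the unit gradient directions; the function name and the synthetic data are illustrative only.

import numpy as np

def estimate_vmf_concentration(grads):
    """grads: (n_batches, d) array, one flattened minibatch gradient per row."""
    # Project each gradient onto the unit sphere to keep only its direction.
    directions = grads / np.linalg.norm(grads, axis=1, keepdims=True)
    # Mean resultant length r in [0, 1]: near 0 when directions are uniform.
    r = np.linalg.norm(directions.mean(axis=0))
    d = grads.shape[1]
    # Banerjee et al. (2005) approximation to the maximum-likelihood kappa.
    return r * (d - r**2) / (1.0 - r**2)

# Illustrative usage: 32 hypothetical gradients in a 10,000-dimensional space
# with a weak shared component, so the estimated kappa is small but nonzero.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 10_000)) + 0.05
print(estimate_vmf_concentration(grads))

Under this reading, a kappa estimate that falls over training would mirror the paper's finding that minibatch gradient directions become more uniform as SGD progresses.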


Related research

11/09/2022  Directional Privacy for Deep Learning
Differentially Private Stochastic Gradient Descent (DP-SGD) is a key met...

01/27/2017  Reinforced stochastic gradient descent for deep neural network learning
Stochastic gradient descent (SGD) is a standard optimization method to m...

12/18/2017  On the Relationship Between the OpenAI Evolution Strategy and Stochastic Gradient Descent
Because stochastic gradient descent (SGD) has shown promise optimizing n...

12/08/2020  Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics
Predicting the dynamics of neural network parameters during training is ...

04/04/2022  Deep learning, stochastic gradient descent and diffusion maps
Stochastic gradient descent (SGD) is widely used in deep learning due to...

05/21/2019  Time-Smoothed Gradients for Online Forecasting
Here, we study different update rules in stochastic gradient descent (SG...

06/15/2020  Walking in the Shadow: A New Perspective on Descent Directions for Constrained Minimization
Descent directions such as movement towards Frank-Wolfe vertices, away s...
