Fluctuation-dissipation relations for stochastic gradient descent

09/28/2018
by Sho Yaida

The notion of the stationary equilibrium ensemble has played a central role in statistical mechanics. In machine learning as well, training serves as generalized equilibration that drives the probability distribution of model parameters toward stationarity. Here, we derive stationary fluctuation-dissipation relations that link measurable quantities and hyperparameters in the stochastic gradient descent algorithm. These relations hold exactly for any stationary state and can in particular be used to adaptively set the training schedule. We can further use the relations to efficiently extract information pertaining to the loss-function landscape, such as the magnitudes of its Hessian and anharmonicity. Our claims are empirically verified.
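The simplest of these relations can be made concrete for plain SGD without momentum, with update θ_{t+1} = θ_t − η ∇f_B(θ_t), where f_B is the mini-batch loss and η the learning rate. Demanding that ⟨θ·θ⟩ be stationary, ⟨‖θ − η ∇f_B‖²⟩ = ⟨‖θ‖²⟩, forces the linear and quadratic terms in η to balance, giving ⟨θ·∇f_B⟩ = (η/2)⟨‖∇f_B‖²⟩. Both sides are cheaply measurable during training, so the degree to which they agree signals how close the parameter distribution is to stationarity. Below is a minimal Python/PyTorch sketch of such a monitor that decays the learning rate once the relation is approximately satisfied; the function names, the tolerance, the warm-up length, and the decay factor are our illustrative choices, not prescriptions from the paper.

import torch

def fdr_observables(params):
    # Return (theta_dot_grad, grad_sq), summed over all parameters:
    #   theta_dot_grad = theta . grad f_B   (left-hand side observable)
    #   grad_sq        = ||grad f_B||^2     (right-hand side observable)
    # Call after loss.backward() and before optimizer.step().
    theta_dot_grad, grad_sq = 0.0, 0.0
    for p in params:
        if p.grad is None:
            continue
        theta_dot_grad += (p.detach() * p.grad).sum().item()
        grad_sq += (p.grad ** 2).sum().item()
    return theta_dot_grad, grad_sq

def train_with_fdr_schedule(model, loss_fn, loader, eta=0.1, tol=0.05):
    # Running averages over steps estimate the stationary expectations.
    # When <theta . grad f_B> agrees with (eta/2) <||grad f_B||^2> to
    # within `tol`, the state is taken to be stationary and the learning
    # rate is halved (tolerance and decay factor are arbitrary choices).
    opt = torch.optim.SGD(model.parameters(), lr=eta)
    lhs_avg, rhs_avg, n = 0.0, 0.0, 0
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        lhs, g2 = fdr_observables(model.parameters())
        n += 1
        lhs_avg += (lhs - lhs_avg) / n            # <theta . grad f_B>
        rhs_avg += (eta / 2 * g2 - rhs_avg) / n   # (eta/2) <||grad f_B||^2>
        if n > 100 and rhs_avg > 0 and abs(lhs_avg / rhs_avg - 1.0) < tol:
            eta *= 0.5                            # decay once stationary
            for group in opt.param_groups:
                group["lr"] = eta
            lhs_avg, rhs_avg, n = 0.0, 0.0, 0     # restart the averages
        opt.step()

The paper derives the general relation with momentum and dampening as well, along with higher-order relations (obtained from the stationarity of other observables) whose measurement yields the Hessian and anharmonicity information mentioned in the abstract; the sketch above covers only the momentum-free special case.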


