LOSSGRAD: automatic learning rate in gradient descent

02/20/2019
by Bartosz Wójcik, et al.

In this paper, we propose LOSSGRAD (locally optimal step-size in gradient descent), a simple, fast, and easy-to-implement algorithm that automatically adjusts the step-size of gradient descent during neural network training. Given a function f, a point x, and the gradient ∇_x f of f at x, we aim to find the step-size h that is (locally) optimal, i.e., that satisfies h = arg min_{t ≥ 0} f(x − t ∇_x f). Making use of a quadratic approximation of f along the descent direction, we show that the algorithm satisfies this condition. We show experimentally that our method is insensitive to the choice of the initial learning rate while achieving results comparable to those of other methods.
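The abstract states the optimality condition but not the update rule itself. As a rough sketch of the idea, the Python snippet below fits a one-dimensional quadratic model to f along the descent direction, using f(x), the directional derivative at x, and one probe evaluation f(x − h ∇_x f), and then moves to the model's minimizer. The function name lossgrad_step, the probe-and-refit rule, and the handling of a concave model are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def lossgrad_step(f, grad_f, x, h):
    """One descent step with a quadratically estimated step size.

    A minimal sketch of the idea in the abstract, not the authors'
    exact procedure: fit a 1-D quadratic q(t) ~ f(x - t g) through
    f(x), the directional derivative at t = 0, and one probe value
    f(x - h g), then move to the quadratic's minimizer.
    """
    g = grad_f(x)
    if not np.any(g):
        return x, h                  # stationary point: nothing to do
    fx = f(x)
    slope = -np.dot(g, g)            # d/dt f(x - t g) evaluated at t = 0
    f_trial = f(x - h * g)           # probe the current step size

    # Solve q(h) = fx + slope*h + a*h**2 = f_trial for the curvature a.
    a = (f_trial - fx - slope * h) / (h * h)
    if a > 0:
        h_new = -slope / (2.0 * a)   # minimizer of the convex quadratic
    else:
        h_new = 2.0 * h              # concave model: grow the step instead

    return x - h_new * g, h_new

# Toy usage: minimize f(x) = ||x||^2. The quadratic model is exact here,
# so the step size jumps to the optimal value after a single probe.
f = lambda x: float(np.dot(x, x))
grad_f = lambda x: 2.0 * x
x, h = np.ones(3), 0.01
for _ in range(5):
    x, h = lossgrad_step(f, grad_f, x, h)
print(x, h)  # x -> [0, 0, 0], h -> 0.5
```

Note that the sketch spends one extra function evaluation per step to calibrate the curvature; the appeal of this family of methods is that the initial h only seeds the first probe, so its choice matters little, which matches the insensitivity to the initial learning rate claimed in the abstract.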

Related research

05/27/2022
Incorporating the Barzilai-Borwein Adaptive Step Size into Subgradient Methods for Deep Network Training
In this paper, we incorporate the Barzilai-Borwein step size into gradie...

08/05/2019
Extending the step-size restriction for gradient descent to avoid strict saddle points
We provide larger step-size restrictions for which gradient descent base...

05/28/2019
Concavifiability and convergence: necessary and sufficient conditions for gradient descent analysis
Convergence of the gradient descent algorithm has been attracting renewe...

01/31/2022
Step-size Adaptation Using Exponentiated Gradient Updates
Optimizers like Adam and AdaGrad have been very successful in training l...

08/12/2013
Faster gradient descent and the efficient recovery of images
Much recent attention has been devoted to gradient descent algorithms wh...

06/02/2021
q-RBFNN: A Quantum Calculus-based RBF Neural Network
In this research a novel stochastic gradient descent based learning appr...

11/28/2017
Backprop as Functor: A compositional perspective on supervised learning
A supervised learning algorithm searches over a set of functions A → B p...
