Nesterov's Accelerated Gradient and Momentum as approximations to Regularised Update Descent

07/07/2016
by Aleksandar Botev, et al.

We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods. As natural special cases we re-derive classical momentum and Nesterov's accelerated gradient method, lending a new intuitive interpretation to the latter algorithm. We show that a new algorithm, which we term Regularised Gradient Descent, can converge more quickly than either Nesterov's algorithm or the classical momentum algorithm.
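The abstract refers to classical momentum and Nesterov's accelerated gradient as special cases of the proposed framework. For readers unfamiliar with those baselines, the sketch below shows their standard update rules on a toy quadratic objective; it is not the paper's Regularised Update Descent derivation, and the step size `lr`, momentum coefficient `mu`, and the toy objective are illustrative choices rather than values from the paper.

```python
import numpy as np

def momentum_step(theta, v, grad_fn, lr=0.01, mu=0.9):
    """One classical (heavy-ball) momentum update."""
    v = mu * v - lr * grad_fn(theta)   # decaying accumulation of past gradients
    theta = theta + v                  # move along the accumulated velocity
    return theta, v

def nesterov_step(theta, v, grad_fn, lr=0.01, mu=0.9):
    """One Nesterov accelerated gradient update: the gradient is
    evaluated at the look-ahead point theta + mu * v, not at theta."""
    v = mu * v - lr * grad_fn(theta + mu * v)
    theta = theta + v
    return theta, v

if __name__ == "__main__":
    # Toy quadratic f(x) = 0.5 * ||x||^2, whose gradient is simply x.
    grad_fn = lambda x: x
    theta_m, v_m = np.array([5.0, -3.0]), np.zeros(2)
    theta_n, v_n = np.array([5.0, -3.0]), np.zeros(2)
    for _ in range(100):
        theta_m, v_m = momentum_step(theta_m, v_m, grad_fn)
        theta_n, v_n = nesterov_step(theta_n, v_n, grad_fn)
    print("momentum:", theta_m, "nesterov:", theta_n)
```

Both methods maintain a velocity vector; the only difference is where the gradient is evaluated, which is the distinction the paper's unifying view re-derives and interprets.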
