A modified limited memory Nesterov's accelerated quasi-Newton

12/01/2021
by S. Indrapriyadarsini, et al.

Nesterov's accelerated quasi-Newton (L)NAQ method has been shown to accelerate the conventional (L)BFGS quasi-Newton method by using Nesterov's accelerated gradient in several neural network (NN) applications. However, computing two gradients per iteration increases the computational cost. The Momentum accelerated Quasi-Newton (MoQ) method showed that Nesterov's accelerated gradient can be approximated as a linear combination of past gradients, requiring only one gradient evaluation per iteration. This abstract extends the MoQ approximation to limited memory NAQ and evaluates its performance on a function approximation problem.
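The core idea can be made concrete with a short sketch. The following is a minimal illustration, not the authors' implementation: it assumes the MoQ approximation ∇f(w_k + μv_k) ≈ (1 + μ)∇f(w_k) − μ∇f(w_{k−1}) (exact on quadratics, since v_k = w_k − w_{k−1}), plugs the approximated gradient into the standard L-BFGS two-loop recursion, and uses a fixed step size where a line search would normally be used. The names l_moq and two_loop, and the curvature-pair bookkeeping, are hypothetical.

import numpy as np

def two_loop(q, s_hist, y_hist):
    # Standard L-BFGS two-loop recursion: approximates H @ q, where H is the
    # inverse Hessian built from the stored (s, y) curvature pairs.
    q = q.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_hist:  # standard initial scaling gamma = s'y / y'y
        q *= (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1])
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
        q += (a - rho * (y @ q)) * s
    return q

def l_moq(f_grad, w, mu=0.8, alpha=0.5, m=8, iters=200):
    # Hypothetical limited-memory MoQ sketch: the gradient at the Nesterov
    # look-ahead point w + mu*v is approximated from the current and previous
    # gradients, so each iteration needs only one gradient evaluation.
    v = np.zeros_like(w)
    g_prev = None            # gradient at the previous iterate
    s_hist, y_hist = [], []
    pending = None           # (s, g_hat) pair waiting for the next gradient
    for _ in range(iters):
        g = f_grad(w)        # the single gradient evaluation per iteration
        if pending is not None:
            s, g_hat_old = pending
            y = g - g_hat_old
            if s @ y > 1e-10:            # keep only curvature-positive pairs
                s_hist.append(s); y_hist.append(y)
                if len(s_hist) > m:      # limited memory: drop the oldest pair
                    s_hist.pop(0); y_hist.pop(0)
        # MoQ approximation: grad f(w + mu*v) ~ (1 + mu)*g - mu*g_prev
        g_hat = (1.0 + mu) * g - mu * (g if g_prev is None else g_prev)
        d = -two_loop(g_hat, s_hist, y_hist)   # quasi-Newton direction
        w_nes = w + mu * v                     # Nesterov look-ahead point
        w_new = w_nes + alpha * d              # fixed alpha for simplicity
        pending = (w_new - w_nes, g_hat)
        v, g_prev, w = w_new - w, g, w_new
    return w

A quick sanity check on a least-squares function approximation problem (illustrative only):

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 10)) / np.sqrt(40)
b = A @ np.ones(10)                      # known minimizer: the all-ones vector
w = l_moq(lambda w: A.T @ (A @ w - b), np.zeros(10))
print(np.linalg.norm(w - np.ones(10)))   # error should be close to zero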

Related research

A Stochastic Variance Reduced Nesterov's Accelerated Quasi-Newton Method (10/17/2019)
Recently algorithms incorporating second order curvature information hav...

Towards Real-time Simulation of Hyperelastic Materials (04/25/2016)
We present a new method for real-time physics-based simulation supportin...

Probabilistic Interpretation of Linear Solvers (02/10/2014)
This manuscript proposes a probabilistic framework for algorithms that i...

Implementation of a modified Nesterov's Accelerated quasi-Newton Method on Tensorflow (10/21/2019)
Recent studies incorporate Nesterov's accelerated gradient method for th...

Quantum speedups of some general-purpose numerical optimisation algorithms (04/14/2020)
We give quantum speedups of several general-purpose numerical optimisati...

An Adaptive Stochastic Nesterov Accelerated Quasi Newton Method for Training RNNs (09/09/2019)
A common problem in training neural networks is the vanishing and/or exp...

Provably Convergent Plug-and-Play Quasi-Newton Methods (03/09/2023)
Plug-and-Play (PnP) methods are a class of efficient iterative methods t...
