A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization with Applications

12/10/2019
by H. Chen, et al.

This paper proposes a novel stochastic version of the damped and regularized BFGS method for large-scale nonconvex optimization problems.
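The paper's full algorithm is behind the link above, so the sketch below only illustrates the general ingredients its title names: a BFGS update with Powell-style damping, a simple regularization of the stochastic curvature pair, and noisy gradients. The function name, the parameters `delta` and `damping`, and the regularization choice `y + delta * s` are illustrative assumptions, not the authors' method.

```python
import numpy as np

def damped_regularized_bfgs_update(B, s, y, delta=1e-4, damping=0.2):
    """One damped, regularized BFGS update of the Hessian approximation B.

    B       -- current positive-definite Hessian approximation (n x n)
    s       -- step x_{k+1} - x_k
    y       -- difference of (stochastic) gradients g_{k+1} - g_k
    delta   -- illustrative regularization weight for the curvature pair (assumed)
    damping -- Powell damping threshold; 0.2 is the classical choice
    """
    y = y + delta * s   # regularize the noisy curvature pair (assumed form)
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # Powell damping: blend y toward B s so that s^T r stays bounded away
    # from zero, keeping the updated B positive definite even when the
    # stochastic pair (s, y) violates the curvature condition s^T y > 0.
    theta = 1.0 if sy >= damping * sBs else (1.0 - damping) * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs
    # Standard BFGS update applied to the damped pair (s, r).
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)

# Minimal usage on a toy smooth problem with a noisy gradient oracle:
# take a quasi-Newton step, then refresh B from gradient differences.
rng = np.random.default_rng(0)
x, B, lr = rng.normal(size=5), np.eye(5), 0.1
grad = lambda x: 2 * x + np.sin(x) + 0.01 * rng.normal(size=x.size)
for _ in range(100):
    g = grad(x)
    x_new = x - lr * np.linalg.solve(B, g)
    B = damped_regularized_bfgs_update(B, x_new - x, grad(x_new) - g)
    x = x_new
```

The damping coefficient guarantees s^T r >= 0.2 s^T B s > 0, so the update never destroys positive definiteness; this is what makes BFGS-type curvature updates usable with stochastic gradients, where the raw pair (s, y) can easily fail the curvature condition.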



Related research

A Stochastic Variance Reduced Nesterov's Accelerated Quasi-Newton Method (10/17/2019)
Recently algorithms incorporating second order curvature information hav...

Regularization of Limited Memory Quasi-Newton Methods for Large-Scale Nonconvex Minimization (11/11/2019)
This paper deals with the unconstrained optimization of smooth objective...

Stochastic Newton and Quasi-Newton Methods for Large Linear Least-squares Problems (02/23/2017)
We describe stochastic Newton and stochastic quasi-Newton approaches to ...

Adaptive Sampling Quasi-Newton Methods for Derivative-Free Stochastic Optimization (10/29/2019)
We consider stochastic zero-order optimization problems, which arise in ...

Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization (06/27/2019)
We focus on minimizing nonconvex finite-sum functions that typically ari...

Stochastic quasi-Newton with adaptive step lengths for large-scale problems (02/12/2018)
We provide a numerically robust and fast method capable of exploiting th...

A Progressive Batching L-BFGS Method for Machine Learning (02/15/2018)
The standard L-BFGS method relies on gradient approximations that are no...
