Stochastic Trust Region Inexact Newton Method for Large-scale Machine Learning

12/26/2018
by Vinod Kumar Chauhan, et al.

Stochastic approximation methods are currently one of the major research directions for large-scale machine learning problems. The focus is now shifting from stochastic first-order methods to stochastic second-order methods because of their faster convergence. In this paper, we propose a novel Stochastic Trust RegiOn inexact Newton method, called STRON, which uses the conjugate gradient (CG) method to solve the trust region subproblem. The method uses progressive subsampling in the calculation of the gradient and Hessian values to take advantage of both the stochastic approximation and full-batch regimes. We extend STRON with existing variance reduction techniques to deal with noisy gradients, and with preconditioned conjugate gradient (PCG) as the subproblem solver. We further extend STRON to solve SVM. Finally, the theoretical results establish superlinear convergence for STRON, and the empirical results demonstrate the efficacy of the proposed method against existing methods on benchmark datasets.
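To make the idea concrete, the following is a minimal sketch of one stochastic trust-region inexact-Newton step in the spirit of the abstract: the trust-region subproblem is solved inexactly with Steihaug-CG, and the Hessian is accessed only through subsampled Hessian-vector products. This is not the authors' implementation; the l2-regularized logistic-regression objective, the batch-size parameters, and all function names are illustrative assumptions.

```python
# Sketch of a stochastic trust-region inexact-Newton step (STRON-like).
# Objective, batch sizes, and names are assumptions, not the paper's code.
import numpy as np

def logistic_loss_grad(w, X, y, lam):
    """Loss and gradient of l2-regularized logistic regression on a batch (y in {-1,+1})."""
    z = X @ w
    loss = np.mean(np.log1p(np.exp(-y * z))) + 0.5 * lam * (w @ w)
    p = 1.0 / (1.0 + np.exp(y * z))            # sigmoid(-y * z)
    grad = -X.T @ (y * p) / len(y) + lam * w
    return loss, grad

def hessian_vec(w, X, y, lam, v):
    """Subsampled Hessian-vector product H v (no explicit Hessian is formed)."""
    s = 1.0 / (1.0 + np.exp(-(X @ w)))
    d = s * (1.0 - s)                          # per-sample curvature weights
    return X.T @ (d * (X @ v)) / len(y) + lam * v

def _to_boundary(p, d, radius):
    """Return p + tau*d with ||p + tau*d|| = radius and tau >= 0."""
    a, b, c = d @ d, 2 * (p @ d), p @ p - radius**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p + tau * d

def steihaug_cg(grad, hvp, radius, tol=1e-6, max_iter=50):
    """Inexactly minimize m(p) = g^T p + 0.5 p^T H p subject to ||p|| <= radius."""
    p = np.zeros_like(grad)
    if np.linalg.norm(grad) < tol:
        return p
    r, d = grad.copy(), -grad.copy()
    for _ in range(max_iter):
        Hd = hvp(d)
        dHd = d @ Hd
        if dHd <= 0:                           # negative curvature: follow d to the boundary
            return _to_boundary(p, d, radius)
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= radius:   # step would leave the trust region
            return _to_boundary(p, d, radius)
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p

def trust_region_newton_step(w, X, y, lam, radius, batch_grad, batch_hess, rng):
    """One step with subsampled gradient/Hessian; the caller grows the batch sizes over time."""
    ig = rng.choice(len(y), size=batch_grad, replace=False)
    ih = rng.choice(len(y), size=batch_hess, replace=False)
    loss, g = logistic_loss_grad(w, X[ig], y[ig], lam)
    hvp = lambda v: hessian_vec(w, X[ih], y[ih], lam, v)
    p = steihaug_cg(g, hvp, radius)
    pred = -(g @ p + 0.5 * (p @ hvp(p)))       # model (predicted) decrease
    new_loss, _ = logistic_loss_grad(w + p, X[ig], y[ig], lam)
    rho = (loss - new_loss) / max(pred, 1e-12) # actual vs. predicted decrease
    if rho > 0.75 and np.linalg.norm(p) > 0.8 * radius:
        radius *= 2.0                          # model is trustworthy: expand region
    elif rho < 0.25:
        radius *= 0.5                          # poor agreement: shrink region
    if rho > 1e-4:                             # accept the step only if it helps
        w = w + p
    return w, radius
```

A driver loop would start with small values of batch_grad and batch_hess and grow them toward the full dataset across iterations, which is the progressive-subsampling idea described in the abstract; variance-reduced gradients or a PCG subproblem solver would slot in where the plain gradient and steihaug_cg calls appear above.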
