Exact and Inexact Subsampled Newton Methods for Optimization

09/27/2016
by Raghu Bollapragada et al.

The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an inexact Newton method that solves linear systems approximately using the conjugate gradient (CG) method, and that samples the Hessian and not the gradient (the gradient is assumed to be exact). We provide a complexity analysis for this method based on the properties of the CG iteration and the quality of the Hessian approximation, and compare it with a method that employs a stochastic gradient iteration instead of the CG method. We report preliminary numerical results that illustrate the performance of inexact subsampled Newton methods on machine learning applications based on logistic regression.
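To make the second method concrete, the following is a minimal sketch (not the paper's implementation) of one inexact subsampled Newton-CG step for L2-regularized logistic regression, in the spirit described above: the gradient is computed exactly, the Hessian is replaced by a subsample, and the Newton system is solved approximately with a truncated conjugate gradient loop. The names X, y, w, sample_size, cg_iters, and reg are illustrative assumptions, and labels are assumed to lie in {-1, +1}.

```python
import numpy as np

def logistic_grad(w, X, y, reg):
    """Full (exact) gradient of the regularized logistic loss, labels in {-1, +1}."""
    z = X @ w
    p = 1.0 / (1.0 + np.exp(-y * z))          # sigma(y_i * x_i^T w)
    return -X.T @ (y * (1.0 - p)) / len(y) + reg * w

def subsampled_hess_vec(w, X_S, reg):
    """Return v -> H_S v, the Hessian-vector product built from a subsample S."""
    z = X_S @ w
    p = 1.0 / (1.0 + np.exp(-z))
    d = p * (1.0 - p)                          # diagonal curvature weights
    def hv(v):
        return X_S.T @ (d * (X_S @ v)) / len(z) + reg * v
    return hv

def cg(hv, b, max_iter, tol=1e-8):
    """Plain conjugate gradient, truncated after max_iter iterations."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hv(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def newton_cg_step(w, X, y, sample_size, cg_iters, reg=1e-4, seed=None):
    """One inexact subsampled Newton-CG step (exact gradient, sampled Hessian)."""
    rng = np.random.default_rng(seed)
    S = rng.choice(len(y), size=sample_size, replace=False)  # Hessian subsample
    g = logistic_grad(w, X, y, reg)            # exact gradient
    hv = subsampled_hess_vec(w, X[S], reg)     # subsampled Hessian operator
    d = cg(hv, -g, cg_iters)                   # approximate Newton direction
    return w + d                               # unit step, for illustration only
```

In this sketch the accuracy of the step is governed by two knobs the paper's analysis is concerned with: the Hessian subsample size (quality of the curvature approximation) and the number of CG iterations (accuracy of the inexact linear solve); a line search or trust region, omitted here, would normally control the step length.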


