On the asymptotic rate of convergence of Stochastic Newton algorithms and their Weighted Averaged versions

11/19/2020
by   Claire Boyer, et al.

Most machine learning methods can be cast as the minimization of an unavailable risk function. To optimize this risk from samples arriving in a streaming fashion, we define a general stochastic Newton algorithm and its weighted averaged version. In several use cases, neither implementation requires inverting a Hessian estimate at each iteration; instead, the estimate of the inverse Hessian is updated directly, generalizing a trick introduced in [2] for the specific case of logistic regression. Under mild assumptions, such as local strong convexity at the optimum, we establish almost sure convergence and rates of convergence of the algorithms, as well as central limit theorems for the constructed parameter estimates. The unified framework considered in this paper covers linear, logistic, and softmax regressions, to name a few. Numerical experiments on simulated data provide empirical evidence of the pertinence of the proposed methods, which outperform popular competitors, particularly under bad initializations.
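The computational point at the heart of the abstract, updating the inverse Hessian estimate directly rather than inverting at each step, is typically achieved with the Sherman-Morrison formula: when each new sample contributes a rank-one term to the running Hessian estimate, its inverse can be refreshed in O(d^2) operations with no matrix inversion. The sketch below illustrates the trick on a streaming least-squares problem; it is a minimal illustration under our own assumptions (identity initialization, unit step), not the paper's exact algorithm:

```python
import numpy as np

def sherman_morrison_update(H_inv, u, v):
    """Return (A + u v^T)^{-1} given H_inv = A^{-1}, via the
    Sherman-Morrison formula -- no matrix inversion required."""
    Hu = H_inv @ u
    vH = v @ H_inv
    return H_inv - np.outer(Hu, vH) / (1.0 + v @ Hu)

# Illustrative streaming least squares (y = x^T theta* + noise): each
# sample adds a rank-one term x x^T to the running Hessian estimate,
# so its inverse can be maintained directly.
rng = np.random.default_rng(0)
d = 3
theta_star = rng.normal(size=d)
theta = np.zeros(d)
H_inv = np.eye(d)  # inverse of the running Hessian estimate (identity init)

for n in range(2000):
    x = rng.normal(size=d)
    y = x @ theta_star + 0.1 * rng.normal()
    H_inv = sherman_morrison_update(H_inv, x, x)  # Hessian += x x^T
    grad = (x @ theta - y) * x                    # per-sample gradient
    theta = theta - H_inv @ grad                  # Newton-type step
```

For least squares this recursion coincides with recursive least squares; the weighted averaged version of the paper would additionally average the iterates `theta` with suitable weights.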


Related research

09/27/2016  Exact and Inexact Subsampled Newton Methods for Optimization
04/16/2019  An efficient stochastic Newton algorithm for parameter estimation in logistic regressions
06/01/2018  Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients
10/07/2021  A Stochastic Newton Algorithm for Distributed Convex Optimization
05/28/2019  Distributed estimation of the inverse Hessian by determinantal averaging
06/23/2020  An efficient Averaged Stochastic Gauss-Newton algorithm for estimating parameters of non linear regressions models