An efficient Averaged Stochastic Gauss-Newton algorithm for estimating parameters of nonlinear regression models
Nonlinear regression models are a standard tool for modeling real phenomena, with applications in machine learning, ecology, econometrics, and other fields. Estimating the parameters of such models has received considerable attention over many years. We focus here on recursive methods for estimating the parameters of nonlinear regressions. These methods, the best known of which are probably the stochastic gradient algorithm and its averaged version, make it possible to deal efficiently with massive data arriving sequentially. In practice, however, they can be very sensitive to the case where the eigenvalues of the Hessian of the functional we would like to minimize lie at different scales. To avoid this problem, we first introduce an online Stochastic Gauss-Newton algorithm. In order to improve the behavior of the estimates under poor initialization, we also introduce a new Averaged Stochastic Gauss-Newton algorithm and prove its asymptotic efficiency.
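As a rough illustration of the idea (a minimal sketch, not the authors' exact procedure), an online stochastic Gauss-Newton recursion can be implemented by streaming observations one at a time, maintaining a recursive estimate of the inverse Gauss-Newton matrix via the Sherman-Morrison rank-one update, and averaging the iterates Polyak-Ruppert style. The nonlinear model, initialization, and step scheme below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    # Illustrative nonlinear regression model: f(x; theta) = theta0 * (1 - exp(-theta1 * x))
    return theta[0] * (1.0 - np.exp(-theta[1] * x))

def grad_model(theta, x):
    # Gradient of f with respect to theta (used in place of the full Hessian,
    # as in Gauss-Newton methods)
    e = np.exp(-theta[1] * x)
    return np.array([1.0 - e, theta[0] * x * e])

theta_star = np.array([2.0, 1.5])   # true parameters generating the stream
theta = np.array([0.5, 0.5])        # deliberately poor initialization
H_inv = np.eye(2)                   # recursive inverse of the Gauss-Newton matrix
theta_bar = theta.copy()            # running (averaged) estimate

for n in range(1, 20001):
    # One new observation arriving sequentially
    x = rng.uniform(0.1, 3.0)
    y = model(theta_star, x) + 0.1 * rng.standard_normal()

    g = grad_model(theta, x)
    residual = y - model(theta, x)

    # Sherman-Morrison update of H_inv for H_n = H_{n-1} + g g^T;
    # since H_n grows like n, H_inv also plays the role of a decreasing step size
    Hg = H_inv @ g
    H_inv -= np.outer(Hg, Hg) / (1.0 + g @ Hg)

    # Stochastic Gauss-Newton step, then update of the averaged iterate
    theta = theta + (H_inv @ g) * residual
    theta_bar += (theta - theta_bar) / (n + 1)

print(theta_bar)  # the averaged estimate should approach theta_star
```

The averaging step is what the abstract's "Averaged" variant refers to: the averaged iterate is typically less sensitive to a bad starting point than the raw Gauss-Newton iterate, while retaining the fast asymptotic rate.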