Statistical Inference for Online Learning and Stochastic Approximation via Hierarchical Incremental Gradient Descent

02/13/2018
by Weijie Su, et al.

Stochastic gradient descent (SGD) is an immensely popular approach for online learning in settings where data arrive in a stream or data sizes are very large. However, despite an ever-increasing volume of work on SGD, much less is known about the statistical inferential properties of SGD-based predictions. Taking a fully inferential viewpoint, this paper introduces a novel procedure termed HiGrad to conduct statistical inference for online learning, without incurring additional computational cost compared with SGD. The HiGrad procedure begins by performing SGD updates for a while and then splits the single thread into several threads, each of which may split again, proceeding hierarchically in this fashion. With predictions provided by multiple threads in place, a t-based confidence interval is constructed by decorrelating the predictions using the covariance structure given by the Ruppert--Polyak averaging scheme. Under certain regularity conditions, the HiGrad confidence interval is shown to attain asymptotically exact coverage probability. Finally, the performance of HiGrad is evaluated through extensive simulation studies and a real data example. An R package, higrad, has been developed to implement the method.
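To make the thread-splitting idea concrete, here is a minimal one-level sketch in Python (the paper's implementation is the R package higrad; everything below, including the synthetic linear model, the step-size schedule, and the segment lengths, is an illustrative assumption). A shared SGD segment produces a starting point, several threads then continue independently on fresh data, each reports the prediction of its Ruppert--Polyak averaged iterates, and a t-interval is formed from the thread predictions. Note this simple sketch treats the thread predictions as independent and ignores the correlation induced by the shared segment, which is exactly what HiGrad's decorrelation step accounts for.

```python
import math
import random
import statistics

random.seed(0)

DIM = 5
THETA_STAR = [1.0] * DIM  # true parameters of the synthetic linear model


def sample():
    """Draw one observation (x, y) from y = x' theta* + N(0, 1) noise."""
    x = [random.gauss(0, 1) for _ in range(DIM)]
    y = sum(a * b for a, b in zip(x, THETA_STAR)) + random.gauss(0, 1)
    return x, y


def sgd_segment(theta, n_steps, t_start):
    """Run n_steps of least-squares SGD with step size gamma_t = 0.1 * t^(-0.55).

    Returns the final iterate and the Ruppert-Polyak (running) average
    of the iterates over this segment.
    """
    avg = [0.0] * DIM
    for k in range(1, n_steps + 1):
        t = t_start + k
        x, y = sample()
        resid = sum(a * b for a, b in zip(x, theta)) - y
        gamma = 0.1 * t ** (-0.55)
        theta = [th - gamma * resid * xi for th, xi in zip(theta, x)]
        avg = [a + (th - a) / k for a, th in zip(avg, theta)]
    return theta, avg


# Stage 1: one shared thread; then split into B parallel threads.
B, N0, N1 = 4, 2000, 2000
theta0, _ = sgd_segment([0.0] * DIM, N0, 0)

x_new = [1.0] * DIM  # prediction point; true value x_new' theta* = 5.0
preds = []
for _ in range(B):
    _, avg = sgd_segment(list(theta0), N1, N0)
    preds.append(sum(a * b for a, b in zip(x_new, avg)))

# t-based confidence interval from the B thread predictions
mean = statistics.mean(preds)
se = statistics.stdev(preds) / math.sqrt(B)
t_crit = 3.182  # 97.5% quantile of t with df = B - 1 = 3
print(f"95% CI for x_new' theta*: [{mean - t_crit * se:.3f}, {mean + t_crit * se:.3f}]")
```

In the actual HiGrad procedure the splitting recurses (each thread splits again), and the confidence interval weights the segments using the covariance structure implied by Ruppert--Polyak averaging rather than treating the threads as i.i.d.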

Related research

- On Scalable Inference with Stochastic Gradient Descent (07/01/2017)
- Statistical Inference with Stochastic Gradient Methods under φ-mixing Data (02/24/2023)
- Statistical inference using SGD (05/21/2017)
- Online Bootstrap Inference with Nonconvex Stochastic Gradient Descent Estimator (06/03/2023)
- Fast and Robust Online Inference with Stochastic Gradient Descent via Random Scaling (06/06/2021)
- Covariance Estimators for the ROOT-SGD Algorithm in Online Learning (12/02/2022)
- Towards stability and optimality in stochastic gradient descent (05/10/2015)
