On Scalable Inference with Stochastic Gradient Descent

07/01/2017
by Yixin Fang, et al.

In many applications involving large datasets or online updating, stochastic gradient descent (SGD) provides a scalable way to compute parameter estimates and has gained increasing popularity due to its numerical convenience and memory efficiency. While the asymptotic properties of SGD-based estimators were established decades ago, statistical inference, such as interval estimation, remains largely unexplored. Traditional resampling methods such as the bootstrap are not computationally feasible, since they require repeatedly drawing independent samples from the entire dataset. The plug-in method is not applicable when there is no explicit formula for the covariance matrix of the estimator. In this paper, we propose a scalable inferential procedure for stochastic gradient descent which, upon the arrival of each observation, updates the SGD estimate as well as a large number of randomly perturbed SGD estimates. The proposed method is easy to implement in practice. We establish its theoretical properties for a general class of models that includes generalized linear models and quantile regression models as special cases. The finite-sample performance and numerical utility are evaluated through simulation studies and two real-data applications.
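Below is a minimal sketch of the perturbation idea described in the abstract, assuming a streaming linear regression loss, exponential mean-one random weights for the perturbed replicates, and a polynomially decaying step size. The function name `perturbed_sgd_stream` and all tuning choices are illustrative assumptions, not taken from the paper, which also covers more general models such as quantile regression.

```python
import numpy as np

def perturbed_sgd_stream(data_stream, dim, n_perturb=200,
                         lr=lambda t: 0.5 / t ** 0.75, seed=0):
    """Online SGD for linear regression plus randomly perturbed replicates.

    For each incoming observation (x, y), the main SGD iterate takes the
    usual gradient step, while each of the n_perturb replicates takes the
    same step scaled by an independent mean-one random weight.  The spread
    of the replicates around the main iterate is later used for intervals.
    """
    rng = np.random.default_rng(seed)
    theta = np.zeros(dim)                     # main SGD estimate
    theta_b = np.zeros((n_perturb, dim))      # perturbed SGD estimates

    for t, (x, y) in enumerate(data_stream, start=1):
        step = lr(t)
        grad = (x @ theta - y) * x            # gradient of 0.5*(x'theta - y)^2
        theta -= step * grad

        # one independent Exp(1) weight (mean 1) per replicate
        w = rng.exponential(1.0, size=n_perturb)
        grad_b = (theta_b @ x - y)[:, None] * x[None, :]
        theta_b -= step * w[:, None] * grad_b

    return theta, theta_b


# usage: 95% componentwise intervals from the replicate spread
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_theta = np.array([1.0, -2.0, 0.5])
    stream = ((x, x @ true_theta + rng.normal())
              for x in rng.normal(size=(5000, 3)))
    theta_hat, theta_reps = perturbed_sgd_stream(stream, dim=3)
    lo, hi = np.percentile(theta_reps, [2.5, 97.5], axis=0)
    print("estimate:", theta_hat)
    print("95% CI:  ", np.column_stack([lo, hi]))
```

The replicates reuse the gradient information of the current observation and only rescale it by an independent random weight, so each arriving data point is processed once and never stored, which is what makes the interval construction feasible for streaming or very large datasets.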

