Statistical Inference with Stochastic Gradient Methods under φ-mixing Data

02/24/2023
by Ruiqi Liu, et al.

Stochastic gradient descent (SGD) is a scalable, memory-efficient optimization algorithm well suited to large datasets and streaming data, and it has attracted considerable attention. SGD-based estimators have also been applied successfully to statistical inference tasks such as interval estimation. However, most related work assumes i.i.d. observations or Markov chains; how to conduct valid statistical inference when the observations come from a mixing time series remains unexplored. Indeed, the general correlation among observations poses a challenge for interval estimation: most existing methods ignore this correlation and therefore produce invalid confidence intervals. In this paper, we propose a mini-batch SGD estimator for statistical inference when the data are φ-mixing. Confidence intervals are constructed via an associated mini-batch bootstrap SGD procedure. Using the "independent block" trick from <cit.>, we show that the proposed estimator is asymptotically normal and that its limiting distribution can be effectively approximated by the bootstrap procedure. The proposed method is memory-efficient and easy to implement in practice. Simulation studies on synthetic data and an application to a real-world dataset confirm our theory.
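To make the abstract's pipeline concrete, the following is a minimal sketch of mini-batch SGD with Polyak–Ruppert averaging, paired with a multiplier-bootstrap rerun to obtain percentile confidence intervals. This is an illustration under simplifying assumptions (a linear model with AR(1) errors standing in for dependent data, exponential multiplier weights attached to each mini-batch gradient, a 1/√k step size); it is not the paper's exact procedure, and all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model y = X @ theta_star + eps, with AR(1) errors
# to mimic serially dependent observations (illustrative stand-in for
# a phi-mixing series; the paper's setting is more general).
n, d, batch = 5000, 3, 32
theta_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
eps = np.empty(n)
eps[0] = rng.normal()
for t in range(1, n):
    eps[t] = 0.5 * eps[t - 1] + rng.normal()  # AR(1) noise
y = X @ theta_star + eps

def minibatch_sgd(X, y, batch=32, lr0=0.5, weights=None):
    """One pass of mini-batch SGD with step size lr0 / sqrt(k),
    returning the Polyak-Ruppert average of the iterates.
    `weights` (optional) are per-batch bootstrap multipliers."""
    n, d = X.shape
    theta = np.zeros(d)
    avg = np.zeros(d)
    nb = n // batch
    for k in range(nb):
        sl = slice(k * batch, (k + 1) * batch)
        grad = X[sl].T @ (X[sl] @ theta - y[sl]) / batch
        w = 1.0 if weights is None else weights[k]
        theta = theta - (lr0 / np.sqrt(k + 1)) * w * grad
        avg += (theta - avg) / (k + 1)  # running average
    return avg

theta_hat = minibatch_sgd(X, y, batch=batch)

# Bootstrap: rerun SGD with random mean-one multipliers on each
# mini-batch gradient; quantiles of the replicates give the CI.
B, nb = 200, n // batch
boot = np.array([
    minibatch_sgd(X, y, batch=batch, weights=rng.exponential(size=nb))
    for _ in range(B)
])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5], axis=0)
```

Because each bootstrap replicate is itself a single streaming SGD pass, the memory footprint stays at O(d) per replicate, which is what makes bootstrap-based SGD inference attractive for large or streaming datasets.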


