Online Bootstrap Inference with Nonconvex Stochastic Gradient Descent Estimator

06/03/2023
by Yanjie Zhong et al.

In this paper, we investigate the theoretical properties of stochastic gradient descent (SGD) for statistical inference in the context of nonconvex optimization problems, which have been relatively unexplored compared to convex settings. Our study is the first to establish provable inferential procedures using the SGD estimator for general nonconvex objective functions, which may contain multiple local minima. We propose two novel online inferential procedures that combine SGD and the multiplier bootstrap technique. The first procedure employs a consistent covariance matrix estimator, and we establish its error convergence rate. The second procedure approximates the limit distribution using bootstrap SGD estimators, yielding asymptotically valid bootstrap confidence intervals. We validate the effectiveness of both approaches through numerical experiments. Furthermore, our analysis yields an intermediate result: the in-expectation error convergence rate for the original SGD estimator in nonconvex settings, which is comparable to existing results for convex problems. We believe this novel finding holds independent interest and enriches the literature on optimization and statistical inference.
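The multiplier-bootstrap idea the abstract describes can be sketched in a few lines: alongside the original SGD iterate, run B perturbed SGD chains in which each stochastic gradient is reweighted by a random multiplier with mean 1 and variance 1, and read a confidence interval off the spread of the perturbed chains. The toy problem below (online estimation of a stream's mean, a convex stand-in rather than the paper's nonconvex experiments), the step-size schedule, and the Exp(1) multipliers are all illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: estimate the mean of a data stream.
theta_star = 2.0            # true parameter
n, B = 5000, 200            # stream length, number of bootstrap chains

theta = 0.0                 # original SGD iterate
boot = np.zeros(B)          # B multiplier-bootstrap SGD iterates

for t in range(1, n + 1):
    x = theta_star + rng.standard_normal()
    lr = 1.0 / t**0.6       # Robbins-Monro style step size (assumed)
    theta -= lr * (theta - x)          # gradient of 0.5 * (theta - x)^2
    # Each bootstrap chain sees the same observation, but its gradient is
    # scaled by a random multiplier with mean 1 and variance 1 (Exp(1) here).
    w = rng.exponential(1.0, size=B)
    boot -= lr * w * (boot - x)

# Percentile interval from the spread of the perturbed chains
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"estimate {theta:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```

Because the bootstrap chains reuse the incoming observations, the whole procedure is fully online: no data are stored and the per-step cost grows only linearly in B.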

Related research

- 02/24/2023: Statistical Inference with Stochastic Gradient Methods under φ-mixing Data
  Stochastic gradient descent (SGD) is a scalable and memory-efficient opt...

- 05/27/2022: Asymptotic Convergence Rate and Statistical Inference for Stochastic Sequential Quadratic Programming
  We apply a stochastic sequential quadratic programming (StoSQP) algorith...

- 08/03/2023: Online covariance estimation for stochastic gradient descent under Markovian sampling
  We study the online overlapping batch-means covariance estimator for Sto...

- 08/25/2020: PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization
  In this paper, we propose a novel stochastic gradient estimator—ProbAbil...

- 02/13/2018: Statistical Inference for Online Learning and Stochastic Approximation via Hierarchical Incremental Gradient Descent
  Stochastic gradient descent (SGD) is an immensely popular approach for o...

- 02/20/2023: High-dimensional Central Limit Theorems for Linear Functionals of Online Least-Squares SGD
  Stochastic gradient descent (SGD) has emerged as the quintessential meth...

- 12/02/2022: Covariance Estimators for the ROOT-SGD Algorithm in Online Learning
  Online learning naturally arises in many statistical and machine learnin...
