First-order Newton-type Estimator for Distributed Estimation and Inference

11/28/2018
by Xi Chen, et al.

This paper studies distributed estimation and inference for a general statistical problem with a convex, possibly non-differentiable loss. For computational efficiency, we restrict ourselves to stochastic first-order optimization, which enjoys low per-iteration complexity. To motivate the proposed method, we first investigate the theoretical properties of a straightforward Divide-and-Conquer Stochastic Gradient Descent (DC-SGD) approach. Our theory shows that DC-SGD imposes a restriction on the number of machines, and that this restriction becomes more stringent as the dimension p grows. To overcome this limitation, this paper proposes a new multi-round distributed estimation procedure that approximates the Newton step using only stochastic subgradients. The key component of our method is a computationally efficient estimator of Σ^-1 w, where Σ is the population Hessian matrix and w is any given vector. Instead of estimating Σ (or Σ^-1), which usually requires second-order differentiability of the loss, the proposed First-Order Newton-type Estimator (FONE) directly estimates the vector of interest Σ^-1 w as a whole and is applicable to non-differentiable losses. Our estimator also facilitates inference for the empirical risk minimizer: the key term in the limiting covariance has the form Σ^-1 w, which can be estimated by FONE.
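The abstract does not spell out the algorithm, but the core idea it describes, estimating Σ^-1 w with only first-order information, can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy version, not the paper's exact FONE procedure: it approximates Hessian-vector products Σz by differencing stochastic gradients, (g(θ + δz) − g(θ))/δ ≈ Σz, and runs the fixed-point iteration z ← z − η(Σz − w), averaging the tail iterates. All names (`fone`, `stoch_grad`) and tuning constants are illustrative choices; a smooth logistic loss stands in for the paper's general convex loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression setup. Sigma below is the Hessian of the
# empirical logistic loss at a fixed point theta.
n, p = 2000, 5
X = rng.normal(size=(n, p))
theta = 0.1 * rng.normal(size=p)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ theta))).astype(float)

def stoch_grad(th, idx):
    """Stochastic (mini-batch) gradient of the logistic loss."""
    Xi, yi = X[idx], y[idx]
    return Xi.T @ (1.0 / (1.0 + np.exp(-Xi @ th)) - yi) / len(idx)

def fone(th, w, eta=0.5, delta=1e-3, batch=200, iters=4000):
    """Sketch: approximate Sigma^{-1} w using only stochastic gradients."""
    z = np.zeros_like(w)
    avg, count = np.zeros_like(w), 0
    for t in range(iters):
        idx = rng.integers(0, n, size=batch)
        # Hessian-vector product from a gradient difference:
        # (g(th + delta*z) - g(th)) / delta  ~  Sigma z
        hz = (stoch_grad(th + delta * z, idx) - stoch_grad(th, idx)) / delta
        z -= eta * (hz - w)
        if t >= iters // 2:  # average the tail iterates to damp noise
            avg += z
            count += 1
    return avg / count

w = rng.normal(size=p)
z_hat = fone(theta, w)

# Sanity check: compare with a direct solve against the empirical Hessian.
prob = 1.0 / (1.0 + np.exp(-X @ theta))
H = (X * (prob * (1 - prob))[:, None]).T @ X / n
z_exact = np.linalg.solve(H, w)
print(np.linalg.norm(z_hat - z_exact) / np.linalg.norm(z_exact))
```

Note that the gradient-difference step never forms Σ or its inverse, which is what makes this style of estimator usable when second derivatives are unavailable; for a non-differentiable loss, the gradients above would be replaced by subgradients, as the abstract indicates.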


Related research:

07/09/2019  A Stochastic First-Order Method for Ordered Empirical Risk Minimization
We propose a new stochastic first-order method for empirical risk minimi...

07/01/2017  On Scalable Inference with Stochastic Gradient Descent
In many applications involving large dataset or online updating, stochas...

05/23/2018  Approximate Newton-based statistical inference using only stochastic gradients
We present a novel inference framework for convex empirical risk minimiz...

02/12/2021  Newton Method over Networks is Fast up to the Statistical Precision
We propose a distributed cubic regularization of the Newton method for s...

11/29/2018  Distributed Inference for Linear Support Vector Machine
The growing size of modern data brings many new challenges to existing s...

10/15/2022  Distributed Estimation and Inference for Semi-parametric Binary Response Models
The development of modern technology has enabled data collection of unpr...

02/20/2021  Estimation and Inference by Stochastic Optimization: Three Examples
This paper illustrates two algorithms designed in Forneron & Ng (2020)...
