Computationally and Statistically Efficient Truncated Regression

10/22/2020
by   Constantinos Daskalakis, et al.

We provide a computationally and statistically efficient estimator for the classical problem of truncated linear regression, where the dependent variable y = w^T x + ϵ and its corresponding vector of covariates x ∈ R^k are only revealed if the dependent variable falls in some subset S ⊆ R; otherwise the existence of the pair (x, y) is hidden. This problem has remained a challenge since the early works of [Tobin 1958, Amemiya 1973, Hausman and Wise 1977]; its applications are abundant, and its history dates back even further, to the work of Galton, Pearson, Lee, and Fisher. While consistent estimators of the regression coefficients have been identified, their error rates are not well understood, especially in high dimensions. Under a thickness assumption on the covariance matrix of the covariates in the revealed sample, we provide a computationally efficient estimator of the coefficient vector w from n revealed samples that attains l_2 error Õ(√(k/n)). Our estimator uses Projected Stochastic Gradient Descent (PSGD) without replacement on the negative log-likelihood of the truncated sample. For statistically efficient estimation, we only need oracle access to the set S. To achieve computational efficiency as well, we additionally assume that S is a union of a finite number of intervals, which may nevertheless be complicated. PSGD without replacement must be restricted to an appropriately defined convex cone to guarantee that the negative log-likelihood is strongly convex, which in turn is established using concentration of matrices on variables with sub-exponential tails. We perform experiments on simulated data to illustrate the accuracy of our estimator. As a corollary, we show that SGD learns the parameters of single-layer neural networks with noisy activation functions.
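The approach described in the abstract can be illustrated with a minimal simulation. The sketch below is not the paper's implementation: it makes the simplifying assumptions that the noise variance is 1, that the truncation set is S = [0, ∞), and that the projection is onto a Euclidean ball rather than the paper's convex cone. The only substantive ingredient taken from the setup is the per-sample gradient of the truncated negative log-likelihood, which works out to (E[z | z ~ N(w·x, 1), z ∈ S] − y)·x, i.e. the usual residual with the truncated-conditional mean in place of the linear prediction.

```python
# Illustrative sketch of PSGD without replacement on the truncated Gaussian
# negative log-likelihood. Constants (sample size, step schedule, ball radius)
# are arbitrary choices for the demo, not values from the paper.
import math
import numpy as np

rng = np.random.default_rng(0)

def Phi(t):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def phi(t):
    """Standard normal PDF."""
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def truncated_mean(mu, a=0.0):
    """Mean of N(mu, 1) conditioned on the truncation set S = [a, inf)."""
    alpha = a - mu
    mass = max(1.0 - Phi(alpha), 1e-12)   # survival probability P(y in S)
    return mu + phi(alpha) / mass

# Simulate truncation: a pair (x, y) is revealed only if y falls in S.
k, n = 3, 5000
w_true = np.array([1.0, -0.5, 0.8])
xs, ys = [], []
while len(ys) < n:
    x = rng.normal(size=k)
    y = x @ w_true + rng.normal()
    if y >= 0.0:                          # y in S = [0, inf): sample revealed
        xs.append(x)
        ys.append(y)
X, Y = np.array(xs), np.array(ys)

# PSGD without replacement on the revealed samples.
w = np.zeros(k)
radius = 5.0                              # projection ball (cone stand-in)
t = 0
for epoch in range(15):
    for i in rng.permutation(n):          # one pass per epoch, no replacement
        t += 1
        mu = X[i] @ w
        grad = (truncated_mean(mu) - Y[i]) * X[i]
        w -= (2.0 / (t + 10.0)) * grad    # 1/t step size for strong convexity
        norm = np.linalg.norm(w)
        if norm > radius:
            w *= radius / norm            # projection step
print(np.linalg.norm(w - w_true))         # l_2 error of the estimate
```

Note the bias the estimator corrects: ordinary least squares on the revealed pairs alone is inconsistent here, because conditioning on y ∈ S shifts the conditional mean of y away from w·x; the truncated-conditional mean in the gradient accounts for exactly that shift.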


