An {l_1,l_2,l_∞}-Regularization Approach to High-Dimensional Errors-in-variables Models

12/22/2014
by Alexandre Belloni, et al.

Several new estimation methods have recently been proposed for the linear regression model with observation error in the design. Different assumptions on the data generating process have motivated different estimators and analyses. In particular, the literature has considered (1) observation errors in the design uniformly bounded by some δ̅, and (2) zero-mean, independent observation errors. Under the first assumption, the rates of convergence of the proposed estimators depend explicitly on δ̅, while the second assumption has been applied when an estimator for the second moment of the observation error is available. This work proposes and studies two new estimators which, compared to other procedures for regression models with errors in the design, exploit an additional l_∞-norm regularization. The first estimator is applicable when both (1) and (2) hold but does not require an estimator for the second moment of the observation error. The second estimator is applicable under (2) and requires an estimator for the second moment of the observation error. Importantly, we impose no assumption on the accuracy of this pilot estimator, in contrast to previously known procedures. As in the recent proposals, we allow the number of covariates to be much larger than the sample size. We establish the rates of convergence of the estimators and compare them with the bounds obtained for related estimators in the literature. These comparisons yield interesting insights into the interplay between the assumptions and the achievable rates of convergence.
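For orientation, the display below sketches one standard formalization of the errors-in-variables setup the abstract describes; the notation (w_i for the observed design row, a_i for the design error, ξ_i for the regression noise) is chosen here for illustration and is not taken from the paper itself.

\[
  y_i = x_i^{\top}\beta^{*} + \xi_i,
  \qquad
  w_i = x_i + a_i,
  \qquad
  i = 1, \dots, n,
  \qquad
  \beta^{*} \in \mathbb{R}^{p},\; p \gg n,
\]
\[
  \text{(1)}\;\; \max_{i,j} |a_{ij}| \le \bar{\delta}
  \qquad \text{or} \qquad
  \text{(2)}\;\; \mathbb{E}[a_i] = 0,\; a_i \text{ independent of } x_i .
\]

Under (1), the rates discussed in the abstract depend explicitly on the bound \bar{\delta}; under (2), the second estimator additionally uses a pilot estimate of the second moment \mathbb{E}[a_i a_i^{\top}], with no accuracy requirement imposed on that pilot estimator.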
