
High-dimensional inference robust to outliers with l1-norm penalization
This paper studies inference in the high-dimensional linear regression model with outliers. Sparsity constraints are imposed on the vector of coefficients of the covariates. The number of outliers can grow with the sample size while their proportion goes to 0. We propose a two-step procedure for inference on the coefficients of a fixed subset of regressors. The first step is based on several square-root lasso l1-norm penalized estimators, while the second step is the ordinary least squares estimator applied to a well-chosen regression. We establish asymptotic normality of the two-step estimator. The proposed procedure is efficient in the sense that it attains the semiparametric efficiency bound when applied to the model without outliers under homoscedasticity. This approach is also computationally advantageous: it amounts to solving a finite number of convex optimization programs.
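The two-step idea above can be sketched in a few lines. This is a minimal illustration, not the paper's exact procedure: it uses scikit-learn's ordinary lasso as a stand-in for the square-root lasso, flags outliers by thresholding pilot residuals at a MAD-based cutoff (an assumed rule for illustration), and then runs OLS on the cleaned sample restricted to the fixed subset of regressors of interest.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [1.0, -2.0, 0.5]          # sparse coefficient vector
y = X @ beta + rng.standard_normal(n)
y[:5] += 10.0                        # contaminate a few observations

# Step 1: l1-penalized pilot fit (ordinary lasso here, as a stand-in for
# the square-root lasso used in the paper) to flag outliers via residuals.
pilot = Lasso(alpha=0.1).fit(X, y)
resid = y - pilot.predict(X)
# MAD-based 3-sigma cutoff (an illustrative choice, not the paper's rule)
sigma_hat = np.median(np.abs(resid)) / 0.6745
keep = np.abs(resid) < 3 * sigma_hat

# Step 2: OLS on the cleaned sample, restricted to the fixed subset of
# regressors on which inference is sought.
S = [0, 1, 2]
beta_hat, *_ = np.linalg.lstsq(X[keep][:, S], y[keep], rcond=None)
print(beta_hat)
```

Under this simulation the contaminated observations have pilot residuals near 10 and are removed, so the second-step OLS estimate lands close to the true values (1.0, -2.0, 0.5); its asymptotic normality is what the paper establishes for the actual two-step estimator.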