The Adaptive τ-Lasso: Its Robustness and Oracle Properties

04/18/2023
by Emadaldin Mozafari-Majd, et al.

This paper introduces a new regularized version of the robust τ-regression estimator for analyzing high-dimensional data sets subject to gross contamination in the response variables and covariates. The resulting estimator, which we call the adaptive τ-Lasso, is robust to outliers and high-leverage points and simultaneously employs an adaptive ℓ_1-norm penalty to reduce the bias associated with large true regression coefficients; more specifically, the penalty assigns an individual weight to each regression coefficient. For a fixed number of predictors p, we show that the adaptive τ-Lasso enjoys the oracle property: it is variable-selection consistent and achieves asymptotic normality for the coefficients on the true support, as if the true support were known in advance. We then characterize its robustness via the finite-sample breakdown point and the influence function. We carry out extensive simulations comparing the adaptive τ-Lasso with other competing regularized estimators in terms of prediction and variable-selection accuracy in the presence of contamination within the response vector and regression matrix and under additive heavy-tailed noise. The simulations show that the class of τ-Lasso estimators is robust and reliable in both contaminated and uncontaminated data settings, achieving the best or close-to-best performance in many scenarios, apart from oracle estimators; no single estimator, however, uniformly dominates the others. We also validate our findings on the robustness properties through simulation experiments.
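The coefficient-specific weighting described above is the standard adaptive-Lasso device: each coefficient's ℓ_1 penalty is scaled by w_j = 1/|β̂_init,j|^γ, so large initial coefficients are penalized lightly (reducing bias) while small ones are shrunk hard toward zero. The sketch below illustrates this idea with a plain coordinate-descent solver; it is a minimal illustration only, using an ordinary least-squares initial fit rather than the robust τ-estimator of the paper, and the function name and parameters are our own.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso(X, y, lam=0.1, gamma=1.0, n_iter=200):
    """Coordinate descent for (1/(2n))||y - X b||^2 + lam * sum_j w_j |b_j|.

    Adaptive weights w_j = 1/|b_init_j|^gamma come from an initial fit
    (OLS here for illustration; the paper uses a robust tau-estimator),
    so truly large coefficients receive little shrinkage.
    """
    n, p = X.shape
    beta_init = np.linalg.lstsq(X, y, rcond=None)[0]
    w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)  # coefficient-specific weights
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j
            beta[j] = soft_threshold(z, n * lam * w[j]) / col_sq[j]
    return beta
```

On a toy sparse problem, the heavily weighted zero coefficients are set exactly to zero while the large ones are recovered with little bias, which is the behavior the oracle-property analysis formalizes.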

Related research

- 07/07/2021: Robust Variable Selection and Estimation Via Adaptive Elastic Net S-Estimators for Linear Regression. Heavy-tailed error distributions and predictors with anomalous values ar...
- 08/30/2023: Adaptive Lasso, Transfer Lasso, and Beyond: An Asymptotic Perspective. This paper presents a comprehensive exploration of the theoretical prope...
- 04/11/2020: Robust adaptive variable selection in ultra-high dimensional regression models based on the density power divergence loss. We consider the problem of simultaneous model selection and the estimati...
- 09/10/2021: Reducing bias and alleviating the influence of excess of zeros with multioutcome adaptive LAD-lasso. Zero-inflated explanatory variables are common in fields such as ecology...
- 05/17/2020: Robust subset selection. The best subset selection (or "best subsets") estimator is a classic too...
- 04/04/2011: Robust Nonparametric Regression via Sparsity Control with Application to Load Curve Data Cleansing. Nonparametric methods are widely applicable to statistical inference pro...
- 04/26/2018: GEP-MSCRA for computing the group zero-norm regularized least squares estimator. This paper concerns with the group zero-norm regularized least squares e...
