Support estimation in high-dimensional heteroscedastic mean regression

11/03/2020
by Philipp Hermann, et al.

A current strand of research in high-dimensional statistics deals with robustifying the available methodology against deviations from the pervasive light-tail assumptions. In this paper we consider a linear mean regression model with random design and potentially heteroscedastic, heavy-tailed errors, and investigate support estimation in this framework. We use a strictly convex, smooth variant of the Huber loss function whose tuning parameter depends on the parameters of the problem, together with the adaptive LASSO penalty for computational efficiency. For the resulting estimator we show sign consistency and optimal rates of convergence in the ℓ_∞ norm, as in the homoscedastic, light-tailed setting. In our analysis we have to deal with the issue that the support of the target parameter in the linear mean regression model and that of its robustified version may differ substantially, even for small values of the tuning parameter of the Huber loss function. Simulations illustrate the favorable numerical performance of the proposed methodology.
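The abstract describes a two-stage recipe: a smooth, strictly convex Huber-type loss combined with an adaptive LASSO penalty built from a pilot estimate. Below is a minimal sketch of that recipe, assuming a pseudo-Huber loss as a stand-in for the paper's smooth Huber variant and a smoothed ℓ_1 term for brevity; the function names, the tuning constants tau, lam, gamma, and the final thresholding rule are illustrative choices, not the paper's exact procedure.

# Sketch: Huberized adaptive-LASSO support estimation (illustrative only).
import numpy as np
from scipy.optimize import minimize

def pseudo_huber(r, tau):
    # Smooth, strictly convex surrogate for the Huber loss.
    return tau**2 * (np.sqrt(1.0 + (r / tau)**2) - 1.0)

def huber_adaptive_lasso(X, y, tau=1.0, lam=0.1, gamma=1.0):
    p = X.shape[1]

    # Stage 1: pilot estimate from a ridge-stabilized Huberized fit,
    # used only to build the adaptive LASSO weights.
    def pilot_obj(beta):
        r = y - X @ beta
        return pseudo_huber(r, tau).mean() + 1e-4 * np.sum(beta**2)

    beta0 = minimize(pilot_obj, np.zeros(p), method="L-BFGS-B").x
    weights = 1.0 / (np.abs(beta0)**gamma + 1e-8)  # adaptive weights

    # Stage 2: weighted-l1-penalized Huberized regression.  The l1 term is
    # smoothed here purely for brevity; a proximal or coordinate-descent
    # solver would be the natural choice in practice.
    def obj(beta):
        r = y - X @ beta
        l1 = np.sum(weights * np.sqrt(beta**2 + 1e-10))
        return pseudo_huber(r, tau).mean() + lam * l1

    return minimize(obj, beta0, method="L-BFGS-B").x

# Toy usage: sparse signal with heteroscedastic, heavy-tailed noise.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
noise = (1.0 + np.abs(X[:, 0])) * rng.standard_t(df=3, size=n)
y = X @ beta_true + noise

beta_hat = huber_adaptive_lasso(X, y, tau=1.5, lam=0.05)
support_hat = np.flatnonzero(np.abs(beta_hat) > 0.1)  # thresholded support
print(support_hat)

The two-stage structure mirrors the usual adaptive LASSO idea: small pilot coefficients receive large penalty weights, so their estimates are pushed to zero, while large pilot coefficients are penalized only mildly, which is what makes sign consistency attainable.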


research · 06/13/2019
Distributed High-dimensional Regression Under a Quantile Loss Function
This paper studies distributed estimation and support recovery for high-...

research · 12/07/2018
Variable selection in high-dimensional linear model with possibly asymmetric or heavy-tailed errors
We consider the problem of automatic variable selection in a linear mode...

research · 04/18/2019
Adaptive Huber Regression on Markov-dependent Data
High-dimensional linear regression has been intensively studied in the c...

research · 06/13/2018
Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models
We develop a theory for estimation of a high-dimensional sparse paramete...

research · 01/01/2015
Statistical consistency and asymptotic normality for high-dimensional robust M-estimators
We study theoretical properties of regularized robust M-estimators, appl...

research · 03/27/2016
Regularization Parameter Selection for a Bayesian Multi-Level Group Lasso Regression Model with Application to Imaging Genomics
We investigate the choice of tuning parameters for a Bayesian multi-leve...

research · 09/11/2018
Tuning metaheuristics by sequential optimization of regression models
Tuning parameters is an important step for the application of metaheuris...
