On regularization methods based on Rényi's pseudodistances for sparse high-dimensional linear regression models

07/31/2020
by Elena Castilla, et al.

Several regularization methods have been considered over the last decade for sparse high-dimensional linear regression models, but the most common ones use the least squares (quadratic) or likelihood loss and hence are not robust against data contamination. Some authors have overcome the problem of non-robustness by replacing the quadratic loss with suitable loss functions based on divergence measures (e.g., the density power divergence, the gamma-divergence, etc.). In this paper we consider a loss function based on the Rényi pseudodistance jointly with non-concave penalties in order to simultaneously perform variable selection and obtain robust estimators of the parameters in a high-dimensional linear regression model of non-polynomial dimensionality. The desired oracle properties of the proposed method are derived theoretically, and its usefulness is illustrated numerically through simulations and real data examples.
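To make the ingredients of the proposal concrete, the following Python sketch combines one common empirical form of the Rényi pseudodistance loss under normal errors with the SCAD penalty (a standard non-concave penalty) and a generic derivative-free optimizer. The function names (rp_loss, scad_penalty, penalized_rp_fit), the choice of optimizer, and the tuning values are illustrative assumptions only; they are not the authors' algorithm, objective parameterization, or tuning-parameter selection.

```python
import numpy as np
from scipy.optimize import minimize

def rp_loss(params, X, y, alpha):
    """Negative Renyi-pseudodistance-type objective for a normal linear model.

    params = (beta_1, ..., beta_p, log_sigma). As alpha -> 0 the objective
    approaches the (negative) Gaussian log-likelihood; alpha > 0 downweights
    observations with large residuals, which is the source of robustness.
    (Sketch only; the paper's exact objective may be parameterized differently.)
    """
    beta, log_sigma = params[:-1], params[-1]
    sigma2 = np.exp(2.0 * log_sigma)
    resid = y - X @ beta
    # model density of each observation raised to the power alpha
    dens_alpha = (2.0 * np.pi * sigma2) ** (-alpha / 2.0) * np.exp(
        -alpha * resid**2 / (2.0 * sigma2)
    )
    # empirical term minus the model-dependent normalizing term,
    # i.e. (1/alpha) log mean(f^alpha) - (1/(1+alpha)) log integral(f^(1+alpha))
    norm_term = (2.0 * np.pi * sigma2) ** (-alpha / 2.0) * (1.0 + alpha) ** (-0.5)
    obj = (1.0 / alpha) * np.log(np.mean(dens_alpha)) \
          - (1.0 / (1.0 + alpha)) * np.log(norm_term)
    return -obj  # minimize the negative objective

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (Fan and Li, 2001) applied coordinate-wise and summed."""
    b = np.abs(beta)
    pen = np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2.0 * a * lam * b - b**2 - lam**2) / (2.0 * (a - 1.0)),
            lam**2 * (a + 1.0) / 2.0,
        ),
    )
    return pen.sum()

def penalized_rp_fit(X, y, alpha=0.5, lam=0.1):
    """Minimize RP loss + SCAD penalty with a generic optimizer.

    A crude illustration for small p; it does not reproduce the paper's
    estimation algorithm or its high-dimensional guarantees.
    """
    n, p = X.shape
    x0 = np.concatenate([np.zeros(p), [0.0]])
    fun = lambda par: rp_loss(par, X, y, alpha) + scad_penalty(par[:-1], lam)
    res = minimize(fun, x0, method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-8})
    return res.x[:-1], np.exp(res.x[-1])

# Toy usage (hypothetical data): a sparse coefficient vector recovered from clean data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = X @ np.array([2.0, 0.0, 0.0, -1.5, 0.0]) + rng.standard_normal(100)
beta_hat, sigma_hat = penalized_rp_fit(X, y, alpha=0.5, lam=0.2)
```

The design choice to mimic here is the separation of concerns: the divergence-based loss supplies robustness to contaminated observations, while the non-concave penalty supplies sparsity and (in the paper's theory) the oracle property; either component can be swapped independently in such a sketch.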

