Retire: Robust Expectile Regression in High Dimensions

12/11/2022
by Rebeka Man, et al.

High-dimensional data can often display heterogeneity due to heteroscedastic variance or inhomogeneous covariate effects. Penalized quantile and expectile regression methods offer useful tools to detect heteroscedasticity in high-dimensional data. The former is computationally challenging due to the non-smooth nature of the check loss, and the latter is sensitive to heavy-tailed error distributions. In this paper, we propose and study (penalized) robust expectile regression (retire), with a focus on iteratively reweighted ℓ_1-penalization, which reduces the estimation bias from ℓ_1-penalization and leads to oracle properties. Theoretically, we establish the statistical properties of the retire estimator under two regimes: (i) the low-dimensional regime in which d ≪ n; (ii) the high-dimensional regime in which s ≪ n ≪ d, with s denoting the number of significant predictors. In the high-dimensional setting, we carefully characterize the solution path of the iteratively reweighted ℓ_1-penalized retire estimator, adapted from the local linear approximation algorithm for folded-concave regularization. Under a mild minimum signal strength condition, we show that after as many as log(log d) iterations the final iterate enjoys the oracle convergence rate. At each iteration, the weighted ℓ_1-penalized convex program can be efficiently solved by a semismooth Newton coordinate descent algorithm. Numerical studies demonstrate the competitive performance of the proposed procedure compared with either non-robust or quantile regression-based alternatives.
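The abstract's recipe — an asymmetric (expectile) loss robustified by Huber-type truncation, solved under a weighted ℓ_1 penalty whose weights are refreshed from a folded-concave penalty derivative (local linear approximation) — can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' implementation: it uses plain proximal gradient descent in place of the paper's semismooth Newton coordinate descent, and SCAD-derivative weights for the reweighting step. All function names and tuning constants here are hypothetical choices.

```python
import numpy as np

def robust_expectile_grad(X, y, beta, tau=0.5, c=1.345):
    """Gradient of the Huberized asymmetric squared (expectile) loss."""
    r = y - X @ beta
    w = np.where(r < 0, 1 - tau, tau)   # expectile asymmetry weights
    psi = np.clip(r, -c, c)             # Huber score truncates large residuals
    return -(X.T @ (w * psi)) / len(y)

def retire_irw_l1(X, y, lam, tau=0.5, c=1.345, n_irw=3, n_steps=500, lr=0.1):
    """Sketch of iteratively reweighted l1-penalized robust expectile regression.

    Each stage solves a weighted-lasso problem by proximal gradient descent
    (the paper uses semismooth Newton coordinate descent instead); penalty
    weights are then refreshed via the SCAD derivative at the current
    iterate, shrinking the l1 bias on large coefficients.
    """
    n, d = X.shape
    beta = np.zeros(d)
    pen = np.full(d, lam)               # stage 0: plain l1 penalty
    a = 3.7                             # conventional SCAD constant
    for _ in range(n_irw):
        for _ in range(n_steps):
            g = robust_expectile_grad(X, y, beta, tau, c)
            z = beta - lr * g
            # soft-thresholding = proximal map of the weighted l1 penalty
            beta = np.sign(z) * np.maximum(np.abs(z) - lr * pen, 0.0)
        # local linear approximation: reweight by the SCAD derivative
        b = np.abs(beta)
        pen = np.where(b <= lam, lam, np.maximum(a * lam - b, 0.0) / (a - 1))
    return beta
```

For τ = 0.5 the loss reduces to a symmetric Huber regression; other values of τ target conditional expectiles and hence pick up heteroscedastic effects. The reweighting step is what the abstract refers to as reducing the estimation bias of plain ℓ_1-penalization: coefficients that have grown past aλ receive zero penalty in later stages.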


