Derivatives and residual distribution of regularized M-estimators with application to adaptive tuning

07/11/2021
by Pierre C. Bellec, et al.

This paper studies M-estimators with a gradient-Lipschitz loss function regularized by a convex penalty in linear models with a Gaussian design matrix and arbitrary noise distribution. A practical example is the robust M-estimator constructed with the Huber loss and the Elastic-Net penalty when the noise distribution has heavy tails. Our main contributions are three-fold. (i) We provide general formulae for the derivatives of regularized M-estimators β̂(y,X), where differentiation is taken with respect to both y and X; this reveals a simple differentiability structure shared by all convex regularized M-estimators. (ii) Using these derivatives, we characterize the distribution of the residual r_i = y_i - x_i^⊤β̂ in the intermediate high-dimensional regime where dimension and sample size are of the same order. (iii) Motivated by the distribution of the residuals, we propose a novel adaptive criterion to select the tuning parameters of regularized M-estimators. The criterion approximates the out-of-sample error up to an additive constant independent of the estimator, so that minimizing the criterion provides a proxy for minimizing the out-of-sample error. The proposed adaptive criterion requires knowledge of neither the noise distribution nor the covariance of the design. Simulated data confirm the theoretical findings regarding both the distribution of the residuals and the success of the criterion as a proxy for the out-of-sample error. Finally, our results reveal new relationships between the derivatives of β̂(y,X) and the effective degrees of freedom of the M-estimator, which are of independent interest.
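To make the motivating example concrete, the following is a minimal sketch (not the paper's code) of fitting a Huber-loss, Elastic-Net-penalized M-estimator on a Gaussian design with heavy-tailed noise in the proportional regime p/n of constant order, then computing the residuals r_i = y_i - x_i^⊤β̂ studied above. It uses scikit-learn's SGDRegressor; the dimensions, sparsity level, and tuning values are illustrative assumptions.

```python
# Minimal sketch: Huber loss + Elastic-Net penalty on a Gaussian design
# with heavy-tailed (Student-t) noise; all hyperparameters are illustrative.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
n, p = 500, 250                       # sample size and dimension of the same order
X = rng.standard_normal((n, p))       # Gaussian design matrix
beta = np.zeros(p)
beta[:10] = 1.0                       # sparse ground-truth coefficients (assumed)
noise = rng.standard_t(df=2, size=n)  # heavy-tailed noise
y = X @ beta + noise

# Huber loss with Elastic-Net penalty; alpha and l1_ratio are the tuning
# parameters that an adaptive criterion would select.
est = SGDRegressor(loss="huber", penalty="elasticnet",
                   alpha=1e-2, l1_ratio=0.5, epsilon=1.35,
                   max_iter=5000, tol=1e-8, random_state=0)
est.fit(X, y)

# Residuals r_i = y_i - x_i^T beta_hat, whose distribution the paper characterizes.
residuals = y - X @ est.coef_ - est.intercept_
```

In practice one would refit over a grid of (alpha, l1_ratio) values and compare candidates via a criterion built from these residuals, rather than via the (unobservable) out-of-sample error.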


Related research

08/26/2020 · Out-of-sample error estimate for robust M-estimators with convex penalty
A generic out-of-sample error estimate is proposed for robust M-estimato...

05/29/2019 · The cost-free nature of optimally tuning Tikhonov regularizers and other ordered smoothers
We consider the problem of selecting the best estimator among a family o...

06/16/2022 · Universality of regularized regression estimators in high dimensions
The Convex Gaussian Min-Max Theorem (CGMT) has emerged as a prominent th...

07/08/2021 · Asymptotic normality of robust M-estimators with convex penalty
This paper develops asymptotic normality results for individual coordina...

07/07/2021 · Robust Variable Selection and Estimation Via Adaptive Elastic Net S-Estimators for Linear Regression
Heavy-tailed error distributions and predictors with anomalous values ar...

04/14/2022 · Observable adjustments in single-index models for regularized M-estimators
We consider observations (X,y) from single index models with unknown lin...

07/22/2020 · On Optimal and Feasible Regularization in Linear Models in Time Series
We discuss predictive linear modeling in the presence of: (i) stochastic...
