On the asymptotic properties of SLOPE

08/23/2019
by   Michał Kos, et al.

The Sorted L-One Penalized Estimator (SLOPE) is a relatively new convex optimization procedure for selecting predictors in large databases. In contrast to LASSO, SLOPE has been proved to be asymptotically minimax in the context of sparse high-dimensional generalized linear models. Additionally, in the case where the design matrix is orthogonal, SLOPE with the sequence of tuning parameters λ^BH, corresponding to the sequence of decaying thresholds of the Benjamini-Hochberg multiple testing correction, provably controls the False Discovery Rate (FDR) in the multiple regression model. In this article we provide new asymptotic results on the properties of SLOPE when the elements of the design matrix are iid random variables from the Gaussian distribution. Specifically, we provide conditions under which the asymptotic FDR of SLOPE based on the sequence λ^BH converges to zero and the power converges to 1. We illustrate our theoretical asymptotic results with an extensive simulation study. We also provide precise formulas describing the FDR of SLOPE under different loss functions, which sets the stage for future results on the model selection properties of SLOPE and its extensions.
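The two ingredients mentioned above can be sketched in a few lines: the Benjamini-Hochberg-based tuning sequence λ^BH_i = Φ^{-1}(1 − iq/(2p)) for a nominal FDR level q, and the sorted-ℓ1 penalty that SLOPE adds to the loss, which pairs the i-th largest λ with the i-th largest coefficient magnitude. The sketch below is illustrative only (function names and the choice q = 0.1 are ours, not from the paper), using the standard definitions of λ^BH and the sorted-ℓ1 norm.

```python
from statistics import NormalDist


def lambda_bh(p, q=0.1):
    """BH-based tuning sequence: lambda_i = Phi^{-1}(1 - i*q/(2p)), i = 1..p.

    The sequence is strictly decreasing, mirroring the decaying thresholds
    of the Benjamini-Hochberg multiple testing correction.
    """
    z = NormalDist()  # standard Gaussian, so inv_cdf is Phi^{-1}
    return [z.inv_cdf(1 - i * q / (2 * p)) for i in range(1, p + 1)]


def sorted_l1_penalty(beta, lam):
    """Sorted-L1 norm: sum_i lam_i * |beta|_(i), with |beta|_(1) >= |beta|_(2) >= ...

    The largest tuning parameter multiplies the largest coefficient magnitude,
    which is what distinguishes SLOPE's penalty from the plain LASSO penalty.
    """
    mags = sorted((abs(b) for b in beta), reverse=True)
    return sum(l * m for l, m in zip(lam, mags))


lam = lambda_bh(p=5, q=0.1)
print([round(l, 3) for l in lam])       # decreasing thresholds
print(round(sorted_l1_penalty([0.5, -2.0, 0.0, 1.0, 0.0], lam), 4))
```

Note that with a constant sequence λ_1 = … = λ_p the penalty reduces exactly to the LASSO penalty λ‖β‖_1, so SLOPE strictly generalizes LASSO through the choice of the decaying sequence.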


