Variable selection in convex quantile regression: L1-norm or L0-norm regularization?

07/07/2021
by Sheng Dai, et al.

The curse of dimensionality is a recognized challenge in nonparametric estimation. This paper develops a new L0-norm regularization approach to convex quantile and expectile regression for subset variable selection. We show how to use mixed integer programming to solve the proposed L0-norm regularization approach in practice and establish a link to the commonly used L1-norm regularization approach. A Monte Carlo study compares the finite-sample performance of the proposed L0-penalized convex quantile and expectile regression approaches with that of their L1-norm counterparts. The proposed approach is further applied to benchmark the sustainable development performance of the OECD countries and to empirically assess the accuracy of the resulting dimensionality reduction. The results of the simulations and the application show that the proposed L0-norm regularization approach addresses the curse of dimensionality in multidimensional spaces more effectively than the L1-norm regularization approach.
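The abstract does not reproduce the mixed integer programming formulation itself. As a rough illustration of how an L0-norm (cardinality) constraint is commonly embedded in a convex quantile regression problem via binary selection variables and a big-M linking constraint, the Python sketch below sets up such a MILP with cvxpy. The function name l0_cqr, the big-M constant big_m, and the cardinality bound k are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp

def l0_cqr(X, y, tau=0.5, k=2, big_m=100.0):
    """Sketch of L0-penalized convex quantile regression as a MILP (illustrative)."""
    n, d = X.shape
    alpha = cp.Variable(n)                     # observation-specific intercepts
    beta = cp.Variable((n, d), nonneg=True)    # observation-specific slopes (monotonicity)
    ep = cp.Variable(n, nonneg=True)           # positive deviations
    em = cp.Variable(n, nonneg=True)           # negative deviations
    delta = cp.Variable(d, boolean=True)       # binary variable-selection indicators

    # Regression equation: y_i = alpha_i + beta_i' x_i + ep_i - em_i
    cons = [alpha + cp.sum(cp.multiply(beta, X), axis=1) + ep - em == y]

    # Afriat (concavity) constraints: each fitted hyperplane lies below all others at x_i
    for i in range(n):
        for h in range(n):
            if i != h:
                cons.append(alpha[i] + beta[i] @ X[i] <= alpha[h] + beta[h] @ X[i])

    # Big-M link between slopes and selection indicators, plus cardinality bound (L0 norm <= k)
    for j in range(d):
        cons.append(beta[:, j] <= big_m * delta[j])
    cons.append(cp.sum(delta) <= k)

    # Asymmetric absolute-deviation objective of quantile regression
    obj = cp.Minimize(tau * cp.sum(ep) + (1 - tau) * cp.sum(em))
    prob = cp.Problem(obj, cons)
    prob.solve(solver=cp.GLPK_MI)  # any MILP-capable solver installed for cvxpy works
    return beta.value, delta.value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(1, 10, size=(30, 3))       # third input is irrelevant in this toy DGP
    y = X[:, 0] ** 0.4 * X[:, 1] ** 0.3 + rng.normal(0, 0.5, 30)
    beta_hat, delta_hat = l0_cqr(X, y, tau=0.5, k=2)
    print("selected inputs:", delta_hat)
```

In this sketch the binary vector delta indicates which inputs are retained, and the big-M constraint forces the slopes of any unselected input to zero across all observations; an L1-norm alternative would instead penalize the sum of the slope coefficients directly in the objective.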

Related research

09/04/2015
l1-norm Penalized Orthogonal Forward Regression
A l1-norm penalized orthogonal forward regression (l1-POFR) algorithm is...

12/17/2020
l1-norm quantile regression screening rule via the dual circumscribed sphere
l1-norm quantile regression is a common choice if there exists outlier o...

10/20/2019
Sparse (group) learning with Lipschitz loss functions: a unified analysis
We study a family of sparse estimators defined as minimizers of some emp...

06/19/2020
Sparse Quantile Regression
We consider both ℓ_0-penalized and ℓ_0-constrained quantile regression...

05/28/2017
L1-norm Error Function Robustness and Outlier Regularization
In many real-world applications, data come with corruptions, large error...

10/23/2016
Inertial Regularization and Selection (IRS): Sequential Regression in High-Dimension and Sparsity
In this paper, we develop a new sequential regression modeling approach ...
