Oracle Inequalities for High-dimensional Prediction

08/01/2016
by Johannes Lederer, et al.

The abundance of high-dimensional data in the modern sciences has generated tremendous interest in penalized estimators such as the lasso, scaled lasso, square-root lasso, elastic net, and many others. However, the common theoretical bounds for the predictive performance of these estimators hinge on strong assumptions on the design that are unverifiable in practice. In this paper, we introduce a new set of oracle inequalities for prediction in high-dimensional linear regression. These bounds hold irrespective of the design matrix. Moreover, since the proofs rely only on convexity and continuity arguments, the bounds apply to a wide range of penalized estimators. Overall, the bounds demonstrate that generic estimators can provide consistent prediction with any design matrix. From a practical point of view, the bounds can help identify the potential of specific estimators and gauge the prediction accuracy to expect in a given application.
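As a concrete illustration of the quantity that such prediction bounds control, the following minimal sketch (not from the paper; the synthetic Gaussian design, sparsity level, and tuning-parameter heuristic are assumptions chosen purely for illustration) fits a lasso on a high-dimensional design with more predictors than observations and reports the in-sample prediction error ||X(beta_hat - beta*)||_2^2 / n.

```python
# Minimal sketch (illustrative only, not the paper's analysis): empirical
# in-sample prediction error of the lasso on a synthetic high-dimensional
# design, i.e. the quantity ||X(beta_hat - beta*)||_2^2 / n that prediction
# oracle inequalities bound.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s, sigma = 100, 500, 5, 1.0       # n << p, s-sparse truth (assumed values)
X = rng.standard_normal((n, p))         # design matrix; correlations could be added
beta_star = np.zeros(p)
beta_star[:s] = 1.0
y = X @ beta_star + sigma * rng.standard_normal(n)

# Tuning parameter of the order sigma * sqrt(log(p) / n), a common heuristic.
lam = sigma * np.sqrt(np.log(p) / n)
fit = Lasso(alpha=lam, fit_intercept=False, max_iter=50_000).fit(X, y)

pred_err = np.mean((X @ (fit.coef_ - beta_star)) ** 2)
print(f"in-sample prediction error: {pred_err:.4f}")
```

Repeating this experiment with highly correlated columns of X is a quick way to probe, empirically, the paper's point that prediction (unlike estimation or support recovery) does not require design conditions.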

Related research

08/03/2020
On Dantzig and Lasso estimators of the drift in a high dimensional Ornstein-Uhlenbeck model
In this paper we present new theoretical results for the Dantzig and Las...

03/18/2014
On the Sensitivity of the Lasso to the Number of Predictor Variables
The Lasso is a computationally efficient regression regularization proce...

04/12/2022
High-dimensional nonconvex lasso-type M-estimators
This paper proposes a theory for ℓ_1-norm penalized high-dimensional M-e...

06/16/2021
Pre-processing with Orthogonal Decompositions for High-dimensional Explanatory Variables
Strong correlations between explanatory variables are problematic for hi...

03/28/2018
Greedy Variance Estimation for the LASSO
Recent results have proven the minimax optimality of LASSO and related a...

06/20/2016
On the prediction loss of the lasso in the partially labeled setting
In this paper we revisit the risk bounds of the lasso estimator in the c...

07/19/2018
Sparse space-time models: Concentration Inequalities and Lasso
Inspired by Kalikow-type decompositions, we introduce a new stochastic m...