Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression

02/23/2014

by Divyanshu Vats, et al.

In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free one. More specifically, we prove that, as the problem size grows (in both the number of variables and the number of observations), PaTh performs accurate sparse regression under appropriate conditions without requiring a tuning parameter to be specified. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.
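To make the idea concrete, here is a minimal illustrative sketch, not the authors' exact procedure: a tuning-parameter-dependent algorithm (greedy forward selection, parameterized by the sparsity level) is swept along its path, and a simple residual-based stopping rule picks the first model whose mean squared residual drops to the noise floor. The threshold `tau` and the assumption of known noise level `sigma` are simplifications for illustration only.

```python
import numpy as np

def omp_path(X, y, k_max):
    """Greedy forward selection (OMP-style): for each sparsity level
    k = 1..k_max, pick the column most correlated with the current
    residual, refit least squares on the support, and record the result."""
    support, path = [], []
    r = y.copy()
    for _ in range(k_max):
        corr = np.abs(X.T @ r)
        corr[support] = -np.inf  # never re-select a chosen column
        support.append(int(np.argmax(corr)))
        beta, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        r = y - X[:, support] @ beta
        path.append((list(support), r.copy()))
    return path

def path_threshold(path, sigma, p, n, c=2.0):
    """Stand-in for a path-thresholding stopping rule: return the first
    (sparsest) model whose mean squared residual falls below a
    noise-level threshold; tau is a hypothetical choice."""
    tau = sigma**2 * (1.0 + c * np.sqrt(np.log(p) / n))
    for support, r in path:
        if np.mean(r**2) <= tau:
            return support
    return path[-1][0]

# Synthetic sparse regression problem: 3 active variables out of 20.
rng = np.random.default_rng(0)
n, p, sigma = 100, 20, 0.1
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[0, 1, 2]] = [3.0, -2.0, 1.5]
y = X @ beta_true + sigma * rng.standard_normal(n)

path = omp_path(X, y, k_max=10)
support = path_threshold(path, sigma, p, n)
print(sorted(support))
```

The point of the sketch is that the underlying algorithm is run across its entire tuning-parameter path, and model selection reduces to a single thresholding pass over that path rather than a search over tuning parameters.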


