Pattern Recovery in Penalized and Thresholded Estimation and its Geometry

07/19/2023
by   Piotr Graczyk, et al.

We consider the framework of penalized estimation where the penalty term is given by a real-valued polyhedral gauge, which encompasses methods such as the LASSO (and many variants thereof, such as the generalized LASSO), SLOPE, OSCAR, PACS, and others. Each of these estimators can uncover a different structure or “pattern” of the unknown parameter vector. We define a general notion of patterns based on subdifferentials and formalize an approach to measure their complexity. For pattern recovery, we provide a minimal condition for a particular pattern to be detected by the procedure with positive probability, the so-called accessibility condition. Using our approach, we also introduce the stronger noiseless recovery condition. For the LASSO, it is well known that the irrepresentability condition is necessary for pattern recovery with probability larger than 1/2; we show that the noiseless recovery condition plays exactly the same role, thereby extending and unifying the irrepresentability condition of the LASSO to a broad class of penalized estimators. We then show that the noiseless recovery condition can be relaxed for thresholded penalized estimators, extending the idea of the thresholded LASSO: we prove that the accessibility condition is already sufficient (and necessary) for sure pattern recovery by thresholded penalized estimation, provided that the signal of the pattern is large enough. Throughout the article, we demonstrate how our findings can be interpreted through a geometric lens.
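For orientation, here is a minimal sketch of the setting, assuming the standard penalized least-squares formulation (the design matrix X, response y, and tuning parameter λ below are conventional notation not spelled out in the abstract):

\[
\hat{\beta} \in \underset{b \in \mathbb{R}^p}{\arg\min}\; \tfrac{1}{2}\,\|y - Xb\|_2^2 + \lambda\,\mathrm{pen}(b),
\]

where \(\mathrm{pen}\) is a real-valued polyhedral gauge. In the special case of the LASSO, \(\mathrm{pen}(b) = \|b\|_1\) and the pattern of a vector is commonly taken to be its sign vector \(\mathrm{sgn}(b)\), so pattern recovery means \(\mathrm{sgn}(\hat{\beta}) = \mathrm{sgn}(\beta)\); a thresholded variant sets to zero every entry of \(\hat{\beta}\) whose magnitude falls below a threshold before reading off the pattern.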
