Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing

07/17/2019
by   Zhiqi Bu, et al.

SLOPE is a relatively new convex optimization procedure for high-dimensional linear regression via the sorted ℓ1 penalty: coefficients of larger magnitude receive larger penalties. This non-separable penalty renders many existing techniques invalid or inconclusive for analyzing the SLOPE solution. In this paper, we develop an asymptotically exact characterization of the SLOPE solution under Gaussian random designs by solving the SLOPE problem with approximate message passing (AMP). This algorithmic approach allows us to approximate the SLOPE solution via the much more amenable AMP iterates. Explicitly, we characterize the asymptotic dynamics of the AMP iterates by relying on a recently developed state evolution analysis for non-separable penalties, thereby overcoming the difficulty caused by the sorted ℓ1 penalty. Moreover, we prove that the AMP iterates converge to the SLOPE solution in an asymptotic sense, and numerical simulations show that the convergence is surprisingly fast. Our proof rests on a novel technique that specifically leverages the structure of the SLOPE problem. In contrast to prior literature, our work not only yields an asymptotically sharp analysis but also offers an algorithmic, flexible, and constructive approach to understanding the SLOPE problem.
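To make the ideas in the abstract concrete, the sketch below shows the two ingredients of an AMP iteration for SLOPE: the proximal operator of the sorted-ℓ1 penalty (computed by a standard pool-adjacent-violators scheme) and the AMP residual update with its Onsager correction, whose divergence term for the SLOPE prox equals the number of distinct nonzero magnitudes in the output. This is an illustrative sketch only: the names `prox_sorted_l1` and `amp_slope` are hypothetical, and the paper calibrates the AMP thresholds adaptively via state evolution, whereas here a fixed user-supplied penalty sequence stands in for simplicity.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of the sorted-l1 penalty J(b) = sum_i lam_i * |b|_(i),
    where |b|_(1) >= |b|_(2) >= ... and lam is nonincreasing, nonnegative.
    Uses the stack-based pool-adjacent-violators scheme."""
    sign = np.sign(v)
    a = np.abs(v)
    order = np.argsort(a)[::-1]          # sort |v| in decreasing order
    diff = a[order] - lam                # shift by the sorted penalties
    blocks = []                          # each block: [running sum, count]
    for d in diff:
        blocks.append([d, 1.0])
        # Pool while block averages violate the nonincreasing constraint.
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] <= blocks[-1][0] * blocks[-2][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    sorted_out = np.concatenate([[max(s / c, 0.0)] * int(c) for s, c in blocks])
    out = np.empty_like(sorted_out)
    out[order] = sorted_out              # undo the sort
    return sign * out

def amp_slope(X, y, lam, n_iter=20):
    """Illustrative AMP iteration for SLOPE (fixed penalty sequence,
    not the state-evolution-calibrated thresholds used in the paper)."""
    n, p = X.shape
    beta = np.zeros(p)
    z = y.copy()
    for _ in range(n_iter):
        beta = prox_sorted_l1(beta + X.T @ z, lam)
        # Onsager correction: the divergence of the sorted-l1 prox is the
        # number of distinct nonzero magnitudes among the output entries.
        div = len(np.unique(np.round(np.abs(beta[beta != 0]), 8)))
        z = y - X @ beta + (z / n) * div
    return beta
```

The averaging step in the prox is what handles the non-separability: entries whose shifted magnitudes would break the nonincreasing order are pooled into a common value, which is why the divergence counts distinct nonzero magnitudes rather than nonzero entries.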


Related research

07/02/2021 · Asymptotic Statistical Analysis of Sparse Group LASSO via Approximate Message Passing Algorithm
Sparse Group LASSO (SGL) is a regularized model for high-dimensional lin...

06/25/2019 · Approximate separability of symmetrically penalized least squares in high dimensions: characterization and consequences
We show that the high-dimensional behavior of symmetrically penalized le...

09/17/2018 · Approximate message-passing for convex optimization with non-separable penalties
We introduce an iterative optimization scheme for convex objectives cons...

02/14/2021 · Efficient Designs of SLOPE Penalty Sequences in Finite Dimension
In linear regression, SLOPE is a new convex analysis method that general...

03/01/2022 · On Orthogonal Approximate Message Passing
Approximate Message Passing (AMP) is an efficient iterative parameter-es...

05/27/2021 · Characterizing the SLOPE Trade-off: A Variational Perspective and the Donoho-Tanner Limit
Sorted l1 regularization has been incorporated into many methods for sol...

05/30/2018 · RLS Recovery with Asymmetric Penalty: Fundamental Limits and Algorithmic Approaches
This paper studies regularized least square recovery of signals whose sa...
