Approximate separability of symmetrically penalized least squares in high dimensions: characterization and consequences

06/25/2019
by Michael Celentano, et al.

We show that the high-dimensional behavior of symmetrically penalized least squares with a possibly non-separable, symmetric, convex penalty in both (i) the Gaussian sequence model and (ii) the linear model with uncorrelated Gaussian designs nearly matches the behavior of least squares with an appropriately chosen separable penalty in these same models. The similarity in behavior is precisely quantified by a finite-sample concentration inequality in both cases. Our results help clarify the role non-separability can play in high-dimensional M-estimation. In particular, if the empirical distribution of the coordinates of the parameter is known (exactly or approximately), there are at most limited advantages to using non-separable, symmetric penalties over separable ones. In contrast, if the empirical distribution of the coordinates of the parameter is unknown, we argue that non-separable, symmetric penalties automatically implement an adaptive procedure, which we characterize. We also provide a partial converse that characterizes the adaptive procedures that can be implemented in this way.
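For concreteness, the sketch below (not code from the paper) contrasts a separable penalty, the L1 norm whose proximal operator is coordinate-wise soft-thresholding, with one standard non-separable, symmetric, convex penalty, the sorted-L1 (SLOPE) norm that also appears in the related work listed further down, in the Gaussian sequence model. The signal, noise level, and penalty weights are illustrative assumptions.

```python
# Minimal illustrative sketch (not code from the paper). It contrasts a separable
# penalty (the L1 norm, whose prox is coordinate-wise soft-thresholding) with a
# non-separable, symmetric, convex penalty (the sorted-L1 / SLOPE norm) in the
# Gaussian sequence model y = theta + z, z ~ N(0, I). The signal, noise level, and
# penalty weights below are illustrative assumptions, not choices from the paper.
import numpy as np


def soft_threshold(y, lam):
    """Prox of the separable penalty lam * sum_i |x_i| (lasso in sequence form)."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)


def _project_nonincreasing(w):
    """Project w onto {x : x_1 >= x_2 >= ...} via pool-adjacent-violators."""
    means, counts = [], []
    for v in w:
        means.append(float(v))
        counts.append(1)
        while len(means) > 1 and means[-2] < means[-1]:  # merge violating blocks
            m2, c2 = means.pop(), counts.pop()
            m1, c1 = means.pop(), counts.pop()
            means.append((c1 * m1 + c2 * m2) / (c1 + c2))
            counts.append(c1 + c2)
    out, i = np.empty(len(w)), 0
    for m, c in zip(means, counts):
        out[i:i + c] = m
        i += c
    return out


def slope_prox(y, lams):
    """Prox of the non-separable sorted-L1 penalty sum_i lams_i * |y|_(i),
    with lams nonincreasing (sort, shift, PAV projection, clip, unsort)."""
    sign = np.sign(y)
    order = np.argsort(-np.abs(y))
    x_sorted = np.maximum(_project_nonincreasing(np.abs(y)[order] - lams), 0.0)
    x = np.empty_like(y, dtype=float)
    x[order] = x_sorted
    return sign * x


# Gaussian sequence model with a sparse signal (illustrative).
rng = np.random.default_rng(0)
n = 2000
theta = np.zeros(n)
theta[:100] = 3.0
y = theta + rng.standard_normal(n)

lam = 2.0                                              # illustrative tuning
lams = lam * np.sqrt(np.log(n / np.arange(1, n + 1)))  # nonincreasing SLOPE weights
print("separable (soft-threshold) MSE:", np.mean((soft_threshold(y, lam) - theta) ** 2))
print("non-separable (SLOPE)      MSE:", np.mean((slope_prox(y, lams) - theta) ** 2))
```

The SLOPE prox uses the usual sort / pool-adjacent-violators / unsort construction; any other non-separable, symmetric, convex penalty with a computable prox could be substituted in the same comparison.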


research
07/17/2019

Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing

SLOPE is a relatively new convex optimization procedure for high-dimensi...
research
08/27/2021

Statistical Inference for Linear Mediation Models with High-dimensional Mediators and Application to Studying Stock Reaction to COVID-19 Pandemic

Mediation analysis draws increasing attention in many scientific areas s...
research
05/11/2016

Tuning parameter selection in high dimensional penalized likelihood

Determining how to appropriately select the tuning parameter is essentia...
research
06/10/2011

A Linear Time Natural Evolution Strategy for Non-Separable Functions

We present a novel Natural Evolution Strategy (NES) variant, the Rank-On...
research
06/25/2008

High-dimensional additive modeling

We propose a new sparsity-smoothness penalty for high-dimensional genera...
research
03/25/2019

Fundamental Barriers to High-Dimensional Regression with Convex Penalties

In high-dimensional regression, we attempt to estimate a parameter vecto...
research
06/15/2018

A 5-Dimensional Tonnetz for Nearly Symmetric Hexachords

The standard 2-dimensional Tonnetz describes parsimonious voice-leading ...
