Spectral Universality of Regularized Linear Regression with Nearly Deterministic Sensing Matrices

08/04/2022
by Rishabh Dudeja, et al.

It has been observed that the performance of many high-dimensional estimation problems is universal with respect to the underlying sensing (or design) matrix. Specifically, matrices with markedly different constructions seem to achieve identical performance if they share the same spectral distribution and have “generic” singular vectors. We prove this universality phenomenon for the case of convex regularized least squares (RLS) estimators under a linear regression model with additive Gaussian noise. Our main contributions are twofold: (1) we introduce a notion of universality classes for sensing matrices, defined through a set of deterministic conditions that fix the spectrum of the sensing matrix and precisely capture the previously heuristic notion of generic singular vectors; (2) we show that for all sensing matrices in the same universality class, the dynamics of the proximal gradient descent algorithm for solving the regression problem, as well as the performance of the RLS estimators themselves (under additional strong convexity conditions), are asymptotically identical. In addition to including i.i.d. Gaussian and rotationally invariant matrices as special cases, our universality class also contains highly structured, strongly correlated, or even (nearly) deterministic matrices. Examples of the latter include randomly signed versions of incoherent tight frames and randomly subsampled Hadamard transforms. As a consequence of this universality principle, the asymptotic performance of regularized linear regression on many structured matrices constructed with limited randomness can be characterized using the rotationally invariant ensemble as an equivalent yet mathematically more tractable surrogate.
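The setting of the abstract can be made concrete with a small numerical sketch. The code below (a minimal illustration, not the paper's code) builds one member of the universality class described above, a randomly subsampled Hadamard transform with random column signs, and uses it as the sensing matrix for ℓ_1-regularized least squares solved by proximal gradient descent (ISTA). The dimensions, sparsity level, regularization parameter `lam`, and iteration count are illustrative assumptions; the ℓ_1 penalty is just one instance of the convex regularizers the paper covers.

```python
import numpy as np

def hadamard(k):
    # Sylvester construction of a 2^k x 2^k Hadamard matrix.
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
n, N = 256, 512  # number of measurements, signal dimension (N = 2^9)

# Nearly deterministic sensing matrix: randomly subsampled rows of the
# Hadamard transform with random column signs.
H = hadamard(9) / np.sqrt(N)                       # orthonormal rows
rows = rng.choice(N, size=n, replace=False)        # random row subsampling
signs = rng.choice([-1.0, 1.0], size=N)            # random column signs
A = H[rows] * signs                                # n x N sensing matrix

# Linear regression model with additive Gaussian noise.
x0 = rng.normal(size=N) * (rng.random(N) < 0.1)    # sparse ground truth
y = A @ x0 + 0.01 * rng.normal(size=n)

# Proximal gradient descent (ISTA) for the RLS estimator
#   min_x 0.5 * ||y - A x||^2 + lam * ||x||_1,
# whose prox step is soft thresholding.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2             # 1 / spectral norm squared
x = np.zeros(N)
for _ in range(500):
    z = x - step * (A.T @ (A @ x - y))             # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox step

print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))
```

Under the universality result, swapping `A` for, say, an i.i.d. Gaussian matrix with matching spectral distribution should leave the asymptotic trajectory of these iterates, and the performance of the resulting estimator, unchanged.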


research · 09/15/2023
Bayes-Optimal Estimation in Generalized Linear Models via Spatial Coupling
We consider the problem of signal estimation in a generalized linear mod...

research · 02/11/2020
Asymptotic errors for convex penalized linear regression beyond Gaussian matrices
We consider the problem of learning a coefficient vector x_0 in R^N from...

research · 03/22/2019
Implicit Regularization via Hadamard Product Over-Parametrization in High-Dimensional Linear Regression
We consider Hadamard product parametrization as a change-of-variable (ov...

research · 05/13/2019
RLS-Based Detection for Massive Spatial Modulation MIMO
Most detection algorithms in spatial modulation (SM) are formulated as l...

research · 03/30/2020
Regularization in High-Dimensional Regression and Classification via Random Matrix Theory
We study general singular value shrinkage estimators in high-dimensional...

research · 06/18/2023
Dropout Regularization Versus ℓ_2-Penalization in the Linear Model
We investigate the statistical behavior of gradient descent iterates wit...
