Concentration of polynomial random matrices via Efron-Stein inequalities

09/06/2022
by Goutham Rajendran, et al.

Analyzing the concentration of large random matrices is a common task in a wide variety of fields. Given independent random variables, many tools are available to analyze random matrices whose entries are linear in the variables, e.g., the matrix Bernstein inequality. However, in many applications, we need to analyze random matrices whose entries are polynomials in the variables. These arise naturally in the analysis of spectral algorithms, e.g., Hopkins et al. [STOC 2016], Moitra-Wein [STOC 2019], and in lower bounds for semidefinite programs based on the Sum of Squares hierarchy, e.g., Barak et al. [FOCS 2016], Jones et al. [FOCS 2021]. In this work, we present a general framework to obtain such bounds, based on the matrix Efron-Stein inequalities developed by Paulin-Mackey-Tropp [Annals of Probability 2016]. The Efron-Stein inequality bounds the norm of a random matrix by the norm of another, simpler (but still random) matrix, which we view as arising by "differentiating" the starting matrix. By recursively differentiating, our framework reduces the main task to analyzing far simpler matrices. For Rademacher variables, these simpler matrices are in fact deterministic, and hence analyzing them is far easier. For general non-Rademacher variables, the task reduces to scalar concentration, which is much more tractable. Moreover, in the setting of polynomial matrices, our results generalize the work of Paulin-Mackey-Tropp. Using our basic framework, we recover known bounds in the literature for simple "tensor networks" and "dense graph matrices". Using our general framework, we derive bounds for "sparse graph matrices", which were obtained only recently by Jones et al. [FOCS 2021] via a nontrivial application of the trace power method, and which were a core component of their work. We expect our framework to be helpful for other applications involving concentration phenomena for nonlinear random matrices.
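To make the setting concrete, here is a minimal numerical sketch (not from the paper) of a polynomial random matrix: each off-diagonal entry of M = XX^T - nI is a degree-two polynomial sum_k X_ik X_jk in independent Rademacher variables, and its spectral norm concentrates tightly across independent draws. The dimension, trial count, and seed below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200       # matrix dimension (illustrative)
trials = 50   # number of independent draws (illustrative)

norms = []
for _ in range(trials):
    # X has independent Rademacher (+-1) entries.
    X = rng.choice([-1.0, 1.0], size=(n, n))
    # M = X X^T - n I: each off-diagonal entry is the degree-2
    # polynomial sum_k X[i,k] * X[j,k]; the diagonal (which is
    # deterministically n, since X[i,k]^2 = 1) is centered out.
    M = X @ X.T - n * np.eye(n)
    norms.append(np.linalg.norm(M, 2))

# The spectral norm fluctuates very little relative to its mean,
# which is the concentration phenomenon the framework quantifies.
print(f"mean ||M|| = {np.mean(norms):.1f}, std = {np.std(norms):.1f}")
```

The empirical standard deviation is a small fraction of the mean, illustrating the kind of operator-norm concentration that the matrix Efron-Stein approach is designed to prove for such degree-two (and higher-degree) polynomial matrices.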


