Nonlinear Random Matrices and Applications to the Sum of Squares Hierarchy

02/09/2023
by Goutham Rajendran, et al.

We develop new tools in the theory of nonlinear random matrices and apply them to study the performance of the Sum of Squares (SoS) hierarchy on average-case problems. The SoS hierarchy is a powerful optimization technique that has achieved tremendous success on problems in combinatorial optimization, robust statistics, and machine learning. It is a family of convex relaxations that lets us smoothly trade off running time for approximation guarantees. Recent works have shown it to be extremely useful for recovering structure in high-dimensional noisy data, and it remains our best approach toward refuting the notorious Unique Games Conjecture.

In this work, we analyze the performance of the SoS hierarchy on fundamental problems from statistics, theoretical computer science, and statistical physics. In particular, we show subexponential-time SoS lower bounds for the Sherrington-Kirkpatrick Hamiltonian, Planted Slightly Denser Subgraph, Tensor Principal Components Analysis, and Sparse Principal Components Analysis problems. Proving these lower bounds requires analyzing large random matrices, and this is where our main contributions lie. These results offer strong evidence for, and insight into, the low-degree likelihood ratio hypothesis, an important conjecture that predicts the power of bounded-time algorithms for hypothesis testing.

We also develop general-purpose tools for analyzing the behavior of random matrices that are functions of independent random variables. Toward this, we build on and generalize the matrix variant of the Efron-Stein inequalities. In particular, our general theorem on matrix concentration recovers various results that have appeared in the literature. We expect these random matrix theory ideas to find further significant applications.
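
As background for the Efron-Stein-based matrix concentration tools mentioned above, the display below is a minimal sketch of the classical scalar Efron-Stein inequality, which the matrix variant generalizes; it bounds the variance of a function of independent random variables by the aggregate effect of resampling one coordinate at a time. The precise matrix statement developed in the paper, which replaces the variance by a matrix-valued variance proxy, is not reproduced here.

% Scalar Efron-Stein inequality (standard form, shown for illustration only).
% X = (X_1, ..., X_n) has independent coordinates, X'_i is an independent
% copy of X_i, and X^{(i)} denotes X with its i-th coordinate resampled.
\[
  \operatorname{Var}\bigl[f(X)\bigr]
  \;\le\;
  \frac{1}{2} \sum_{i=1}^{n}
  \mathbb{E}\Bigl[\bigl(f(X) - f(X^{(i)})\bigr)^{2}\Bigr],
  \qquad
  X^{(i)} = (X_1, \dots, X_{i-1}, X'_i, X_{i+1}, \dots, X_n).
\]

Heuristically, the matrix analogue plays the same role for spectral-norm concentration of matrix-valued functions f(X), the regime relevant to the random moment matrices that arise in SoS lower bounds.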
