Testing Positive Semi-Definiteness via Random Submatrices

05/13/2020
by   Ainesh Bakshi, et al.

We study the problem of testing whether a matrix A ∈ ℝ^{n×n} with bounded entries (‖A‖_∞ ≤ 1) is positive semi-definite (PSD), or ϵ-far in ℓ_2^2-distance from the PSD cone, i.e., min_{B ⪰ 0} ‖A − B‖_F^2 = ∑_{i : λ_i(A) < 0} λ_i^2(A) > ϵn^2. Our main algorithmic contribution is a non-adaptive tester which distinguishes between these cases using only Õ(1/ϵ^4) queries to the entries of A. For the related "ℓ_∞-gap problem", where A is either PSD or has an eigenvalue satisfying λ_i(A) < −ϵn, our algorithm requires only Õ(1/ϵ^2) queries, which is optimal up to log(1/ϵ) factors. Our testers randomly sample a collection of principal submatrices and check whether these submatrices are PSD. Consequently, our algorithms achieve one-sided error: whenever they output that A is not PSD, they return a certificate that A has negative eigenvalues. We complement our upper bound for PSD testing with ℓ_2^2-gap by giving an Ω̃(1/ϵ^2) lower bound for any non-adaptive algorithm. Our lower bound construction is general, and can be used to derive lower bounds for a number of spectral testing problems. As an example of the applicability of our construction, we obtain a new Ω̃(1/ϵ^4) sampling lower bound for testing the Schatten-1 norm with an ϵ^{1.5} gap, extending a result of Balcan, Li, Woodruff, and Zhang [SODA'19]. In addition, our hard instance yields new sampling lower bounds for estimating the Ky Fan norm and the cost of rank-k approximations, i.e., ‖A − A_k‖_F^2 = ∑_{i > k} σ_i^2(A).
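The sampling-and-checking strategy described above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the submatrix size and trial count below are hypothetical placeholders (the paper's analysis dictates the actual Õ(1/ϵ^2) and Õ(1/ϵ^4) query budgets). One-sided error follows from eigenvalue interlacing: every principal submatrix of a PSD matrix is itself PSD, so a sampled submatrix with a negative eigenvalue is a certificate of non-PSD-ness.

```python
import numpy as np

def psd_test_sketch(A, eps, trials=50, seed=None):
    """Sketch of a random-submatrix PSD tester.

    Samples random principal submatrices of A and checks each for a
    negative eigenvalue. Returns (True, None) if every sampled
    submatrix is PSD, or (False, indices) where A[indices, indices]
    is a certificate submatrix with a negative eigenvalue.
    """
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    # Illustrative submatrix size; the real choice comes from the analysis.
    k = min(n, max(2, int(np.ceil(1.0 / eps))))
    for _ in range(trials):
        S = rng.choice(n, size=k, replace=False)
        sub = A[np.ix_(S, S)]
        # Symmetrize before the eigensolve to guard against
        # floating-point asymmetry.
        lam_min = np.linalg.eigvalsh((sub + sub.T) / 2).min()
        if lam_min < 0:
            return False, S  # certificate of a negative eigenvalue
    return True, None
```

By interlacing, the tester never rejects a PSD input; soundness against matrices far from the PSD cone is exactly what the paper's query-complexity analysis establishes.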
