Average-Case Complexity of Tensor Decomposition for Low-Degree Polynomials

11/10/2022
by Alexander S. Wein, et al.

Suppose we are given an n-dimensional order-3 symmetric tensor T ∈ (ℝ^n)^{⊗3} that is the sum of r random rank-1 terms. The problem of recovering the rank-1 components is possible in principle when r ≲ n^2, but polynomial-time algorithms are only known in the regime r ≪ n^{3/2}. Similar "statistical-computational gaps" occur in many high-dimensional inference tasks, and in recent years there has been a flurry of work on explaining the apparent computational hardness in these problems by proving lower bounds against restricted (yet powerful) models of computation such as statistical queries (SQ), sum-of-squares (SoS), and low-degree polynomials (LDP). However, no such prior work exists for tensor decomposition, largely because its hardness does not appear to be explained by a "planted versus null" testing problem. We consider a model for random order-3 tensor decomposition where one component is slightly larger in norm than the rest (to break symmetry), and the components are drawn uniformly from the hypercube. We resolve the computational complexity in the LDP model: O(log n)-degree polynomial functions of the tensor entries can accurately estimate the largest component when r ≪ n^{3/2} but fail to do so when r ≫ n^{3/2}. This provides rigorous evidence that the best known algorithms for tensor decomposition cannot be improved, at least by known approaches. A natural extension of the result holds for tensors of any fixed order k ≥ 3, in which case the LDP threshold is r ∼ n^{k/2}.
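To make the model concrete, here is a minimal sketch in Python/NumPy that generates an instance of the random tensor described above. This is not code from the paper: the toy sizes n and r, the boost factor on the first component, and all variable names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the paper) of the random order-3
# tensor model: r rank-1 terms with components drawn uniformly from the
# hypercube {-1, +1}^n, where one term is slightly larger in norm to break
# symmetry. Sizes and the boost factor are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, r = 30, 50          # dimension and rank (toy sizes)
boost = 1.1            # assumed slight norm boost on the first component

# Components a_1, ..., a_r drawn uniformly from {-1, +1}^n.
A = rng.choice([-1.0, 1.0], size=(r, n))

# Symmetric order-3 tensor T = boost * a_1^{⊗3} + sum_{i>=2} a_i^{⊗3}.
scales = np.ones(r)
scales[0] = boost
T = np.einsum("i,ij,ik,il->jkl", scales, A, A, A)

print(T.shape)         # (n, n, n)
```

This only generates an instance; the paper's question is whether degree-O(log n) polynomial functions of the entries of T can accurately estimate the boosted component a_1, which the result answers affirmatively for r ≪ n^{3/2} and negatively for r ≫ n^{3/2}.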
