Smoothed Analysis of Discrete Tensor Decomposition and Assemblies of Neurons

by Nima Anari, et al.

We analyze linear independence of rank-one tensors produced by tensor powers of randomly perturbed vectors. This enables efficient decomposition of sums of high-order tensors. Our analysis builds upon [BCMV14] but allows for a wider range of perturbation models, including discrete ones. We give an application to recovering assemblies of neurons. Assemblies are large sets of neurons representing specific memories or concepts. The size of the intersection of two assemblies has been shown in experiments to represent the extent to which these memories co-occur or these concepts are related; the phenomenon is called association of assemblies. This suggests that an animal's memory is a complex web of associations, and poses the problem of recovering this representation from cognitive data. Motivated by this problem, we study the following more general question: Can we reconstruct the Venn diagram of a family of sets, given the sizes of their ℓ-wise intersections? We show that as long as the family of sets is randomly perturbed, it is enough for the number of measurements to be polynomially larger than the number of nonempty regions of the Venn diagram to fully reconstruct the diagram.
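To make the connection between intersection sizes and tensor decomposition concrete, here is a minimal sketch (not code from the paper, and the function names are illustrative): the ℓ-wise intersection sizes of sets S_1, …, S_n form a symmetric order-ℓ tensor, and that tensor is exactly a sum of ℓ-th tensor powers of 0/1 indicator vectors, one rank-one term per ground-set element. Recovering the Venn diagram then amounts to decomposing this sum of rank-one tensors.

```python
# Sketch of the measurement model: the l-wise intersection-size tensor
# of a set family equals a sum of rank-one tensor powers of indicator
# vectors (one per element of the universe). Illustrative only.
import itertools
import numpy as np

def intersection_tensor(sets, universe, l):
    """T[i1,...,il] = |S_{i1} ∩ ... ∩ S_{il}|, computed directly."""
    n = len(sets)
    T = np.zeros((n,) * l)
    for idx in itertools.product(range(n), repeat=l):
        common = set(universe)
        for i in idx:
            common &= sets[i]
        T[idx] = len(common)
    return T

def rank_one_sum(sets, universe, l):
    """The same tensor written as a sum of l-th tensor powers x_v^{⊗l},
    where x_v[i] = 1 iff element v lies in S_i."""
    n = len(sets)
    T = np.zeros((n,) * l)
    for v in universe:
        x = np.array([1.0 if v in S else 0.0 for S in sets])
        power = x
        for _ in range(l - 1):
            power = np.multiply.outer(power, x)
        T += power
    return T

sets = [{0, 1, 2}, {1, 2, 3}, {2, 3, 4}]
universe = range(5)
A = intersection_tensor(sets, universe, 3)
B = rank_one_sum(sets, universe, 3)
assert np.allclose(A, B)
print(int(A[0, 1, 2]))  # |S_0 ∩ S_1 ∩ S_2| = |{2}| = 1
```

The paper's smoothed-analysis result says that when the indicator vectors are randomly perturbed, these rank-one terms are linearly independent with high probability, which is what makes the decomposition (and hence the Venn-diagram reconstruction) well-posed.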
