Deterministic tensor completion with hypergraph expanders

10/23/2019
by Kameron Decker Harris, et al.

We provide a novel analysis of low-rank tensor completion based on hypergraph expanders. As a proxy for rank, we minimize the max-quasinorm of the tensor, introduced by Ghadermarzy, Plan, and Yilmaz (2018), which generalizes the max-norm for matrices. Our analysis is deterministic and shows that the number of samples required to recover an order-t tensor with at most n entries per dimension is linear in n, under the assumption that the rank and order of the tensor are O(1). As steps in our proof, we find an improved expander mixing lemma for a t-partite, t-uniform regular hypergraph model and prove several new properties of the tensor max-quasinorm. To the best of our knowledge, this is the first deterministic analysis of tensor completion.
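For context, the matrix max-norm and its tensor generalization can be recalled roughly as follows; this is a sketch from the cited Ghadermarzy, Plan, and Yilmaz (2018) line of work rather than a verbatim statement from the paper, and the exact conventions there may differ. For a matrix X,

$$\|X\|_{\max} \;=\; \min_{X = U V^{\top}} \;\|U\|_{2,\infty}\,\|V\|_{2,\infty},$$

and for an order-$t$ tensor $T$, the max-quasinorm minimizes the same product of factor norms over rank-one decompositions,

$$\|T\|_{\max} \;=\; \min\Big\{\, \prod_{j=1}^{t} \|U^{(j)}\|_{2,\infty} \;:\; T = \sum_{r} u^{(1)}_{r} \circ \cdots \circ u^{(t)}_{r} \,\Big\},$$

where $U^{(j)}$ is the matrix whose $r$-th column is $u^{(j)}_{r}$, $\|\cdot\|_{2,\infty}$ denotes the largest row $\ell_2$ norm, and $\circ$ is the outer product. For $t > 2$ this quantity is a quasinorm rather than a norm, which is why the paper must establish several of its properties from scratch.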
