Faster Johnson-Lindenstrauss Transforms via Kronecker Products

09/11/2019 · by Ruhui Jin, et al.

The Kronecker product is an important matrix operation with a wide range of applications in supporting fast linear transforms, including signal processing, graph theory, quantum computing and deep learning. In this work, we introduce a generalization of the fast Johnson-Lindenstrauss projection for embedding vectors with Kronecker product structure, the Kronecker fast Johnson-Lindenstrauss transform (KFJLT). By avoiding explicit formation of the full Kronecker product, the KFJLT reduces the embedding cost by an exponential factor relative to the standard fast Johnson-Lindenstrauss transform (FJLT) when applied to vectors with Kronecker structure. We prove that this computational gain comes with only a small price in embedding power: given N = ∏_{k=1}^d n_k, consider a finite set of p points in a tensor product of d constituent Euclidean spaces ⊗_{k=d}^{1} R^{n_k} ⊂ R^N. With high probability, a random KFJLT matrix of dimension m × N embeds the set of points up to multiplicative distortion (1 ± ε) provided m ≳ ε^{-2} · log^{2d-1}(p) · log N. We conclude by describing a direct application of the KFJLT to the efficient solution of large-scale Kronecker-structured least squares problems for fitting the CP tensor decomposition.
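To illustrate the idea, here is a minimal NumPy sketch of one common KFJLT construction: apply an independent randomized DFT (random sign flip followed by a unitary FFT) to each Kronecker factor, then sample m coordinates of the transformed Kronecker product without ever forming the full N-dimensional vector. The function name `kfjlt_embed` and the uniform sampling with replacement are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def kfjlt_embed(factors, m, rng):
    """Sketch of a Kronecker fast JL transform for x = x_1 ⊗ ... ⊗ x_d.

    Each factor gets its own random sign diagonal and unitary DFT; a
    coordinate of the transformed Kronecker product is then just the
    product of one coordinate from each transformed factor, so sampling
    m coordinates costs O(∑ n_k log n_k + m d) rather than O(N log N).
    (Illustrative sketch only; details differ from the paper's analysis.)
    """
    transformed = []
    for x in factors:
        n = x.size
        signs = rng.choice([-1.0, 1.0], size=n)   # random diagonal D_k
        y = np.fft.fft(signs * x) / np.sqrt(n)    # unitary DFT, O(n log n)
        transformed.append(y)
    sizes = [y.size for y in transformed]
    N = int(np.prod(sizes))
    idx = rng.integers(0, N, size=m)              # sample m coords w/ replacement
    multi = np.unravel_index(idx, sizes)          # multi-index into each factor
    out = np.ones(m, dtype=complex)
    for y, ii in zip(transformed, multi):
        out *= y[ii]                              # Kronecker coord = product
    return np.sqrt(N / m) * out                   # rescale so norms match in expectation

# usage: embed x = x1 ⊗ x2 of dimension 64 * 32 = 2048 into m = 256
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(64), rng.standard_normal(32)
emb = kfjlt_embed([x1, x2], m=256, rng=rng)
full_norm = np.linalg.norm(np.kron(x1, x2))
print(np.linalg.norm(emb), full_norm)  # the two norms should be close
```

Note the key saving: the FFTs and the coordinate products touch only the small factors, so the full 2048-dimensional Kronecker vector is never materialized (np.kron above is used only to verify the norm).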


