Landscape analysis of an improved power method for tensor decomposition

10/29/2021
by   Joe Kileel, et al.

In this work, we consider the optimization formulation for symmetric tensor decomposition recently introduced in the Subspace Power Method (SPM) of Kileel and Pereira. Unlike popular alternative functionals for tensor decomposition, the SPM objective function has the desirable properties that its maximal value is known in advance, and its global optima are exactly the rank-1 components of the tensor when the input is sufficiently low-rank. We analyze the non-convex optimization landscape associated with the SPM objective. Our analysis accounts for working with noisy tensors. We derive quantitative bounds such that any second-order critical point with SPM objective value exceeding the bound must equal a tensor component in the noiseless case, and must approximate a tensor component in the noisy case. For decomposing tensors of size D^{×m}, we obtain a near-global guarantee up to rank o(D^{⌊m/2⌋}) under a random tensor model, and a global guarantee up to rank 𝒪(D) assuming deterministic frame conditions. This implies that SPM with suitable initialization is a provable, efficient, robust algorithm for low-rank symmetric tensor decomposition. We conclude with numerics showing that the SPM functional is preferable in practice to a more established counterpart.
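The SPM functional itself is developed in the cited paper, but the classical tensor power iteration that it improves upon can be sketched in a few lines. Below is a minimal, hedged NumPy sketch for the easiest regime, an orthogonally decomposable symmetric order-3 tensor, where the fixed points of the iteration are exactly the rank-1 components; the dimensions, rank, and function names are illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 3  # illustrative dimension and rank, with r <= d

# Orthonormal components a_1, ..., a_r (the "easy" orthogonal regime).
A = np.linalg.qr(rng.standard_normal((d, r)))[0]  # columns are the components

# Symmetric order-3 tensor T = sum_i a_i (x) a_i (x) a_i
T = np.einsum('ik,jk,lk->ijl', A, A, A)

def tensor_power_iteration(T, iters=100, seed=1):
    """Classical tensor power method: x <- T(I, x, x) / ||T(I, x, x)||."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = np.einsum('ijl,j,l->i', T, x, x)  # contract T against x twice
        x = y / np.linalg.norm(y)
    return x

x = tensor_power_iteration(T)
# In the orthogonal case the iterate converges to one of the components,
# so its largest overlap with a column of A approaches 1.
print(np.abs(A.T @ x).max())
```

In this orthogonal setting the iteration converges to the component whose initial overlap with the random start is largest; the paper's analysis concerns the far harder overcomplete and noisy regimes, where no such simple statement holds.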

