Landscape analysis of an improved power method for tensor decomposition

by Joe Kileel et al.

In this work, we consider the optimization formulation for symmetric tensor decomposition recently introduced in the Subspace Power Method (SPM) of Kileel and Pereira. Unlike popular alternative functionals for tensor decomposition, the SPM objective function has the desirable properties that its maximal value is known in advance, and its global optima are exactly the rank-1 components of the tensor when the input is sufficiently low-rank. We analyze the non-convex optimization landscape associated with the SPM objective, and our analysis accounts for working with noisy tensors. We derive quantitative bounds such that any second-order critical point with SPM objective value exceeding the bound must equal a tensor component in the noiseless case, and must approximate a tensor component in the noisy case. For decomposing tensors of size D^{×m}, we obtain a near-global guarantee up to rank o(D^{⌊m/2⌋}) under a random tensor model, and a global guarantee up to rank 𝒪(D) assuming deterministic frame conditions. This implies that SPM with suitable initialization is a provable, efficient, robust algorithm for low-rank symmetric tensor decomposition. We conclude with numerics showing a practical preference for the SPM functional over a more established counterpart.
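As background for readers unfamiliar with power-method approaches to tensor decomposition, the sketch below illustrates the classical symmetric tensor power iteration that SPM builds upon: for a symmetric order-3 tensor T, iterate x ← T(I, x, x) / ‖T(I, x, x)‖, whose fixed points include the rank-1 components. This is a minimal NumPy illustration of the general idea, not the SPM algorithm from the paper; all variable names and the rank-1 test tensor are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (not from the paper): a symmetric rank-1 order-3 tensor
# T = a ⊗ a ⊗ a with a unit vector a in R^D.
rng = np.random.default_rng(0)
D = 5
a = rng.standard_normal(D)
a /= np.linalg.norm(a)
T = np.einsum('i,j,k->ijk', a, a, a)

# Classical tensor power iteration: x <- T(I, x, x) / ||T(I, x, x)||.
# For this rank-1 T, T(I, x, x) = (a·x)^2 a, so x converges to the
# component a (up to floating point) in a single step.
x = rng.standard_normal(D)
x /= np.linalg.norm(x)
for _ in range(25):
    y = np.einsum('ijk,j,k->i', T, x, x)  # the contraction T(I, x, x)
    x = y / np.linalg.norm(y)

err = min(np.linalg.norm(x - a), np.linalg.norm(x + a))
```

For higher-rank inputs, plain power iteration on T can converge to spurious "robust eigenvectors" that are not components; the SPM objective analyzed in the paper is designed so that high-value second-order critical points are (approximate) components.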




