A Riemannian Newton Optimization Framework for the Symmetric Tensor Rank Approximation Problem

03/03/2020
by Rima Khouja, et al.

The symmetric tensor rank approximation problem (STA) consists of computing the best low-rank approximation of a symmetric tensor. We describe a Riemannian Newton iteration with a trust-region scheme for the STA problem. We formulate it as a Riemannian optimization problem by parameterizing the constraint set as the Cartesian product of Veronese manifolds. By exploiting the properties of the apolar product, we give explicit and exact formulas for the gradient vector and the Hessian matrix used by the method, in terms of the weights and points of the low-rank approximation and of the symmetric tensor to be approximated. We also introduce a retraction operator on the Veronese manifold. The Riemannian Newton iterations can be carried out for the best low-rank approximation over the real or the complex numbers. Numerical experiments illustrate the behavior of the new method: its robustness against perturbation, its use to compute the best rank-1 approximation and the spectral norm of a symmetric tensor, and its performance compared with existing state-of-the-art methods.
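The abstract mentions two Riemannian ingredients, tangent-space projection and a retraction. As a minimal, hedged sketch of those ideas (not the paper's Newton trust-region iteration on a product of Veronese manifolds), the code below runs a first-order Riemannian gradient ascent with Armijo backtracking on the unit sphere to compute a rank-1 approximation lambda * x^{(x)d} of a real symmetric tensor; |lambda| then lower-bounds the spectral norm. All names (best_rank1, contract_all_but_one) and the random test data are illustrative assumptions.

```python
import numpy as np

def contract_all_but_one(T, x):
    """Return the vector T(x, ..., x, .): contract all modes but one with x.
    T is assumed symmetric, so which modes are contracted does not matter."""
    g = T
    for _ in range(T.ndim - 1):
        g = np.tensordot(g, x, axes=(0, 0))
    return g

def best_rank1(T, iters=500, seed=0):
    """Riemannian gradient ascent on the unit sphere maximizing |<T, x^{(x)d}>|.
    Returns (lam, x) with lam = <T, x^{(x)d}>, so lam * x^{(x)d} is a rank-1
    approximation of T. This is a simplified first-order stand-in, not the
    paper's Newton trust-region method."""
    rng = np.random.default_rng(seed)
    d, n = T.ndim, T.shape[0]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        g_vec = contract_all_but_one(T, x)
        f = x @ g_vec                          # f(x) = <T, x^{(x)d}>
        s = 1.0 if f >= 0 else -1.0            # maximize |f|
        egrad = s * d * g_vec                  # Euclidean gradient of s*f
        rgrad = egrad - (x @ egrad) * x        # project onto tangent space of sphere
        if np.linalg.norm(rgrad) < 1e-12:
            break
        t = 1.0
        while True:                            # Armijo backtracking on retracted iterate
            x_new = x + t * rgrad
            x_new /= np.linalg.norm(x_new)     # retraction back onto the sphere
            f_new = x_new @ contract_all_but_one(T, x_new)
            if s * f_new >= s * f + 1e-4 * t * np.dot(rgrad, rgrad) or t < 1e-12:
                break
            t *= 0.5
        x = x_new
    lam = x @ contract_all_but_one(T, x)
    return lam, x

# Usage: an order-3 symmetric tensor built from 3 random rank-1 terms.
n, r = 5, 3
rng = np.random.default_rng(1)
T = sum(w * np.einsum('i,j,k->ijk', v, v, v)
        for w, v in zip(rng.standard_normal(r), rng.standard_normal((r, n))))
lam, x = best_rank1(T)
print("rank-1 weight (|lam| lower-bounds the spectral norm):", lam)
```

In the full framework described by the abstract, the rank-r case replaces the sphere by a Cartesian product of Veronese manifolds and the first-order step by a Newton step with a trust-region safeguard.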


