Principal Component Analysis with Tensor Train Subspace

03/13/2018
by Wenqi Wang, et al.

Tensor train is a hierarchical tensor network structure that helps alleviate the curse of dimensionality by parameterizing large-scale multidimensional data via a network of low-rank tensors. Associated with such a construction is the notion of a tensor train subspace, and in this paper we propose a TT-PCA algorithm for estimating this structured subspace from the given data. By maintaining the low-rank tensor structure, TT-PCA is more robust to noise compared with PCA or Tucker-PCA. This is borne out numerically by testing the proposed approach on the Extended YaleFace Dataset B.
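
As a rough illustration of the idea, the sketch below estimates a tensor train subspace from vectorized samples using a TT-SVD-style sweep of truncated SVDs, then projects samples back onto that subspace. This is a minimal sketch in NumPy and not the paper's exact algorithm: the function names tt_pca and tt_project, the choice of stacking samples as the trailing tensor mode, and the fixed TT ranks are assumptions made for illustration only.

```python
import numpy as np

def tt_pca(samples, dims, ranks):
    """Estimate left-orthogonal TT cores spanning a tensor train subspace.

    samples : (N, prod(dims)) array, each row a vectorized (C-order) sample
    dims    : mode sizes (n1, ..., nd) of each sample tensor
    ranks   : target TT ranks (r1, ..., rd); rd is the subspace dimension
    """
    # Stack the vectorized samples as the trailing mode of a (d+1)-way tensor,
    # then sweep left to right with truncated SVDs (TT-SVD style).
    unfolding = samples.T                      # shape (prod(dims), N)
    cores, r_prev = [], 1
    for k, n in enumerate(dims):
        unfolding = unfolding.reshape(r_prev * n, -1)
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r = min(ranks[k], U.shape[1])
        cores.append(U[:, :r].reshape(r_prev, n, r))
        unfolding = s[:r, None] * Vt[:r]       # carry the residual factor forward
        r_prev = r
    return cores

def tt_project(samples, cores):
    """Project vectorized samples onto the subspace spanned by the TT cores."""
    # Contract the cores into an explicit basis matrix of shape (prod(dims), rd).
    basis = cores[0].reshape(cores[0].shape[1], -1)
    for core in cores[1:]:
        r_prev, n, r = core.shape
        basis = (basis @ core.reshape(r_prev, n * r)).reshape(-1, r)
    coeffs = samples @ basis                   # subspace coordinates of each sample
    return coeffs @ basis.T                    # reconstruction in the ambient space

# Hypothetical usage: 200 noisy samples of an 8 x 8 x 8 tensor signal.
X = np.random.randn(200, 8 * 8 * 8)
cores = tt_pca(X, dims=(8, 8, 8), ranks=(4, 6, 10))
X_hat = tt_project(X, cores)                   # projection onto the TT subspace
```

Because each core's left unfolding has orthonormal columns, the assembled basis is orthonormal, so projection reduces to two matrix products.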
