Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2 Applications and Future Perspectives

08/30/2017
by A. Cichocki, et al.

Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their applications in machine learning and data analytics. Particular emphasis is placed on the tensor train (TT) and Hierarchical Tucker (HT) decompositions, and on their physically meaningful interpretations, which reflect the scalability of the tensor network approach. Through a graphical approach, we also elucidate how, by virtue of the underlying low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks can perform distributed computations on otherwise prohibitively large volumes of data/parameters, thereby alleviating or even eliminating the curse of dimensionality. The usefulness of this concept is illustrated across a number of applied areas, including generalized regression and classification (support tensor machines, canonical correlation analysis, higher-order partial least squares), generalized eigenvalue decomposition, Riemannian optimization, and the optimization of deep neural networks. Part 1 and Part 2 of this work can be read either as stand-alone texts or jointly as a comprehensive review of the exciting field of low-rank tensor networks and tensor decompositions.
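
To make the TT format mentioned in the abstract concrete, below is a minimal NumPy sketch (not taken from the monograph) of the standard TT-SVD idea: a dense tensor is decomposed into a chain of third-order cores by sequential truncated SVDs, and contracting those cores reproduces an approximation of the full tensor. The function names tt_svd and tt_reconstruct and the single max_rank truncation parameter are illustrative assumptions, not part of the original text.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a dense tensor into a chain of 3rd-order TT cores
    via sequential truncated SVDs (a sketch of the TT-SVD procedure)."""
    dims = tensor.shape
    cores = []
    rank_prev = 1
    remainder = tensor
    for k in range(len(dims) - 1):
        # Unfold: (previous rank * current mode size) x (all remaining modes)
        unfolding = remainder.reshape(rank_prev * dims[k], -1)
        U, S, Vt = np.linalg.svd(unfolding, full_matrices=False)
        rank = min(max_rank, S.size)               # truncate to the TT rank
        cores.append(U[:, :rank].reshape(rank_prev, dims[k], rank))
        remainder = S[:rank, None] * Vt[:rank, :]  # carry the rest forward
        rank_prev = rank
    cores.append(remainder.reshape(rank_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(full.ndim - 1, 0))
    return full[0, ..., 0]  # drop the boundary ranks of size 1

# Example: compress a random 4th-order tensor and check the relative error
X = np.random.rand(8, 8, 8, 8)
cores = tt_svd(X, max_rank=20)
X_hat = tt_reconstruct(cores)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

For an Nth-order tensor with mode sizes n and TT ranks bounded by r, storage drops from n^N entries for the full array to roughly N·n·r^2 entries for the cores, which is the sense in which the abstract speaks of super-compressed representations and of alleviating the curse of dimensionality.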


Related research

Multiscale Analysis for Higher-order Tensors (04/27/2017)
The widespread use of multisensor technology and the emergence of big da...

p-order Tensor Products with Invertible Linear Transforms (05/23/2020)
This paper studies the issues about tensors. Three typical kinds of tens...

Tensor decompositions and algorithms, with applications to tensor learning (10/12/2021)
A new algorithm of the canonical polyadic decomposition (CPD) presented ...

Efficient N-Dimensional Convolutions via Higher-Order Factorization (06/14/2019)
With the unprecedented success of deep convolutional neural networks cam...

Tensor-Train Networks for Learning Predictive Modeling of Multidimensional Data (01/22/2021)
Deep neural networks have attracted the attention of the machine learnin...

Tensor Networks for Multi-Modal Non-Euclidean Data (03/27/2021)
Modern data sources are typically of large scale and multi-modal natures...

Decoupling multivariate functions using a nonparametric filtered tensor decomposition (05/23/2022)
Multivariate functions emerge naturally in a wide variety of data-driven...
