Efficient Structure-preserving Support Tensor Train Machine

by Kirandeep Kour, et al.

Exploiting the multi-relational tensor structure of a high-dimensional feature space improves the performance of machine learning algorithms: vectorizing the data both incurs the curse of dimensionality and destroys the inherent data structure. To capture the nonlinear relationships in tensor data more economically, we propose the Tensor Train Multi-way Multi-level Kernel (TT-MMK). The technique combines three ingredients: kernel filtering of the initial input data (the Kernelized Tensor Train, KTT), a stable reparametrization of the KTT in the Canonical Polyadic (CP) format, and the Dual Structure-preserving Support Vector Machine (SVM) Kernel, which reveals nonlinear relationships. We demonstrate numerically that TT-MMK is computationally more reliable, is less sensitive to tuning parameters, and yields higher prediction accuracy in SVM classification than comparable tensorised SVM methods.
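The dual structure-preserving kernel mentioned in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes both tensors are already given by their CP factor matrices (one `(rank, dim)` array per mode) and uses a Gaussian RBF kernel between factor vectors; the function names and the `gamma` parameter are illustrative choices.

```python
import numpy as np


def rbf(u, v, gamma=1.0):
    # Gaussian RBF kernel between two factor vectors.
    return np.exp(-gamma * np.sum((u - v) ** 2))


def dusk_kernel(X_factors, Y_factors, gamma=1.0):
    """Structure-preserving kernel between two CP-decomposed tensors.

    X_factors / Y_factors are lists of factor matrices, one per tensor
    mode; X_factors[n] has shape (rank_X, dim_n).  The kernel sums, over
    all pairs of rank-one components, the product of mode-wise RBF
    kernels, so the multiway structure is never flattened away.
    """
    rank_x = X_factors[0].shape[0]
    rank_y = Y_factors[0].shape[0]
    total = 0.0
    for i in range(rank_x):
        for j in range(rank_y):
            prod = 1.0
            for Xn, Yn in zip(X_factors, Y_factors):
                prod *= rbf(Xn[i], Yn[j], gamma)
            total += prod
    return total
```

The resulting scalar similarity can be supplied to a standard SVM dual solver as a precomputed kernel matrix; summing over rank pairs rather than vectorizing the tensors is what preserves the structure the abstract refers to.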



Kernelized Support Tensor Train Machines

Tensor, a multi-dimensional data structure, has been exploited recently ...

A Support Tensor Train Machine

There has been growing interest in extending traditional vector-based ma...

Nonlinear Kernel Support Vector Machine with 0-1 Soft Margin Loss

Recent advance on linear support vector machine with the 0-1 soft margin...

Tensor Network Kalman Filtering for Large-Scale LS-SVMs

Least squares support vector machines are a commonly used supervised lea...

Parallelized Tensor Train Learning of Polynomial Classifiers

In pattern classification, polynomial classifiers are well-studied metho...

m-arcsinh: An Efficient and Reliable Function for SVM and MLP in scikit-learn

This paper describes the 'm-arcsinh', a modified ('m-') version of the i...

Supervised Learning for Non-Sequential Data with the Canonical Polyadic Decomposition

There has recently been increasing interest, both theoretical and practi...