A Nonlinear Kernel Support Matrix Machine for Matrix Learning
Tensors are a natural and compact representation for real-world data, which are often multi-dimensional, and problems of supervised tensor learning (STL) are commonly encountered in applications. Most existing classifiers based on tensor representation, such as the support tensor machine (STM), must be solved iteratively, which is time-consuming and may suffer from local minima. In this paper, we present a kernel support matrix machine (KSMM) built on the matrix Hilbert space to perform supervised learning when data are represented as matrices. KSMM is a general framework for constructing a matrix-based hyperplane to exploit structural information. We analyze a unifying optimization problem for which we propose an asymptotically convergent algorithm. The goal is both to determine the hyperplane and to predict the labels of unlabeled samples. Theoretical generalization bounds are derived based on the Rademacher complexity with respect to a probability distribution. We demonstrate the merits of the proposed method through extensive experiments on a simulation study and a number of real-world datasets from a variety of application domains.
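To make the setting concrete, the sketch below illustrates the general idea of classifying matrix-valued samples with a kernel machine: a Gaussian kernel built from the Frobenius distance between matrices is precomputed and passed to a standard SVM solver. This is only a hedged illustration of matrix-kernel learning, not the paper's KSMM or its matrix Hilbert space construction; the toy data, the bandwidth `gamma`, and the use of scikit-learn's `SVC` are assumptions made for demonstration.

```python
# Minimal sketch (assumption): a kernel SVM over matrix-valued samples,
# using a Gaussian kernel on the Frobenius distance between matrices.
# This is NOT the paper's KSMM, only an illustration of matrix-based kernel learning.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_samples(n, label):
    """Toy matrix data: class 0 is noise, class 1 is noise plus a rank-1 structure."""
    samples = []
    for _ in range(n):
        M = rng.normal(size=(10, 8))
        if label == 1:
            u, v = rng.normal(size=(10, 1)), rng.normal(size=(1, 8))
            M += 1.5 * u @ v  # inject a low-rank signal
        samples.append(M)
    return samples

X = make_samples(60, 0) + make_samples(60, 1)
y = np.array([0] * 60 + [1] * 60)

def frobenius_rbf(Xa, Xb, gamma=0.01):
    """Gaussian kernel K(A, B) = exp(-gamma * ||A - B||_F^2) between matrices."""
    K = np.empty((len(Xa), len(Xb)))
    for i, A in enumerate(Xa):
        for j, B in enumerate(Xb):
            K[i, j] = np.exp(-gamma * np.linalg.norm(A - B, "fro") ** 2)
    return K

# Train/test split and an SVM with a precomputed matrix kernel.
idx = rng.permutation(len(X))
tr, te = idx[:90], idx[90:]
X_tr, X_te = [X[i] for i in tr], [X[i] for i in te]
clf = SVC(kernel="precomputed", C=1.0)
clf.fit(frobenius_rbf(X_tr, X_tr), y[tr])
print("test accuracy:", clf.score(frobenius_rbf(X_te, X_tr), y[te]))
```

Note that this baseline flattens structural information into a single Frobenius distance, whereas the KSMM's motivation is precisely to exploit the matrix structure through a kernel defined in the matrix Hilbert space.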