Disturbance Grassmann Kernels for Subspace-Based Learning

02/10/2018
by Junyuan Hong, et al.

In this paper, we focus on subspace-based learning problems, where data elements are linear subspaces rather than vectors. To handle such data, Grassmann kernels were proposed to measure the subspace structure and to be used with classifiers such as Support Vector Machines (SVMs). However, existing discriminative algorithms mostly ignore the instability of subspaces, which can mislead classifiers through disturbed instances. We therefore propose accounting for all potential disturbances of subspaces during learning to obtain more robust classifiers. First, we derive the dual optimization of linear classifiers under disturbance subject to a known distribution, which yields a new kernel, the Disturbance Grassmann (DG) kernel. Second, we investigate two kinds of disturbance, relating to the subspace matrix and to the singular values of its bases, with which we extend the Projection kernel on Grassmann manifolds to two new kernels. Experiments on action data show that the proposed kernels outperform state-of-the-art subspace-based methods, even under more adverse conditions.
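For context, below is a minimal sketch of the standard Projection kernel on the Grassmann manifold, K(Y1, Y2) = ||Y1^T Y2||_F^2 for orthonormal bases Y1 and Y2, used with a precomputed-kernel SVM. This is the baseline kernel the paper extends; the proposed DG kernels additionally model subspace disturbance and are not reproduced here. The data dimensions and helper names are illustrative only.

```python
import numpy as np
from sklearn.svm import SVC

def subspace_basis(X, k):
    """Orthonormal basis of the top-k left singular subspace of a
    data matrix X (features x samples), e.g. frames of one clip."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

def projection_kernel(bases_a, bases_b):
    """Projection kernel: K(Y1, Y2) = ||Y1^T Y2||_F^2."""
    K = np.zeros((len(bases_a), len(bases_b)))
    for i, Y1 in enumerate(bases_a):
        for j, Y2 in enumerate(bases_b):
            K[i, j] = np.linalg.norm(Y1.T @ Y2, 'fro') ** 2
    return K

# Toy usage: each instance is a k-dimensional subspace of R^d.
rng = np.random.default_rng(0)
d, k, n = 20, 3, 40
bases = [subspace_basis(rng.standard_normal((d, 10)), k) for _ in range(n)]
labels = rng.integers(0, 2, size=n)

K_train = projection_kernel(bases, bases)
clf = SVC(kernel='precomputed').fit(K_train, labels)
print(clf.score(K_train, labels))
```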


