Kernel-Induced Label Propagation by Mapping for Semi-Supervised Classification

05/29/2019
by Zhao Zhang, et al.

Kernel methods have been successfully applied to pattern recognition and data mining. In this paper, we mainly discuss the issue of propagating labels in kernel space. A Kernel-Induced Label Propagation (Kernel-LP) framework by mapping is proposed for high-dimensional data classification, using the most informative patterns of the data in kernel space. The essence of Kernel-LP is to perform joint label propagation and adaptive weight learning in a transformed kernel space. That is, our Kernel-LP moves the task of label propagation from the Euclidean space commonly used in most existing work to kernel space. The motivation of Kernel-LP is to propagate labels and learn the adaptive weights jointly under the assumption of an inner product space of inputs, i.e., inputs that are linearly inseparable in the original space may become separable after being mapped into kernel space. Kernel-LP builds on an existing positive-and-negative LP model, i.e., the effects of negative label information are integrated to improve the label prediction power. Also, Kernel-LP performs adaptive weight construction over the same kernel space, so it avoids the tricky process of choosing an optimal neighborhood size from which traditional criteria suffer. Two novel and efficient out-of-sample approaches for Kernel-LP to handle new test data are also presented, i.e., (1) direct kernel mapping and (2) kernel mapping-induced label reconstruction, both of which depend only on the kernel matrix between the training and test sets. Owing to the kernel trick, our algorithms are applicable to high-dimensional real-world data. Extensive results on real datasets demonstrate the effectiveness of our approach.
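For intuition, the following is a minimal sketch of label propagation over a kernel-induced affinity graph. It is not the Kernel-LP algorithm from the paper: it omits the joint adaptive weight learning and the negative-label terms, and simply uses a fixed RBF kernel with standard symmetric normalization and iterative propagation. The function names and the gamma/alpha settings below are illustrative assumptions.

```python
# Illustrative sketch only: generic label propagation on an RBF-kernel graph,
# not the paper's Kernel-LP (no joint weight learning, no negative labels).
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_label_propagation(X, y, alpha=0.9, gamma=1.0, n_iter=100):
    """Propagate labels over the RBF-kernel affinity graph.

    X : (n, d) inputs; y : (n,) labels in {0..c-1}, with -1 for unlabeled.
    Returns (predicted labels for all n samples, soft label matrix F).
    """
    n = X.shape[0]
    classes = np.unique(y[y >= 0])

    # Initial label matrix: one-hot rows for labeled points, zeros otherwise.
    F0 = np.zeros((n, classes.size))
    for idx, c in enumerate(classes):
        F0[y == c, idx] = 1.0

    # Kernel-induced affinity with zero self-similarity, then
    # symmetric normalization S = D^{-1/2} K D^{-1/2}.
    K = rbf_kernel(X, X, gamma)
    np.fill_diagonal(K, 0.0)
    d = K.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = K * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Iterative propagation: F <- alpha * S @ F + (1 - alpha) * F0.
    F = F0.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1.0 - alpha) * F0
    return classes[F.argmax(axis=1)], F
```

In the same spirit as the out-of-sample extensions described above, a held-out sample could then be soft-labeled by a kernel-weighted reconstruction, e.g., row-normalizing rbf_kernel(X_test, X_train) and multiplying it with the soft label matrix F learned on the training set, so that only the kernel matrix between the training and test sets is needed.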


Related research

- Prototypical Networks for Multi-Label Learning (11/17/2019)
- Joint Label Prediction based Semi-Supervised Adaptive Concept Factorization for Robust Data Representation (05/25/2019)
- GraphHop: An Enhanced Label Propagation Method for Node Classification (01/07/2021)
- Simplifying Node Classification on Heterophilous Graphs with Compatible Label Propagation (05/19/2022)
- Kernelized Support Tensor Train Machines (01/02/2020)
- Similarity Learning via Kernel Preserving Embedding (03/11/2019)
- Projection-free Graph-based Classifier Learning using Gershgorin Disc Perfect Alignment (06/03/2021)
