Related research:
- Multi-Label Learning with Label Enhancement. Multi-label learning deals with training instances associated with multi...
- Group Preserving Label Embedding for Multi-Label Classification. Multi-label learning is concerned with the classification of data with m...
- Online Multi-Label Classification: A Label Compression Method. Many modern applications deal with multi-label data, such as functional ...
- Bidirectional Loss Function for Label Enhancement and Distribution Learning. Label distribution learning (LDL) is an interpretable and general learni...
- Nearest Labelset Using Double Distances for Multi-label Classification. Multi-label classification is a type of supervised learning where an ins...
- Multi-Label Learning with Deep Forest. In multi-label learning, each instance is associated with multiple label...
- Disentangled Variational Autoencoder based Multi-Label Classification with Covariance-Aware Multivariate Probit Model. Multi-label classification is the challenging task of predicting the pre...
Compact Learning for Multi-Label Classification
Multi-label classification (MLC) studies the problem where each instance is associated with multiple relevant labels, which leads to exponential growth of the output space. This has motivated a popular framework named label compression (LC), which captures label dependency through dimension reduction. Nevertheless, most existing LC methods fail to consider the influence of the feature space, or are misguided by problematic original features, which may result in performance degradation. In this paper, we present a compact learning (CL) framework that embeds the features and labels simultaneously and with mutual guidance. The proposal is a versatile concept: the embedding method is arbitrary and independent of the subsequent learning process. Following this spirit, a simple yet effective implementation called compact multi-label learning (CMLL) is proposed to learn a compact low-dimensional representation of both spaces. CMLL maximizes the dependence between the embedded label and feature spaces while minimizing the loss of label-space recovery. Theoretically, we provide a general analysis for different embedding methods. Practically, we conduct extensive experiments to validate the effectiveness of the proposed method.
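The abstract does not include the CMLL formulation or any code, so the following is a minimal, hypothetical sketch of the idea it describes: compress the label space into a low-dimensional code whose dependence on the feature space is maximized (here with a linear, HSIC-style criterion), while fitting a decoder that minimizes the label-recovery loss. The function name, the linear kernel, and the ridge decoder are illustrative assumptions, not the authors' implementation.

import numpy as np

def compact_label_embedding(X, Y, d, reg=1e-3):
    # Hypothetical sketch of a CMLL-style compression step (not the authors' code):
    # find a d-dimensional label embedding whose dependence on the features is
    # maximal under a linear kernel, then fit a ridge decoder back to the labels.
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    K = X @ X.T                                    # linear feature kernel (assumption)
    M = Y.T @ H @ K @ H @ Y                        # HSIC-style label/feature dependence
    eigvals, eigvecs = np.linalg.eigh(M)
    V = eigvecs[:, np.argsort(eigvals)[::-1][:d]]  # top-d directions of the label space
    Z = Y @ V                                      # compressed label codes (n x d)
    # ridge decoder minimizing the label-recovery loss ||Y - Z W||^2
    W = np.linalg.solve(Z.T @ Z + reg * np.eye(d), Z.T @ Y)
    return V, W, Z

# toy usage: 100 instances, 20 features, 15 labels compressed to 5 dimensions
X = np.random.randn(100, 20)
Y = (np.random.rand(100, 15) > 0.7).astype(float)
V, W, Z = compact_label_embedding(X, Y, d=5)
Y_scores = Z @ W                                   # recovered label scores

In a full pipeline, a regression model from the features to the codes Z would be trained for use at prediction time; the decoder step above is shown only to illustrate the recovery-loss part of the objective.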