Minor Constraint Disturbances for Deep Semi-supervised Learning

03/13/2020
by Jielei Chu, et al.

In high-dimensional data spaces, semi-supervised feature learning based on Euclidean distance is unstable under a broad range of conditions. Furthermore, the scarcity and high cost of labels motivate semi-supervised learning methods that use as few labels as possible. In this paper, we develop a novel Minor Constraint Disturbances-based Deep Semi-supervised Feature Learning framework (MCD-DSFL) that learns feature representations from the perspective of probability distributions. The framework has two fundamental modules: a Minor Constraint Disturbances-based restricted Boltzmann machine with Gaussian visible units (MCDGRBM) for modelling continuous data, and a Minor Constraint Disturbances-based restricted Boltzmann machine (MCDRBM) for modelling binary data. The Minor Constraint Disturbances (MCD) consist of a small number of instance-level constraints, produced from only two randomly selected labeled samples per class. The Kullback-Leibler (KL) divergences of the MCD are fused into Contrastive Divergence (CD) learning to train the proposed MCDGRBM and MCDRBM models, so that the probability distributions of hidden-layer features become as similar as possible within the same class and as dissimilar as possible across different classes. Although the MCD exert only a weak influence on the shallow models (MCDGRBM and MCDRBM), the proposed deep MCD-DSFL framework improves representation capability significantly through their leverage effect. The semi-supervised strategy based on the KL divergence of the MCD substantially reduces the reliance on labels while improving the stability of semi-supervised feature learning in high-dimensional space.
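To make the constraint-fusion idea concrete, the sketch below shows CD-1 training of a binary RBM with an added KL-divergence penalty over must-link pairs built from two labeled samples per class. It is an illustrative assumption, not the authors' released code: the function names, the `kl_weight` hyper-parameter, and the exact way the constraint gradient is fused with the CD gradient are all hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def build_mcd_pairs(labels, labeled_idx, rng):
    """Form one must-link pair per class from two randomly chosen
    labeled samples (a hypothetical reading of the MCD construction)."""
    pairs = []
    for cls in np.unique(labels[labeled_idx]):
        idx = labeled_idx[labels[labeled_idx] == cls]
        pairs.append(tuple(rng.choice(idx, size=2, replace=False)))
    return pairs

def cd1_step_with_mcd(W, b, c, v0, pairs, lr=0.01, kl_weight=0.1):
    """One CD-1 update for a binary RBM, plus a penalty that pulls the
    hidden-unit distributions of must-link pairs together via a KL term.
    The fusion rule here is a sketch, not the paper's exact update."""
    # Positive phase
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (np.random.rand(*h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step
    v1_prob = sigmoid(h0 @ W.T + b)
    h1_prob = sigmoid(v1_prob @ W + c)
    # Standard CD-1 (log-likelihood ascent) gradients
    dW = v0.T @ h0_prob - v1_prob.T @ h1_prob
    db = (v0 - v1_prob).sum(axis=0)
    dc = (h0_prob - h1_prob).sum(axis=0)
    # MCD term: for each must-link pair (i, j), descend the gradient of
    # KL(p_i || p_j) with respect to p_i (the symmetric part is analogous).
    for i, j in pairs:
        p = np.clip(h0_prob[i], 1e-6, 1 - 1e-6)
        q = np.clip(h0_prob[j], 1e-6, 1 - 1e-6)
        g = np.log(p / (1 - p)) - np.log(q / (1 - q))  # dKL/dp, elementwise
        chain = g * p * (1 - p)                        # through the sigmoid
        dW -= kl_weight * np.outer(v0[i], chain)
        dc -= kl_weight * chain
    # Gradient-ascent parameter update
    n = v0.shape[0]
    W += lr * dW / n
    b += lr * db / n
    c += lr * dc / n
    return W, b, c
```

For the Gaussian-visible variant (MCDGRBM), the binary reconstruction `v1_prob` would be replaced by a Gaussian reconstruction mean, while the constraint term on the hidden probabilities stays the same.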

