Semi-Supervised Learning by Disentangling and Self-Ensembling Over Stochastic Latent Space

07/22/2019
by Prashnna Kumar Gyawali, et al.

The success of deep learning in medical imaging is mostly achieved at the cost of large labeled data sets. Semi-supervised learning (SSL) provides a promising solution by leveraging the structure of unlabeled data to improve learning from a small set of labeled data. Self-ensembling is a simple approach used in SSL to encourage consensus among ensemble predictions of unknown labels, improving the model's generalization by making it less sensitive to perturbations of the latent space. Currently, such an ensemble is obtained through randomization techniques such as dropout regularization and random data augmentation. In this work, we hypothesize -- from the generalization perspective -- that self-ensembling can be improved by exploiting the stochasticity of a disentangled latent space. To this end, we present a stacked SSL model that utilizes unsupervised disentangled representation learning as the stochastic embedding for self-ensembling. We evaluate the presented model for multi-label classification using chest X-ray images, demonstrating its improved performance over related SSL models as well as the interpretability of its disentangled representations.
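The core mechanism the abstract describes can be illustrated with a minimal sketch: draw multiple samples from a stochastic (e.g., VAE-style Gaussian) latent posterior and penalize disagreement between the classifier's predictions on those draws. The function and variable names below are illustrative assumptions, not the paper's actual implementation, and a toy linear softmax head stands in for the real classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_latent(mu, log_var, rng):
    # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def classifier(z, W, b):
    # Toy linear classifier head with a softmax over classes.
    logits = z @ W + b
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def consistency_loss(mu, log_var, W, b, rng):
    # Two stochastic draws from the same latent posterior form the
    # "ensemble"; the loss penalizes disagreement between their predictions,
    # which needs no labels and so applies to unlabeled data.
    p1 = classifier(sample_latent(mu, log_var, rng), W, b)
    p2 = classifier(sample_latent(mu, log_var, rng), W, b)
    return np.mean((p1 - p2) ** 2)

# Toy example: a batch of 4 encodings with 8 latent dims and 3 classes.
mu = rng.standard_normal((4, 8))
log_var = np.full((4, 8), -2.0)  # small posterior variance
W = rng.standard_normal((8, 3))
b = np.zeros(3)
loss = consistency_loss(mu, log_var, W, b, rng)
print(loss)
```

In a full SSL setup, this unlabeled consistency term would be added (with a weighting coefficient) to the usual supervised cross-entropy on the small labeled set; the randomness here comes from the latent sampling itself rather than from dropout or input augmentation.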

