Semi-Conditional Normalizing Flows for Semi-Supervised Learning

05/01/2019
by Andrei Atanov et al.

This paper proposes a semi-conditional normalizing flow model for semi-supervised learning. The model uses both labeled and unlabeled data to learn an explicit model of the joint distribution over objects and labels. The semi-conditional architecture makes it efficient to compute both the value and the gradients of the marginal likelihood for unlabeled objects. The conditional part of the model is built from a proposed conditional coupling layer. We demonstrate the model's performance on the semi-supervised classification problem across several datasets; on MNIST, it outperforms a baseline approach based on variational auto-encoders.
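To make the idea of a conditional coupling layer concrete, here is a minimal NumPy sketch, assuming an affine coupling design in which the scale and translation functions also receive a one-hot class label. The random linear maps standing in for those networks, and all names and shapes, are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

D, K = 4, 3  # data dimension (split in half) and number of classes
# Hypothetical "networks": random linear maps of [x1, one_hot(label)].
Ws = rng.normal(scale=0.1, size=(D // 2 + K, D // 2))
Wt = rng.normal(scale=0.1, size=(D // 2 + K, D // 2))

def one_hot(y, K=K):
    e = np.zeros(K)
    e[y] = 1.0
    return e

def forward(x, y):
    """Conditional affine coupling: keep x1, transform x2 given (x1, y)."""
    x1, x2 = x[: D // 2], x[D // 2 :]
    h = np.concatenate([x1, one_hot(y)])
    s, t = h @ Ws, h @ Wt
    z2 = x2 * np.exp(s) + t       # elementwise affine transform
    log_det = s.sum()             # log |det Jacobian| is cheap to compute
    return np.concatenate([x1, z2]), log_det

def inverse(z, y):
    """Exact inverse: the same (s, t) are recoverable from z1 and y."""
    z1, z2 = z[: D // 2], z[D // 2 :]
    h = np.concatenate([z1, one_hot(y)])
    s, t = h @ Ws, h @ Wt
    x2 = (z2 - t) * np.exp(-s)
    return np.concatenate([z1, x2])

x = rng.normal(size=D)
z, log_det = forward(x, y=1)
x_rec = inverse(z, y=1)
print(np.allclose(x, x_rec))  # True: the layer is exactly invertible
```

Because the transform is invertible with a tractable Jacobian for every label, the marginal likelihood of an unlabeled object can in principle be obtained by summing the conditional densities over the label values, which is what makes such layers usable in a semi-supervised flow.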


