Bridging Disentanglement with Independence and Conditional Independence via Mutual Information for Representation Learning

11/25/2019
by   Xiaojiang Yang, et al.

Existing work on disentangled representation learning usually rests on a common assumption: all factors in a disentangled representation should be independent. This assumption concerns only the internal properties of disentangled representations and ignores their relation to external data. To address this, we propose another assumption that establishes an important relation between data and its disentangled representations via mutual information: the mutual information between each factor of a disentangled representation and the data should be invariant to the other factors. We formulate this assumption as mathematical equations and theoretically connect it with the independence and conditional independence of factors. We also show that conditional independence is satisfied in the encoders of VAEs because of the factorized noise in the reparameterization trick. To highlight the importance of the proposed assumption, we show experimentally that violating it leads to a dramatic decline in disentanglement. Based on this assumption, we further propose splitting the deeper layers of the encoder so that their parameters are not shared across factors. The resulting encoder, called the Split Encoder, can be applied to models that penalize total correlation, and yields significant improvements in unsupervised learning of disentangled representations and in reconstruction quality.
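The abstract's two architectural points can be illustrated with a minimal sketch: shallow layers shared across factors, deeper per-factor heads with no shared parameters, and factorized Gaussian noise in the reparameterization step. The exact layer sizes, activations, and helper names below are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(in_dim, out_dim):
    # Hypothetical helper: a randomly initialized dense layer (weights, bias).
    return rng.normal(0.0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

def relu_layer(x, layer):
    W, b = layer
    return np.maximum(x @ W + b, 0.0)

# Shallow layers are shared: they extract features common to all factors.
shared = dense(64, 32)

# Deeper layers are split: one small head per latent factor, so no
# parameters in the deeper layers are shared across factors (the idea
# behind the Split Encoder; sizes here are illustrative).
n_factors = 4
heads = [(dense(32, 16), dense(16, 2)) for _ in range(n_factors)]

def encode(x):
    h = relu_layer(x, shared)
    mus, log_vars = [], []
    for hidden, out in heads:
        z = relu_layer(h, hidden)
        W, b = out
        mu, log_var = z @ W + b  # each head emits (mu, log_var) for one factor
        mus.append(mu)
        log_vars.append(log_var)
    return np.array(mus), np.array(log_vars)

def reparameterize(mu, log_var):
    # Factorized Gaussian noise: each factor draws its own independent eps,
    # which is why q(z|x) factorizes across dimensions in VAE encoders.
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

x = rng.normal(size=64)
mu, log_var = encode(x)
z = reparameterize(mu, log_var)
print(z.shape)  # one latent value per factor
```

Because each factor's mean and variance come from a disjoint set of deep-layer parameters, gradients for one factor cannot directly reshape another factor's head, which is the intended benefit of the split.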

Related research

04/18/2019 · Disentangled Representation Learning with Information Maximizing Autoencoder
Learning disentangled representation from any unlabelled data is a non-t...

08/06/2022 · HSIC-InfoGAN: Learning Unsupervised Disentangled Representations by Maximising Approximated Mutual Information
Learning disentangled representations requires either supervision or the...

12/09/2019 · Learning Disentangled Representations via Mutual Information Estimation
In this paper, we investigate the problem of learning disentangled repre...

04/01/2022 · Learning Disentangled Representations of Negation and Uncertainty
Negation and uncertainty modeling are long-standing tasks in natural lan...

12/29/2021 · Disentanglement and Generalization Under Correlation Shifts
Correlations between factors of variation are prevalent in real-world da...

06/02/2022 · Learning Disentangled Representations for Counterfactual Regression via Mutual Information Minimization
Learning individual-level treatment effect is a fundamental problem in c...

07/19/2023 · DisCover: Disentangled Music Representation Learning for Cover Song Identification
In the field of music information retrieval (MIR), cover song identifica...
