Compositional Generalization in Unsupervised Compositional Representation Learning: A Study on Disentanglement and Emergent Language

10/02/2022
by   Zhenlin Xu, et al.

Deep learning models struggle with compositional generalization, i.e., the ability to recognize or generate novel combinations of observed elementary concepts. In hopes of enabling compositional generalization, various unsupervised learning algorithms have been proposed with inductive biases that aim to induce compositional structure in learned representations (e.g., disentangled representation learning and emergent language learning). In this work, we evaluate these unsupervised learning algorithms in terms of how well they enable compositional generalization. Specifically, our evaluation protocol focuses on whether it is easy to train a simple model on top of the learned representation that generalizes to new combinations of compositional factors. We systematically study three unsupervised representation learning algorithms - β-VAE, β-TCVAE, and emergent language (EL) autoencoders - on two datasets that allow directly testing compositional generalization. We find that directly using the bottleneck representation with simple models and few labels may lead to worse generalization than using representations from layers before or after the learned representation itself. In addition, we find that previously proposed metrics for measuring the level of compositionality are not correlated with actual compositional generalization in our framework. Surprisingly, we find that increasing the pressure to produce a disentangled representation leads to representations with worse generalization, while representations from EL models show strong compositional generalization. Taken together, our results shed new light on the compositional generalization behavior of different unsupervised learning algorithms, provide a new setting for rigorously testing this behavior, and suggest the potential benefits of developing EL learning algorithms for more generalizable representations.
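
The evaluation protocol described above amounts to a probe-style experiment: freeze the unsupervised encoder, train a simple readout model on representations of factor combinations seen during probe training, and measure accuracy on held-out novel combinations. The sketch below illustrates this setup in Python under stated assumptions; the encode function, the data arrays, and the held-out split are hypothetical placeholders, not the paper's actual code or data.

    # Illustrative sketch of the probe-based evaluation described in the abstract.
    # `encode`, `images`, `factors`, and `held_out_combos` are hypothetical stand-ins.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    def compositional_split(factors, held_out_combos):
        """Boolean mask selecting samples whose factor combination is held out."""
        combos = [tuple(row) for row in factors]
        return np.array([c in held_out_combos for c in combos])

    def evaluate_compositional_generalization(encode, images, factors,
                                              target_factor, held_out_combos):
        """Train a simple probe on seen combinations, test on unseen ones.

        encode:          frozen encoder mapping observations to representation vectors
        images:          array of observations, shape (N, ...)
        factors:         integer ground-truth factors, shape (N, num_factors)
        target_factor:   index of the factor the probe predicts
        held_out_combos: set of factor tuples excluded from probe training
        """
        reps = encode(images)                      # (N, d) representations, assumed numpy
        test_mask = compositional_split(factors, held_out_combos)
        train_mask = ~test_mask

        probe = LogisticRegression(max_iter=1000)  # the "simple model" readout
        probe.fit(reps[train_mask], factors[train_mask, target_factor])

        preds = probe.predict(reps[test_mask])
        return accuracy_score(factors[test_mask, target_factor], preds)

In this sketch, a disentanglement- or EL-trained encoder would plug in as encode (or be replaced by a layer before or after the bottleneck), and compositional generalization is read off as the probe's accuracy on the unseen factor combinations.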


