Evaluating Combinatorial Generalization in Variational Autoencoders

11/11/2019
by Alican Bozkurt et al.

We evaluate the ability of variational autoencoders to generalize to unseen examples in domains with a large combinatorial space of feature values. Our experiments systematically evaluate the effect of network width, depth, regularization, and the typical distance between the training and test examples. Increasing network capacity benefits generalization in easy problems, where test-set examples are similar to training examples. In more difficult problems, increasing capacity deteriorates generalization when optimizing the standard VAE objective, but once again improves generalization when we decrease the KL regularization. Our results establish that the interplay between model capacity and KL regularization is not clear-cut; we need to take the typical distance between train and test examples into account when evaluating generalization.
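The abstract refers to "decreasing the KL regularization" in the VAE objective. A common way to do this is to weight the KL term with a coefficient β < 1 (as in the β-VAE family); the sketch below is a minimal, illustrative NumPy version of such a weighted negative ELBO, assuming a diagonal Gaussian encoder and a unit-variance Gaussian likelihood — it is not the paper's exact implementation:

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=1.0):
    """Negative ELBO with a weighted KL term.

    beta < 1 weakens the KL regularization, as discussed in the abstract.
    Assumes a diagonal Gaussian encoder q(z|x) = N(mu, diag(exp(log_var)))
    and a unit-variance Gaussian likelihood (squared-error reconstruction).
    """
    # Reconstruction term: per-example squared error, summed over features
    recon = 0.5 * np.sum((x - x_recon) ** 2, axis=-1)
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ) for a diagonal Gaussian
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)
    # Average over the batch; beta scales only the KL penalty
    return np.mean(recon + beta * kl)
```

With perfect reconstruction and a posterior matching the prior (mu = 0, log_var = 0), the loss is zero; lowering β leaves the reconstruction term untouched and shrinks only the regularization penalty.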


