A Variational AutoEncoder for Transformers with Nonparametric Variational Information Bottleneck

07/27/2022
by James Henderson, et al.

We propose a VAE for Transformers by developing a variational information bottleneck regulariser for Transformer embeddings. We formalise the embedding space of Transformer encoders as mixture probability distributions, and use Bayesian nonparametrics to derive a nonparametric variational information bottleneck (NVIB) for such attention-based embeddings. The variable number of mixture components supported by nonparametric methods captures the variable number of vectors supported by attention, and the exchangeability of our nonparametric distributions captures the permutation invariance of attention. This allows NVIB to regularise the number of vectors accessible with attention, as well as the amount of information in individual vectors. By regularising the cross-attention of a Transformer encoder-decoder with NVIB, we propose a nonparametric variational autoencoder (NVAE). Initial experiments on training an NVAE on natural language text show that the induced embedding space has the desired properties of a VAE for Transformers.
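The abstract is compact, so a sketch may help make the setup concrete. The toy PyTorch model below illustrates the general recipe the abstract describes: the encoder output is read as parameters of a latent distribution over a set of vectors, the decoder cross-attends only to sampled, gated vectors, and the loss adds one term limiting the information in each vector and another limiting how many vectors remain accessible. This is a minimal Gaussian-plus-gating approximation for illustration only, not the paper's NVIB (which is derived from Bayesian nonparametric mixture distributions); every name here (BottleneckedTransformerAE, to_gate, the 0.1 loss weights) is hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BottleneckedTransformerAE(nn.Module):
    """Encoder-decoder whose cross-attention memory is a sampled, gated
    set of latent vectors (an illustrative VIB, not the paper's NVIB)."""
    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers)
        # Per-token Gaussian parameters plus a gate logit (hypothetical names).
        self.to_mu = nn.Linear(d_model, d_model)
        self.to_logvar = nn.Linear(d_model, d_model)
        self.to_gate = nn.Linear(d_model, 1)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        h = self.encoder(self.embed(src))                    # (B, S, D)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterised sample: one latent vector per source position.
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        # Soft gate in [0, 1]: gates near 0 remove a vector from the
        # memory, shrinking the set that cross-attention can access.
        gate = torch.sigmoid(self.to_gate(h))                # (B, S, 1)
        memory = z * gate
        T = tgt.size(1)
        causal = torch.triu(torch.full((T, T), float('-inf')), diagonal=1)
        dec = self.decoder(self.embed(tgt), memory, tgt_mask=causal)
        logits = self.out(dec)
        # Gaussian KL to N(0, I): limits per-vector information content.
        kl_per_vec = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        # Mean gate value: penalising it limits the number of active vectors.
        return logits, kl_per_vec.mean(), gate.mean()

# Toy usage: reconstruct random token sequences under the bottleneck.
model = BottleneckedTransformerAE(vocab_size=1000)
src = torch.randint(0, 1000, (8, 16))
logits, kl_vec, n_active = model(src, src)
loss = (F.cross_entropy(logits.reshape(-1, 1000), src.reshape(-1))
        + 0.1 * kl_vec + 0.1 * n_active)
loss.backward()

The two regularisation terms stand in for the two effects the abstract attributes to NVIB: bounding the information carried by each individual vector, and bounding the number of vectors the decoder can attend to. The nonparametric machinery of the actual paper replaces this fixed Gaussian/gate parameterisation with exchangeable mixture distributions over a variable number of components.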


Related research

01/04/2021 · Transformer-based Conditional Variational Autoencoder for Controllable Story Generation
We investigate large-scale latent variable models (LVMs) for neural stor...

08/23/2021 · Regularizing Transformers With Deep Probabilistic Layers
Language models (LMs) have grown non-stop in the last decade, from s...

10/28/2022 · Multimodal Transformer for Parallel Concatenated Variational Autoencoders
In this paper, we propose a multimodal transformer using parallel concat...

09/27/2021 · On Isotropy Calibration of Transformers
Different studies of the embedding space of transformer models suggest t...

08/05/2021 · Finetuning Pretrained Transformers into Variational Autoencoders
Text variational autoencoders (VAEs) are notorious for posterior collaps...

08/21/2023 · Analyzing Transformer Dynamics as Movement through Embedding Space
Transformer language models exhibit intelligent behaviors such as unders...

12/27/2022 · Semi-supervised multiscale dual-encoding method for faulty traffic data detection
Inspired by the recent success of deep learning in multiscale informatio...
