Latent Variable Modeling for Generative Concept Representations and Deep Generative Models

12/26/2018
by Daniel T Chang, et al.

Latent representations are the essence of deep generative models and determine their usefulness and power. For latent representations to be useful as generative concept representations, their latent space must support latent space interpolation, attribute vectors and concept vectors, among other things. We investigate and discuss latent variable modeling, including latent variable models, latent representations and latent spaces, with particular attention to hierarchical latent representations and to latent space vectors and geometry. Our focus is on latent variable modeling as used in variational autoencoders (VAEs) and generative adversarial networks (GANs).
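To make the latent-space operations mentioned in the abstract concrete, here is a minimal sketch (not from the paper): it shows linear and spherical interpolation between two latent codes and a simple attribute-vector edit. The latent codes z_a, z_b and the attribute vector v_smile are random placeholders standing in for the output of a trained VAE encoder or samples from a GAN's prior.

```python
# Sketch of latent space interpolation and attribute-vector arithmetic.
# Assumption: z_a, z_b, v_smile are stand-ins for codes produced by a
# trained generative model; here they are just random Gaussian vectors.
import numpy as np

def lerp(z_a, z_b, t):
    """Linear interpolation between two latent codes."""
    return (1.0 - t) * z_a + t * z_b

def slerp(z_a, z_b, t):
    """Spherical interpolation; often preferred when the prior is a
    high-dimensional Gaussian, so interpolants keep a typical norm."""
    cos_omega = np.dot(z_a / np.linalg.norm(z_a), z_b / np.linalg.norm(z_b))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return lerp(z_a, z_b, t)
    return (np.sin((1.0 - t) * omega) * z_a + np.sin(t * omega) * z_b) / np.sin(omega)

rng = np.random.default_rng(0)
z_a = rng.standard_normal(128)      # latent code of sample A (placeholder)
z_b = rng.standard_normal(128)      # latent code of sample B (placeholder)
v_smile = rng.standard_normal(128)  # hypothetical "smile" attribute direction

# Interpolation path between the two codes (9 steps).
path = [slerp(z_a, z_b, t) for t in np.linspace(0.0, 1.0, 9)]

# Attribute edit: move sample A along the attribute direction.
z_edited = z_a + 1.5 * v_smile
```

In practice each interpolated or edited code would be passed through the model's decoder/generator to produce the corresponding image; spherical rather than linear interpolation is a common choice when codes are drawn from a Gaussian prior, since linear midpoints have atypically small norm.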


