Latent Variable Modeling for Generative Concept Representations and Deep Generative Models

12/26/2018 ∙ by Daniel T. Chang, et al.

Latent representations are the essence of deep generative models and determine their usefulness and power. For latent representations to serve as generative concept representations, their latent space must support latent space interpolation, attribute vectors and concept vectors, among other operations. We investigate and discuss latent variable modeling, including latent variable models, latent representations and latent spaces, with particular attention to hierarchical latent representations and to latent space vectors and geometry. Our focus is on the latent variable modeling used in variational autoencoders (VAEs) and generative adversarial networks (GANs).

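As a minimal, hedged sketch of the latent space operations the abstract mentions (not the paper's own code), the example below shows linear and spherical interpolation between two latent codes and a simple attribute-vector shift. The latent dimension, the sampled codes, and the `decoder` referenced in a comment are all placeholder assumptions for illustration.

```python
import numpy as np

def lerp(z1, z2, t):
    """Linear interpolation between two latent codes."""
    return (1.0 - t) * z1 + t * z2

def slerp(z1, z2, t):
    """Spherical interpolation, often preferred when latent codes are drawn
    from a Gaussian prior, whose mass concentrates near a hypersphere."""
    cos_omega = np.dot(z1 / np.linalg.norm(z1), z2 / np.linalg.norm(z2))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return lerp(z1, z2, t)
    return (np.sin((1.0 - t) * omega) * z1 + np.sin(t * omega) * z2) / np.sin(omega)

# Hypothetical latent codes sampled from a standard Gaussian prior (dim = 64).
rng = np.random.default_rng(0)
z_a, z_b = rng.standard_normal(64), rng.standard_normal(64)

# Interpolation path through the latent space (9 intermediate codes).
path = [slerp(z_a, z_b, t) for t in np.linspace(0.0, 1.0, 9)]

# Attribute vector: difference of mean latent codes between examples that do
# and do not exhibit an attribute (placeholder groups stand in for encodings).
z_with = rng.standard_normal((100, 64)) + 0.5
z_without = rng.standard_normal((100, 64))
attr_vec = z_with.mean(axis=0) - z_without.mean(axis=0)

# Adding the attribute vector shifts a latent code toward the attribute;
# a generative model's decoder (hypothetical here) would map z_edited to data.
z_edited = z_a + 1.5 * attr_vec
```

In practice the codes would come from a trained VAE encoder or a GAN's prior, and each interpolated or edited code would be passed through the model's decoder or generator to produce samples.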