
DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures

by Andrew R. Lawrence, et al.

We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting. Our approach is based on flexible Gaussian process priors for the generative mappings and interchangeable Dirichlet process priors to learn the structure. The introduction of the Dirichlet process as a specific structural prior allows our model to circumvent issues associated with previous Gaussian process latent variable models. Inference is performed by deriving an efficient variational bound on the marginal log-likelihood of the model.
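The abstract does not give the construction details, but the core idea of a Dirichlet process structural prior can be illustrated with a truncated stick-breaking sketch: output dimensions are assigned to mixture components, and dimensions sharing a component would share one generative GP mapping. The code below is a minimal, hypothetical illustration in NumPy (the variable names, truncation level, and concentration value are assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, truncation, rng):
    """Sample mixture weights from a truncated stick-breaking construction.

    alpha:      DP concentration; small alpha favours few active components.
    truncation: number of sticks kept in the finite approximation.
    """
    betas = rng.beta(1.0, alpha, size=truncation)
    # Weight k is beta_k times the stick length remaining after sticks 1..k-1.
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    weights = betas * remaining
    weights[-1] = 1.0 - weights[:-1].sum()  # absorb leftover truncation mass
    return weights

D = 10       # number of observed output dimensions (illustrative choice)
alpha = 1.0  # concentration parameter (illustrative choice)
T = 20       # truncation level for the stick-breaking approximation

weights = stick_breaking_weights(alpha, T, rng)
# Assign each output dimension to a component; dimensions with the same
# index would share a GP mapping / hyperparameter group in a DP-GP-LVM-style model.
assignments = rng.choice(T, size=D, p=weights)
print(assignments)
```

In the actual model, these discrete assignments would be marginalised variationally rather than sampled; the sketch only shows how a Dirichlet process prior induces a partition of the output dimensions.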

