Conjugate Energy-Based Models

06/25/2021
by Hao Wu, et al.

In this paper, we propose conjugate energy-based models (CEBMs), a new class of energy-based models that define a joint density over data and latent variables. The joint density of a CEBM decomposes into an intractable distribution over data and a tractable posterior over latent variables. CEBMs have similar use cases to variational autoencoders, in the sense that they learn an unsupervised mapping from data to latent variables. However, these models omit a generator network, which allows them to learn more flexible notions of similarity between data points. Our experiments demonstrate that conjugate EBMs achieve competitive results in terms of image modelling, predictive power of the latent space, and out-of-domain detection on a variety of datasets.
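The abstract describes a joint density that factors into an intractable marginal over data and a tractable posterior over latent variables. Below is a minimal, hypothetical PyTorch sketch of one way such a decomposition can be realised with a diagonal-Gaussian latent: an encoder maps data to the natural parameters of the latent's exponential family, so the posterior over z is a closed-form Gaussian while the density over x is only known up to an intractable constant. The class name ConjugateEBM, the encoder architecture, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class ConjugateEBM(nn.Module):
    """Hypothetical sketch of a conjugate energy-based model.

    Joint density: p(x, z) ∝ exp( eta(x)^T t(z) ), with t(z) = (z, z^2).
    Posterior p(z | x) is then a tractable diagonal Gaussian, while the
    marginal over x, p(x) ∝ exp( A(eta(x)) ), stays unnormalised.
    """

    def __init__(self, x_dim=784, z_dim=32, hidden=256):
        super().__init__()
        # Encoder outputs the natural parameters (eta1, eta2) of the Gaussian latent.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * z_dim),
        )
        self.z_dim = z_dim

    def natural_params(self, x):
        eta1, raw = self.encoder(x).chunk(2, dim=-1)
        eta2 = -nn.functional.softplus(raw)  # eta2 < 0 keeps the Gaussian proper
        return eta1, eta2

    def log_partition(self, eta1, eta2):
        # Log-normaliser A(eta) of a diagonal Gaussian in natural parameterisation
        # (additive constants dropped).
        return (-eta1.pow(2) / (4 * eta2) - 0.5 * torch.log(-2 * eta2)).sum(-1)

    def unnormalised_log_marginal(self, x):
        # log p(x) up to an intractable normalising constant over x.
        return self.log_partition(*self.natural_params(x))

    def posterior(self, x):
        # Tractable Gaussian posterior p(z | x) recovered from the natural parameters.
        eta1, eta2 = self.natural_params(x)
        var = -1.0 / (2 * eta2)
        mean = eta1 * var
        return torch.distributions.Normal(mean, var.sqrt())
```

Note that training such a model still has to contend with the intractable normaliser over x (for example by drawing data-space negatives with MCMC); this sketch only illustrates the decomposition into an unnormalised marginal and a closed-form posterior.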


