Energy-Inspired Models: Learning with Sampler-Induced Distributions

10/31/2019
by Dieterich Lawson, et al.

Energy-based models (EBMs) are powerful probabilistic models, but suffer from intractable sampling and density evaluation due to the partition function. As a result, inference in EBMs relies on approximate sampling algorithms, leading to a mismatch between the model and inference. Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model. This yields a class of energy-inspired models (EIMs) that incorporate learned energy functions while still providing exact samples and tractable log-likelihood lower bounds. We describe and evaluate three instantiations of such models based on truncated rejection sampling, self-normalized importance sampling, and Hamiltonian importance sampling. These models outperform or perform comparably to the recently proposed Learned Accept/Reject Sampling algorithm and provide new insights into ranking Noise Contrastive Estimation and Contrastive Predictive Coding. Moreover, EIMs allow us to generalize a recent connection between multi-sample variational lower bounds and auxiliary variable variational inference. We show how recent variational bounds can be unified with EIMs as the variational family.
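To make the idea of a sampler-induced distribution concrete, here is a minimal sketch of one of the three instantiations the abstract names, self-normalized importance sampling (SNIS): draw k candidates from a tractable proposal, weight each by the exponentiated learned energy, and resample one candidate in proportion to its weight. The function names (`sample_snis`), the toy proposal, and the toy energy below are illustrative assumptions, not the paper's code; the paper additionally trains the energy via a log-likelihood lower bound, which this sketch omits.

```python
import math
import random

def sample_snis(propose, log_energy, k=8):
    """Draw one exact sample from the SNIS-induced distribution.

    propose    -- callable returning a sample from the proposal q (assumed)
    log_energy -- callable giving the learned log energy of a sample (assumed)
    k          -- number of proposal candidates to reweight
    """
    # 1) Draw k candidates from the tractable proposal.
    xs = [propose() for _ in range(k)]
    # 2) Compute importance weights proportional to exp(log_energy(x)),
    #    subtracting the max for numerical stability.
    logw = [log_energy(x) for x in xs]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    # 3) Resample one candidate with probability proportional to its weight
    #    (self-normalization: weights are divided by their sum implicitly).
    r = random.random() * sum(w)
    acc = 0.0
    for x, wi in zip(xs, w):
        acc += wi
        if r <= acc:
            return x
    return xs[-1]
```

With a uniform proposal on [0, 1] and a toy energy that increases in x, the induced distribution concentrates near 1, illustrating how the learned energy reshapes the proposal while sampling stays exact.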

