
Bounds all around: training energy-based models with bidirectional bounds

by Cong Geng et al.

Energy-based models (EBMs) provide an elegant framework for density estimation, but they are notoriously difficult to train. Recent work has established links to generative adversarial networks, where the EBM is trained through a minimax game with a variational value function. We propose a bidirectional bound on the EBM log-likelihood, such that we maximize a lower bound and minimize an upper bound when solving the minimax game. We link one bound to a gradient penalty that stabilizes training, thereby providing grounding for best engineering practice. To evaluate the bounds, we develop a new and efficient estimator of the Jacobian determinant of the EBM generator. We demonstrate that these developments significantly stabilize training and yield high-quality density estimation and sample generation.
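The sandwich structure described above can be illustrated on a toy problem. The sketch below is not the paper's method (no variational value function, generator network, or Jacobian-determinant estimator); it only shows the general idea of bracketing an intractable log-partition function between a lower and an upper bound, here using two standard importance-weight bounds (Jensen's inequality for the lower bound, Cauchy-Schwarz for the upper) on a 1-D Gaussian whose normalizer is known in closed form.

```python
import numpy as np

# Toy bidirectional bound on log Z for an unnormalized density
# p*(x) = exp(-E(x)) with energy E(x) = x^2 / 2, so Z = sqrt(2*pi)
# and log Z ~= 0.919. A proposal q = N(0, s^2) plays the role of the
# generator's sample distribution in the sketch.
rng = np.random.default_rng(0)

def energy(x):
    return 0.5 * x ** 2

s = 1.5  # proposal standard deviation (an arbitrary choice for this demo)
x = rng.normal(0.0, s, size=200_000)
log_q = -0.5 * (x / s) ** 2 - 0.5 * np.log(2 * np.pi * s ** 2)
log_w = -energy(x) - log_q  # log importance weights: log(p*(x) / q(x))

# Jensen:         E_q[log w]              <= log E_q[w] = log Z
# Cauchy-Schwarz: (1/2) log E_q[w^2]      >= log Z
lower = log_w.mean()
upper = 0.5 * np.log(np.mean(np.exp(2.0 * log_w)))
log_Z = 0.5 * np.log(2.0 * np.pi)

print(f"lower={lower:.3f}  log Z={log_Z:.3f}  upper={upper:.3f}")
```

Tightening such a sandwich from both sides is what makes the bidirectional scheme attractive: the gap between the two bounds gives a direct, computable handle on how far the training objective can be from the true log-likelihood.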



