Bounds all around: training energy-based models with bidirectional bounds

11/01/2021
by   Cong Geng, et al.
Energy-based models (EBMs) provide an elegant framework for density estimation, but they are notoriously difficult to train. Recent work has established links to generative adversarial networks, where the EBM is trained through a minimax game with a variational value function. We propose a bidirectional bound on the EBM log-likelihood, such that we maximize a lower bound and minimize an upper bound when solving the minimax game. We further link one bound to a gradient penalty that stabilizes training, thereby providing a theoretical grounding for this best engineering practice. To evaluate the bounds, we develop a new and efficient estimator of the Jacobian determinant of the EBM generator. We demonstrate that these developments significantly stabilize training and yield high-quality density estimation and sample generation.
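As context for the Jacobian-determinant term above, here is a minimal sketch of how that determinant enters density evaluation via the change-of-variables formula, log p_g(x) = log p_z(z) - log|det J_g(z)|. The toy two-dimensional generator `g` is a hypothetical stand-in for the EBM's generator network, and the brute-force finite-difference Jacobian is for illustration only; it is not the paper's efficient estimator.

```python
import numpy as np

def g(z):
    """Toy generator g: R^2 -> R^2 (hypothetical stand-in for the network)."""
    return np.array([2.0 * z[0] + 0.5 * np.tanh(z[1]),
                     0.5 * z[1]])

def jacobian_fd(f, z, eps=1e-6):
    """Central finite-difference Jacobian of f at z (illustration only)."""
    z = np.asarray(z, dtype=float)
    n = z.size
    J = np.zeros((n, n))
    for j in range(n):
        dz = np.zeros(n)
        dz[j] = eps
        J[:, j] = (f(z + dz) - f(z - dz)) / (2.0 * eps)
    return J

z0 = np.array([0.3, -0.7])
J = jacobian_fd(g, z0)
sign, logdet = np.linalg.slogdet(J)  # log|det J_g(z0)|

# Change of variables under a standard-normal latent p_z:
log_pz = -0.5 * z0 @ z0 - 0.5 * z0.size * np.log(2.0 * np.pi)
log_px = log_pz - logdet  # log-density of x = g(z0) under the generator
```

For this `g` the Jacobian is upper-triangular with diagonal (2, 0.5), so the determinant is exactly 1 regardless of `z0`; a real estimator must handle networks where no such closed form exists, which is what motivates the paper's efficient construction.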

Related research

11/02/2018 · Minimax Estimation of Neural Net Distance
An important class of distance metrics proposed for training generative ...

06/19/2022 · Bounding Evidence and Estimating Log-Likelihood in VAE
Many crucial problems in deep learning and statistics are caused by a va...

03/10/2020 · KALE: When Energy-Based Learning Meets Adversarial Training
Legendre duality provides a variational lower-bound for the Kullback-Lei...

07/19/2023 · Adversarial Likelihood Estimation with One-way Flows
Generative Adversarial Networks (GANs) can produce high-quality samples,...

06/10/2021 · GBHT: Gradient Boosting Histogram Transform for Density Estimation
In this paper, we propose a density estimation algorithm called Gradient...

03/15/2012 · Inference-less Density Estimation using Copula Bayesian Networks
We consider learning continuous probabilistic graphical models in the fa...