Flow Contrastive Estimation of Energy-Based Models

by Ruiqi Gao, et al.

This paper studies a training method that jointly estimates an energy-based model and a flow-based model, in which the two models are iteratively updated based on a shared adversarial value function. This joint training method has the following traits. (1) The update of the energy-based model is based on noise contrastive estimation, with the flow model serving as a strong noise distribution. (2) The update of the flow model approximately minimizes the Jensen-Shannon divergence between the flow model and the data distribution. (3) Unlike generative adversarial networks (GANs), which estimate an implicit probability distribution defined by a generator model, our method estimates two explicit probability distributions on the data. Using the proposed method, we demonstrate a significant improvement in the synthesis quality of the flow model, and show the effectiveness of unsupervised feature learning with the learned energy-based model. Furthermore, the proposed training method can be easily adapted to semi-supervised learning, where we achieve results competitive with state-of-the-art semi-supervised methods.
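The mechanics described in the abstract can be sketched in a toy 1-D setting: the energy-based model is updated by noise contrastive estimation against samples from the flow, while the flow is updated adversarially on the same value function. The quadratic EBM, the affine "flow", and all parameter choices below are illustrative assumptions standing in for the deep networks of the paper, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data from N(2, 1); both models should drift toward this density.
data = rng.normal(2.0, 1.0, size=1000)

def log_p_ebm(x, th):
    # Explicit (unnormalized) EBM: log p(x) = -exp(la) * (x - b)^2 + c,
    # where c plays the role of a learned -log Z, as is usual in NCE.
    la, b, c = th
    return -np.exp(la) * (x - b) ** 2 + c

def log_q_flow(x, ph):
    # Affine "flow" of N(0,1): x = mu + sigma * z, so q has an exact
    # Gaussian log-density by the change-of-variables formula.
    mu, ls = ph
    return -0.5 * ((x - mu) / np.exp(ls)) ** 2 - ls - 0.5 * np.log(2 * np.pi)

def value(th, ph, x_data, x_noise):
    # Shared adversarial value function: logistic classification of data
    # vs. flow samples, with posterior p/(p+q) built from the two
    # explicit densities (the NCE objective).
    d_data = log_p_ebm(x_data, th) - log_q_flow(x_data, ph)
    d_noise = log_p_ebm(x_noise, th) - log_q_flow(x_noise, ph)
    return (-np.logaddexp(0.0, -d_data).mean()    # E_data[log sigmoid(d)]
            - np.logaddexp(0.0, d_noise).mean())  # E_q[log sigmoid(-d)]

def num_grad(f, p, eps=1e-4):
    # Central finite differences: fine for a five-parameter demo.
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p); e[i] = eps
        g[i] = (f(p + e) - f(p - e)) / (2 * eps)
    return g

th = np.array([np.log(0.1), 0.0, 0.0])  # EBM: log-scale, center, log-normalizer
ph = np.array([0.0, 0.0])               # flow: mu, log sigma

for step in range(500):
    z = rng.normal(size=500)
    x_noise = ph[0] + np.exp(ph[1]) * z
    # EBM ascends the value function (NCE with the current flow as noise).
    th = th + 0.1 * num_grad(lambda t: value(t, ph, data, x_noise), th)
    # Flow descends it, with gradients passing through its own samples
    # (reparameterization), approximately minimizing JSD to the data.
    ph = ph - 0.1 * num_grad(
        lambda p: value(th, p, data, p[0] + np.exp(p[1]) * z), ph)
```

After training, the flow parameters (mu, log sigma) should sit near the data's (2, 0), and the classifier posterior approaches 1/2, the equilibrium where both explicit densities match the data.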


