Flow Contrastive Estimation of Energy-Based Models

12/02/2019
by   Ruiqi Gao, et al.
Google

This paper studies a training method to jointly estimate an energy-based model and a flow-based model, in which the two models are iteratively updated based on a shared adversarial value function. This joint training method has the following traits. (1) The update of the energy-based model is based on noise contrastive estimation, with the flow model serving as a strong noise distribution. (2) The update of the flow model approximately minimizes the Jensen-Shannon divergence between the flow model and the data distribution. (3) Unlike generative adversarial networks (GANs), which estimate an implicit probability distribution defined by a generator model, our method estimates two explicit probability distributions of the data. Using the proposed method, we demonstrate a significant improvement in the synthesis quality of the flow model and show the effectiveness of unsupervised feature learning by the learned energy-based model. Furthermore, the proposed training method can be easily adapted to semi-supervised learning. We achieve results competitive with state-of-the-art semi-supervised learning methods.
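To make the shared adversarial value function concrete, below is a minimal sketch of one joint update, not the authors' released implementation. It assumes a PyTorch-style `ebm` module returning an unnormalized log-density f_theta(x) (with the log normalizing constant absorbed into its output) and a normalizing `flow` exposing differentiable, reparameterized `sample` and `log_prob` methods; all names are placeholders. The energy-based model is trained as a noise contrastive classifier between data and flow samples, and the flow is pushed in the opposite direction of the same objective.

```python
# Minimal sketch of one flow contrastive estimation (FCE) update (assumed API).
# `ebm(x)` returns the unnormalized log-density f_theta(x); `flow` is any
# normalizing flow with differentiable sample() and log_prob().
import torch
import torch.nn.functional as F

def fce_step(ebm, flow, ebm_opt, flow_opt, x_data):
    n = x_data.size(0)

    # (1) EBM update: noise contrastive estimation with the flow as the noise
    # distribution. The classifier logit is log p_theta(x) - log q_alpha(x).
    x_noise = flow.sample(n).detach()
    logit_data = ebm(x_data) - flow.log_prob(x_data).detach()
    logit_noise = ebm(x_noise) - flow.log_prob(x_noise).detach()
    ebm_loss = (F.binary_cross_entropy_with_logits(logit_data, torch.ones_like(logit_data))
                + F.binary_cross_entropy_with_logits(logit_noise, torch.zeros_like(logit_noise)))
    ebm_opt.zero_grad()
    ebm_loss.backward()
    ebm_opt.step()

    # (2) Flow update: push q_alpha against the same value function (minimax).
    # With a near-optimal EBM classifier, this approximately minimizes the
    # Jensen-Shannon divergence between the flow and the data distribution.
    x_noise = flow.sample(n)                       # gradient flows through the samples
    logit_data = ebm(x_data) - flow.log_prob(x_data)
    logit_noise = ebm(x_noise) - flow.log_prob(x_noise)
    flow_loss = -(F.binary_cross_entropy_with_logits(logit_data, torch.ones_like(logit_data))
                  + F.binary_cross_entropy_with_logits(logit_noise, torch.zeros_like(logit_noise)))
    flow_opt.zero_grad()
    flow_loss.backward()                           # only flow parameters are stepped here;
    flow_opt.step()                                # stray EBM grads are cleared next call
    return ebm_loss.item(), flow_loss.item()
```

At the end of such training, both p_theta (the energy-based model) and q_alpha (the flow) provide explicit densities of the data, matching trait (3) above.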

Related Research

05/13/2022 · A Tale of Two Flows: Cooperative Learning of Langevin Flow and Normalizing Flow Toward Energy-Based Model
This paper studies the cooperative learning of two generative flow model...

01/23/2023 · Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation
Self-supervised learning is an increasingly popular approach to unsuperv...

11/03/2022 · Self-Adapting Noise-Contrastive Estimation for Energy-Based Models
Training energy-based models (EBMs) with noise-contrastive estimation (N...

12/08/2020 · Deep Energy-Based NARX Models
This paper is directed towards the problem of learning nonlinear ARX mod...

12/28/2018 · Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model
This paper proposes the divergence triangle as a framework for joint tra...

06/02/2023 · On Feature Diversity in Energy-based Models
Energy-based learning is a powerful learning paradigm that encapsulates ...

01/31/2023 · Generating High Fidelity Synthetic Data via Coreset selection and Entropic Regularization
Generative models have the ability to synthesize data points drawn from ...

Code Repositories

icebeem: Code for ICE-BeeM paper - NeurIPS 2020
