Learning Deep Energy Models: Contrastive Divergence vs. Amortized MLE

07/04/2017
by Qiang Liu, et al.

We propose a number of new algorithms for learning deep energy models and demonstrate their properties. We show that our SteinCD performs well in terms of test likelihood, while SteinGAN performs well in terms of generating realistic-looking images. Our results suggest promising directions for learning better models by combining GAN-style methods with traditional energy-based learning.
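For context on the two training paradigms the abstract contrasts, here is a minimal PyTorch-style sketch. The `energy` and `generator` networks and all hyperparameters are hypothetical placeholders; this illustrates generic contrastive divergence and amortized MLE, not the paper's SteinCD or SteinGAN algorithms (which replace the negative-sample updates with Stein variational gradient descent).

```python
import torch

# Sketch of the two paradigms, assuming an energy network
# energy(x) -> per-sample scalar and a sampler generator(z) -> x.
# Generic illustration only, not the paper's SteinCD/SteinGAN.

def cd_update(energy, x_data, k=1, step=0.01, noise=0.005):
    """Contrastive divergence: start the chain at the data, run k
    Langevin steps, then push data energy down and sample energy up."""
    x = x_data.clone().detach()
    for _ in range(k):
        x.requires_grad_(True)
        g, = torch.autograd.grad(energy(x).sum(), x)
        x = (x - step * g + noise * torch.randn_like(x)).detach()
    # Approximate negative log-likelihood gradient signal.
    return energy(x_data).mean() - energy(x).mean()

def amortized_mle_update(energy, generator, x_data, z):
    """Amortized MLE: a neural sampler replaces the MCMC chain. The
    energy model treats generator samples as negatives, while the
    generator is trained to produce low-energy samples. (A real
    implementation also needs a regularizer, e.g. an entropy or
    Stein term, to keep the sampler from collapsing to a mode.)"""
    x_fake = generator(z)
    energy_loss = energy(x_data).mean() - energy(x_fake.detach()).mean()
    sampler_loss = energy(x_fake).mean()
    return energy_loss, sampler_loss
```

In both cases the energy model's loss has the same contrastive form; what differs is where the negative samples come from, which is exactly the design axis the paper explores.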

Related research

04/29/2022
Statistical applications of contrastive learning
The likelihood function plays a crucial role in statistical inference an...

09/26/2017
Learning Multi-grid Generative ConvNets by Minimal Contrastive Divergence
This paper proposes a minimal contrastive divergence method for learning...

02/26/2023
Contrast-PLC: Contrastive Learning for Packet Loss Concealment
Packet loss concealment (PLC) is challenging in concealing missing conte...

03/21/2019
Towards Characterizing Divergence in Deep Q-Learning
Deep Q-Learning (DQL), a family of temporal difference algorithms for co...

03/24/2022
Bi-level Doubly Variational Learning for Energy-based Latent Variable Models
Energy-based latent variable models (EBLVMs) are more expressive than co...

02/02/2023
Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons
Activity difference based learning algorithms, such as contrastive Hebbia...

11/02/2016
Learning Deep Embeddings with Histogram Loss
We suggest a loss for learning deep embeddings. The new loss does not in...
