Conditional Noise-Contrastive Estimation of Unnormalised Models

06/10/2018
by   Ciwan Ceylan, et al.

Many parametric statistical models are not properly normalised and are only specified up to an intractable partition function, which renders parameter estimation difficult. Examples of unnormalised models include Gibbs distributions, Markov random fields, and neural network models in unsupervised deep learning. In previous work, noise-contrastive estimation (NCE) was introduced as an estimation principle in which unnormalised models are estimated by learning to distinguish between data and auxiliary noise. An open question is how to best choose the auxiliary noise distribution. We here propose a new method that addresses this issue. The proposed method shares with NCE the idea of formulating density estimation as a supervised learning problem but, in contrast to NCE, it leverages the observed data when generating noise samples. The noise can thus be generated in a semi-automated manner. We first present the underlying theory of the new method, show that score matching emerges as a limiting case, validate the method on continuous- and discrete-valued synthetic data, and show that improved performance relative to NCE can be expected when the data lie on a lower-dimensional manifold. We then demonstrate the method's applicability to unsupervised deep learning by estimating a four-layer neural image model.
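To make the idea concrete, below is a minimal sketch of conditional noise-contrastive estimation with a symmetric additive-Gaussian conditional noise distribution, under which the conditional-noise terms cancel and the classifier's log-ratio reduces to the difference of unnormalised log-model values. The toy unnormalised model (an isotropic Gaussian with a free log-precision parameter), the grid-search optimisation, and all variable names are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of conditional NCE (CNCE) with symmetric Gaussian perturbation noise.
# The unnormalised model and the optimisation strategy are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def log_phi(u, theta):
    """Unnormalised log-model: log phi(u; theta) = -0.5 * exp(theta) * ||u||^2."""
    return -0.5 * np.exp(theta) * np.sum(u ** 2, axis=-1)

def cnce_loss(theta, x, noise_std=0.5, kappa=10, rng=rng):
    """For each data point, draw kappa noise samples conditioned on it and average
    the logistic loss on the log-ratio G = log phi(x) - log phi(y).
    Because the Gaussian perturbation is symmetric, the conditional-noise terms
    cancel and the intractable partition function never appears."""
    n, d = x.shape
    y = x[:, None, :] + noise_std * rng.standard_normal((n, kappa, d))
    g = log_phi(x, theta)[:, None] - log_phi(y, theta)   # shape (n, kappa)
    return np.mean(np.logaddexp(0.0, -g))                # log(1 + exp(-G))

# Toy data from a zero-mean Gaussian with precision 4, i.e. theta* = log 4 ~= 1.39.
x = rng.normal(scale=0.5, size=(2000, 2))

# Crude grid search to illustrate that the CNCE loss is minimised near the true
# parameter; a practical implementation would use gradient-based optimisation.
thetas = np.linspace(-1.0, 3.0, 41)
losses = [cnce_loss(t, x) for t in thetas]
print("estimated theta:", thetas[int(np.argmin(losses))])
```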


Related research

06/03/2022
MCD: Marginal Contrastive Discrimination for conditional density estimation
We consider the problem of conditional density estimation, which is a ma...

10/18/2018
Variational Noise-Contrastive Estimation
Unnormalised latent variable models are a broad and flexible class of st...

05/19/2018
Estimation of Non-Normalized Mixture Models and Clustering Using Deep Representation
We develop a general method for estimating a finite mixture of non-norma...

02/14/2012
Bregman divergence as general framework to estimate unnormalized statistical models
We show that the Bregman divergence provides a rich framework to estimat...

09/06/2018
Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency
Noise Contrastive Estimation (NCE) is a powerful parameter estimation me...

09/15/2022
Towards Healing the Blindness of Score Matching
Score-based divergences have been widely used in machine learning and st...

06/13/2023
Learning Unnormalized Statistical Models via Compositional Optimization
Learning unnormalized statistical models (e.g., energy-based models) is ...
