
Likelihood-free inference with an improved cross-entropy estimator

by   Markus Stoye, et al.
Universidad Técnica Federico Santa María
New York University
University of Liège

We extend recent work (Brehmer et al., 2018) that uses neural networks as surrogate models for likelihood-free inference. As in that work, we exploit the fact that the joint likelihood ratio and joint score, conditioned on both observed and latent variables, can often be extracted from an implicit generative model or simulator and used to augment the training data for these surrogate models. We show how this augmented training data yields a new cross-entropy estimator, which provides improved sample efficiency compared to previous loss functions that exploit the same augmented data.
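One way the joint likelihood ratio can augment classifier training is as a soft target: instead of hard labels y ∈ {0, 1}, the classifier output is regressed toward s*(x, z) = 1 / (1 + r(x, z)), which carries per-sample information from the simulator. The sketch below is illustrative only, assuming that soft-label form; the function name and interface are hypothetical, not taken from the paper's code.

```python
import numpy as np

def soft_label_cross_entropy(s_hat, joint_ratio):
    """Cross-entropy against soft labels derived from the joint likelihood ratio.

    s_hat       : classifier outputs in (0, 1), shape (n,)
    joint_ratio : joint likelihood ratio r(x, z) for each sample, shape (n,)

    Assumed soft-label form (illustrative): s*(x, z) = 1 / (1 + r(x, z)).
    """
    s_star = 1.0 / (1.0 + joint_ratio)
    eps = 1e-12  # numerical guard against log(0)
    return -np.mean(s_star * np.log(s_hat + eps)
                    + (1.0 - s_star) * np.log(1.0 - s_hat + eps))
```

Because the targets are continuous rather than binary, each simulated sample constrains the decision function more tightly, which is the intuition behind the improved sample efficiency claimed in the abstract.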



