Likelihood-free inference with an improved cross-entropy estimator

08/02/2018
by Markus Stoye, et al.

We extend recent work (Brehmer et al., 2018) that uses neural networks as surrogate models for likelihood-free inference. As in that work, we exploit the fact that the joint likelihood ratio and joint score, conditioned on both observed and latent variables, can often be extracted from an implicit generative model or simulator and used to augment the training data for these surrogate models. We show how this augmented training data can be used to define a new cross-entropy estimator, which yields improved sample efficiency compared to previous loss functions that exploit the same augmented data.
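The abstract does not spell out the estimator's exact form, but the underlying idea of augmented cross-entropy training can be sketched as follows: replace the hard class labels in the binary cross-entropy with soft labels derived from the simulator's joint likelihood ratio. The function name and the label convention below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def augmented_cross_entropy(s, r_joint):
    """Binary cross-entropy with soft labels derived from the joint
    likelihood ratio r_joint = p(x, z | theta_0) / p(x, z | theta_1).

    s       : classifier outputs in (0, 1), shape (n,)
    r_joint : joint likelihood ratios extracted from the simulator, shape (n,)

    Assumption (illustrative): the soft label is the posterior probability
    of the theta_1 hypothesis implied by the joint likelihood ratio,
    y_soft = 1 / (1 + r_joint), in place of a hard 0/1 label.
    """
    y_soft = 1.0 / (1.0 + r_joint)
    return -np.mean(y_soft * np.log(s) + (1.0 - y_soft) * np.log(1.0 - s))
```

For example, with `r_joint = 1` (both hypotheses equally likely) the soft label is 0.5, and a classifier output of 0.5 attains the loss `log 2`, the entropy of a fair coin. Because the soft labels carry per-sample information from the simulator rather than a single hard label, each simulated event constrains the classifier more, which is the intuition behind the improved sample efficiency.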


