Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

05/18/2018
by George Papamakarios, et al.

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust and more accurate, and requires less tuning, than related state-of-the-art methods that target the posterior, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
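
To make the procedure concrete, below is a minimal, self-contained sketch of an SNL-style training loop. This is not the authors' code: a simple linear-Gaussian conditional model stands in for the masked autoregressive flow, and the toy simulator, prior, round and sample counts, and Metropolis sampler are illustrative assumptions. Only the overall loop follows the method described above: simulate, retrain the likelihood model on all simulations so far, run MCMC on surrogate likelihood times prior, and propose the next round of parameters from the current approximate posterior.

```python
# Minimal sketch of an SNL-style loop (not the authors' code). A linear-Gaussian
# conditional model of p(x | theta) stands in for the autoregressive flow; the
# simulator, prior, and sampler settings below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # Toy simulator: two noisy observations of theta.
    return theta + rng.normal(scale=0.5, size=2)

def log_prior(theta):
    # Uniform prior on [-5, 5]; -inf outside the support.
    return 0.0 if -5.0 <= theta <= 5.0 else -np.inf

class GaussianLikelihoodModel:
    """Surrogate for q(x | theta): linear mean in theta, fit by least squares."""
    def fit(self, thetas, xs):
        A = np.column_stack([thetas, np.ones_like(thetas)])
        self.coef, *_ = np.linalg.lstsq(A, xs, rcond=None)
        self.var = (xs - A @ self.coef).var(axis=0) + 1e-6
    def log_prob(self, x, theta):
        mean = np.array([theta, 1.0]) @ self.coef
        return -0.5 * np.sum((x - mean) ** 2 / self.var + np.log(2 * np.pi * self.var))

def mcmc(log_post, n_samples, step=0.5, init=0.0):
    # Simple Metropolis sampler over the surrogate posterior.
    samples, theta, lp = [], init, log_post(init)
    for _ in range(n_samples):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

x_obs = np.array([1.2, 0.8])                       # observed data (illustrative)
thetas, xs = [], []
proposal = lambda n: rng.uniform(-5, 5, size=n)    # round 1: propose from the prior

for rnd in range(5):                               # sequential rounds
    new_thetas = proposal(200)
    thetas.extend(new_thetas)
    xs.extend(simulator(t) for t in new_thetas)
    model = GaussianLikelihoodModel()
    model.fit(np.array(thetas), np.array(xs))      # retrain on ALL simulations so far
    log_post = lambda t: model.log_prob(x_obs, t) + log_prior(t)
    posterior_samples = mcmc(log_post, 2000)[500:]
    # Next round: propose parameters from the current approximate posterior.
    proposal = lambda n: rng.choice(posterior_samples, size=n)

print("approximate posterior mean:", posterior_samples.mean())
```

Because each round's proposal concentrates in regions of high approximate posterior density, later simulations are spent where they are most informative, which is the source of the simulation savings claimed above.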

Related research

10/10/2022  Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models
05/17/2019  Automatic Posterior Transformation for Likelihood-Free Inference
02/12/2021  Sequential Neural Posterior and Likelihood Approximation
02/15/2021  Posterior-Aided Regularization for Likelihood-Free Inference
02/27/2019  Adaptive Gaussian Copula ABC
06/13/2022  Density Estimation with Autoregressive Bayesian Predictives
02/10/2020  On Contrastive Learning for Likelihood-free Inference
