Distilling importance sampling

10/08/2019
by   Dennis Prangle, et al.

The two main approaches to Bayesian inference are sampling and optimisation methods. However, many complicated posteriors are difficult to approximate by either. We therefore propose a novel approach combining features of both. We use a flexible parameterised family of densities, such as a normalising flow. Given a density from this family that approximates the posterior, we use importance sampling to produce a weighted sample from a more accurate posterior approximation. This sample is then used in optimisation to update the parameters of the approximate density, a process we refer to as "distilling" the importance sampling results. We illustrate our method in a queueing model example.
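The loop the abstract describes — sample from the current approximation, reweight with importance sampling, then refit the approximation to the weighted sample — can be sketched in a toy setting. The sketch below is an assumption-laden illustration, not the paper's method: it substitutes a single Gaussian for the normalising flow, uses a one-dimensional Gaussian stand-in for the posterior, and fits by gradient ascent on the importance-weighted log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalised "posterior": N(2, 0.5^2) up to a constant.
# (A stand-in target; the paper's examples use far harder posteriors.)
def log_p(x):
    return -0.5 * ((x - 2.0) / 0.5) ** 2

# Log-density of the approximating family q (here a Gaussian,
# standing in for a normalising flow).
def log_q(x, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((x - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

mu, log_sigma = 0.0, np.log(2.0)  # deliberately poor initial approximation
lr, n = 0.05, 500

for step in range(2000):
    sigma = np.exp(log_sigma)
    x = mu + sigma * rng.standard_normal(n)      # sample from current q
    log_w = log_p(x) - log_q(x, mu, log_sigma)   # log importance weights
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                 # self-normalise
    # "Distillation" step: ascend sum_i w_i * log q(x_i)
    # with respect to (mu, log_sigma), i.e. a weighted
    # maximum-likelihood refit of q to the weighted sample.
    grad_mu = np.sum(w * (x - mu) / sigma ** 2)
    grad_ls = np.sum(w * (((x - mu) / sigma) ** 2 - 1.0))
    mu += lr * grad_mu
    log_sigma += lr * grad_ls

print(mu, np.exp(log_sigma))  # converges to roughly 2.0 and 0.5
```

Because the weighted sample targets the posterior, this update approximately descends the inclusive KL divergence from the posterior to q; as q improves, the importance weights flatten and later iterations become more efficient.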


Related research

- AND/OR Importance Sampling (06/13/2012): The paper introduces AND/OR importance sampling for probabilistic graphi...
- Bayesian Update with Importance Sampling: Required Sample Size (09/22/2020): Importance sampling is used to approximate Bayes' rule in many computati...
- FIS-GAN: GAN with Flow-based Importance Sampling (10/06/2019): Generative Adversarial Networks (GAN) training process, in most cases, a...
- Importance Sampling Methods for Bayesian Inference with Partitioned Data (10/12/2022): This article presents new methodology for sample-based Bayesian inferenc...
- Approximate Knowledge Compilation by Online Collapsed Importance Sampling (05/31/2018): We introduce collapsed compilation, a novel approximate inference algori...
- Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition (05/12/2023): Combining predictions from different models is a central problem in Baye...
- Importance Sampling with the Integrated Nested Laplace Approximation (03/03/2021): The Integrated Nested Laplace Approximation (INLA) is a deterministic ap...
