Approximate Inference with Amortised MCMC

02/27/2017
by Yingzhen Li et al.

We propose a novel approximate inference algorithm that approximates a target distribution by amortising the dynamics of a user-selected MCMC sampler. The idea is to initialise MCMC using samples from an approximation network, apply the MCMC operator to improve these samples, and finally use the improved samples to update the approximation network, thereby improving its quality. This provides a new generic framework for approximate inference, allowing us to deploy highly complex or implicitly defined approximation families with intractable densities, including approximations produced by warping a source of randomness through a deep neural network. Experiments consider image modelling with deep generative models as a challenging test for the method. Deep models trained using amortised MCMC are shown to generate realistic-looking samples and to produce diverse imputations for images with regions of missing pixels.
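The loop described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' reference implementation: it assumes PyTorch, uses a toy Gaussian target and unadjusted Langevin dynamics as the user-selected MCMC operator, and updates the approximation network by regressing its outputs toward the MCMC-improved samples. The names log_target, langevin_step, and the squared-error update rule are illustrative placeholders; the paper's actual update objective may differ.

```python
# Hypothetical sketch of an amortised MCMC training loop (not the paper's code).
import torch
import torch.nn as nn

def log_target(x):
    # Placeholder target log-density: a standard Gaussian.
    # In practice this would be the model's (unnormalised) log-density.
    return -0.5 * (x ** 2).sum(dim=1)

def langevin_step(x, step_size=1e-2):
    """One unadjusted Langevin move; any MCMC operator that leaves the target invariant could be used."""
    x = x.detach().requires_grad_(True)
    grad = torch.autograd.grad(log_target(x).sum(), x)[0]
    return (x + 0.5 * step_size * grad
              + torch.randn_like(x) * step_size ** 0.5).detach()

# Approximation network: warps Gaussian noise into samples (implicit, intractable density).
dim, noise_dim = 2, 8
sampler = nn.Sequential(nn.Linear(noise_dim, 64), nn.ReLU(), nn.Linear(64, dim))
opt = torch.optim.Adam(sampler.parameters(), lr=1e-3)

for it in range(1000):
    z = torch.randn(128, noise_dim)
    x0 = sampler(z)                  # initialise MCMC from the approximation network
    xT = x0.detach()
    for _ in range(5):               # improve the samples with a few MCMC steps
        xT = langevin_step(xT)
    # Illustrative update rule: pull the network's output toward the improved samples.
    loss = ((sampler(z) - xT) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Under these assumptions, the network gradually absorbs the effect of the MCMC steps, so that fresh samples drawn from it start closer to the target and need fewer (or no) correction steps at test time.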

Related research

- Parallelizing MCMC via Weierstrass Sampler (12/17/2013): With the rapidly growing scales of statistical problems, subset based co...
- Trust-Region Variational Inference with Gaussian Mixture Models (07/10/2019): Many methods for machine learning rely on approximate inference from int...
- The premise of approximate MCMC in Bayesian deep learning (08/24/2022): This paper identifies several characteristics of approximate MCMC in Bay...
- Deep Involutive Generative Models for Neural MCMC (06/26/2020): We introduce deep involutive generative models, a new architecture for d...
- De-randomizing MCMC dynamics with the diffusion Stein operator (10/07/2021): Approximate Bayesian inference estimates descriptors of an intractable t...
- Learning to Draw Samples: With Application to Amortized MLE for Generative Adversarial Learning (11/06/2016): We propose a simple algorithm to train stochastic neural networks to dra...
- Embarrassingly parallel MCMC using deep invertible transformations (03/11/2019): While MCMC methods have become a main work-horse for Bayesian inference,...
