The premise of approximate MCMC in Bayesian deep learning

08/24/2022
by Theodore Papamarkou et al.

This paper identifies several characteristics of approximate MCMC in Bayesian deep learning and proposes an approximate sampling algorithm for neural networks. By analogy to sampling data minibatches from big datasets, the algorithm samples parameter subgroups from high-dimensional neural network parameter spaces. While the advantages of minibatch MCMC have been discussed in the literature, blocked Gibbs sampling has received less research attention in Bayesian deep learning.
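The abstract does not spell out the sampler, so the sketch below is only a minimal Metropolis-within-Gibbs illustration of the parameter-subgroup idea: the flat parameter vector of a toy one-hidden-layer regression network is partitioned into blocks, and each block is updated in turn with a random-walk proposal. The dataset, network size, block count, and step sizes are all hypothetical choices for illustration, not the paper's actual algorithm or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption, not from the paper).
X = rng.normal(size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=50)

# A tiny one-hidden-layer network with all parameters in one flat vector.
H = 8
n_params = 1 * H + H + H + 1  # W1, b1, W2, b2

def predict(theta, X):
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H]
    return (np.tanh(X @ W1 + b1) @ W2)[:, 0] + b2

def log_post(theta):
    # Gaussian likelihood (noise scale 0.1) plus a standard normal prior.
    resid = y - predict(theta, X)
    return -0.5 * np.sum(resid**2) / 0.1**2 - 0.5 * np.sum(theta**2)

# Partition the parameter indices into blocks: the "parameter subgroups"
# that play the role data minibatches play for big datasets.
blocks = np.array_split(rng.permutation(n_params), 4)

theta = 0.1 * rng.normal(size=n_params)
samples = []
for it in range(5000):
    for block in blocks:
        # Propose a random-walk move on the current block only.
        prop = theta.copy()
        prop[block] += 0.02 * rng.normal(size=block.size)
        # Metropolis accept/reject with the full log posterior.
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
    samples.append(theta.copy())
```

Updating one low-dimensional block at a time keeps random-walk acceptance rates workable in a parameter space where a joint proposal over all weights would be rejected almost surely; the fixed random partition above is the simplest blocking scheme, and other choices (e.g. per-layer blocks) fit the same loop.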


Related research

- Bayesian neural networks via MCMC: a Python-based tutorial (04/02/2023)
  Bayesian inference provides a methodology for parameter estimation and u...
- Bayesian Changepoint Analysis (06/18/2020)
  In my PhD thesis, we elaborate upon Bayesian changepoint analysis, where...
- Finding our Way in the Dark: Approximate MCMC for Approximate Bayesian Methods (05/16/2019)
  With larger data at their disposal, scientists are emboldened to tackle ...
- The OS* Algorithm: a Joint Approach to Exact Optimization and Sampling (07/03/2012)
  Most current sampling algorithms for high-dimensional distributions are ...
- Approximate Inference with Amortised MCMC (02/27/2017)
  We propose a novel approximate inference algorithm that approximates a t...
- Self-Adversarially Learned Bayesian Sampling (11/21/2018)
  Scalable Bayesian sampling is playing an important role in modern machin...
- Adaptive MCMC for synthetic likelihoods and correlated synthetic likelihoods (04/09/2020)
  Approximate Bayesian computation (ABC) and synthetic likelihood (SL) are...
