No Free Lunch for Approximate MCMC

10/23/2020
by James E. Johndrow, et al.

It is widely known that the performance of Markov chain Monte Carlo (MCMC) can degrade quickly when targeting computationally expensive posterior distributions, such as when the sample size is large. This has motivated the search for MCMC variants that scale well to large datasets. One general approach has been to look at only a subsample of the data at every step. In this note, we point out that well-known MCMC convergence results often imply that these "subsampling" MCMC algorithms cannot greatly improve performance. We apply these generic results to realistic statistical problems and proposed algorithms, and also discuss some design principles suggested by the results.
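To make the "subsampling" idea concrete, the following is a minimal sketch (in Python, not taken from the paper) of a Metropolis-Hastings step whose accept/reject decision uses the log-likelihood of a random minibatch, rescaled to estimate the full-data sum. The function names, the Gaussian toy model, and the minibatch size are illustrative assumptions, not the specific algorithms analyzed by the authors.

```python
import numpy as np

def subsample_mh_step(theta, data, m, log_prior, log_lik, step_size, rng):
    """One Metropolis-Hastings step using a size-m random minibatch to
    approximate the log-likelihood (a generic subsampling scheme, shown
    only for illustration)."""
    # Draw a minibatch and rescale so its log-likelihood estimates the full sum.
    idx = rng.choice(len(data), size=m, replace=False)
    scale = len(data) / m

    def approx_log_post(t):
        return log_prior(t) + scale * log_lik(t, data[idx]).sum()

    # Symmetric random-walk proposal.
    proposal = theta + step_size * rng.standard_normal(theta.shape)

    # Accept or reject using the subsampled (hence noisy) log-posterior.
    log_alpha = approx_log_post(proposal) - approx_log_post(theta)
    if np.log(rng.uniform()) < log_alpha:
        return proposal
    return theta


# Toy usage: posterior over the mean of a normal with known unit variance.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=10_000)
log_prior = lambda t: -0.5 * np.sum(t**2)        # N(0, 1) prior
log_lik = lambda t, x: -0.5 * (x - t[0])**2      # N(theta, 1) likelihood terms

theta = np.zeros(1)
for _ in range(1_000):
    theta = subsample_mh_step(theta, data, m=100,
                              log_prior=log_prior, log_lik=log_lik,
                              step_size=0.05, rng=rng)
print(theta)
```

The rescaled minibatch sum is an unbiased estimate of the full-data log-likelihood, but plugging a noisy estimate into the acceptance ratio means the chain generally no longer targets the exact posterior; this accuracy-versus-cost trade-off is the kind of issue the paper's convergence results speak to.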


