Learned harmonic mean estimation of the marginal likelihood with normalizing flows

by Alicja Polanska, et al.

Computing the marginal likelihood (also called the Bayesian model evidence) is an important task in Bayesian model selection, providing a principled, quantitative way to compare models. The learned harmonic mean estimator solves the exploding-variance problem of the original harmonic mean estimator of the marginal likelihood by learning an importance sampling target distribution that approximates the optimal one. While the approximation need not be highly accurate, it is critical that the probability mass of the learned distribution is contained within the posterior in order to avoid the exploding-variance problem. Previous work introduced a bespoke optimization problem when training models in order to ensure this property is satisfied. In the current article we introduce the use of normalizing flows to represent the importance sampling target distribution. A flow-based model is trained on samples from the posterior by maximum likelihood estimation. The probability density of the flow is then concentrated by lowering the variance of the base distribution, i.e. by lowering its "temperature", ensuring its probability mass is contained within the posterior. This approach avoids the need for a bespoke optimization problem and careful fine-tuning of parameters, resulting in a more robust method. Moreover, the use of normalizing flows has the potential to scale to high-dimensional settings. We present preliminary experiments demonstrating the effectiveness of flows for the learned harmonic mean estimator. The publicly available harmonic code, which implements the learned harmonic mean estimator, has been updated to support normalizing flows.
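The idea in the abstract can be sketched on a toy problem. The following is a minimal illustration, not the article's implementation: a 1D conjugate-Gaussian model with a known evidence, where the "flow" fitted by maximum likelihood reduces to an affine map of a Gaussian base (i.e. fitting a sample mean and standard deviation), and lowering the base temperature T < 1 concentrates the learned target density inside the posterior. All variable names and values are hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def npdf(x, mean, std):
    """Normal probability density, evaluated elementwise."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

# Toy conjugate-Gaussian model with an analytic evidence (illustrative
# values, not taken from the article).
mu_pi, sig_pi = 0.0, 1.0   # prior: x ~ N(0, 1)
y, sig_l = 0.5, 0.2        # one datum, likelihood L(x) = N(y; x, sig_l)

# Analytic posterior and evidence for this conjugate model.
sig_p = (1.0 / sig_pi**2 + 1.0 / sig_l**2) ** -0.5
mu_p = sig_p**2 * (y / sig_l**2 + mu_pi / sig_pi**2)
z_true = npdf(y, mu_pi, np.sqrt(sig_pi**2 + sig_l**2))

# Posterior samples (drawn exactly here; in practice from MCMC).
xs = rng.normal(mu_p, sig_p, size=100_000)

# Stand-in for a flow trained by maximum likelihood: an affine flow of a
# Gaussian base reduces to fitting the sample mean and standard deviation.
mu_hat, sig_hat = xs.mean(), xs.std()

# Lower the base "temperature" T < 1 so the learned density phi is
# concentrated within the posterior, keeping the estimator variance finite.
T = 0.9
phi = npdf(xs, mu_hat, T * sig_hat)  # normalized importance target density

# Harmonic mean estimator: the posterior average of phi / (likelihood * prior)
# estimates 1/z, so its reciprocal estimates the evidence z.
rho = np.mean(phi / (npdf(y, xs, sig_l) * npdf(xs, mu_pi, sig_pi)))
z_est = 1.0 / rho

print(f"analytic evidence: {z_true:.4f}, learned harmonic mean: {z_est:.4f}")
```

With T = 1 the target would coincide with the fitted posterior approximation and its tails could leak outside the posterior; shrinking the base scale is the same temperature trick the abstract describes, applied to the simplest possible flow.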

