Computational Solutions for Bayesian Inference in Mixture Models

12/18/2018
by Gilles Celeux et al.

This chapter surveys the most standard Monte Carlo methods available for simulating from a posterior distribution associated with a mixture model, and conducts some experiments on the robustness of the Gibbs sampler in high-dimensional Gaussian settings. The chapter was prepared for the forthcoming Handbook of Mixture Analysis.
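As a rough illustration of the kind of Gibbs sampler the chapter studies, the sketch below runs a conjugate Gibbs sampler for a one-dimensional Gaussian mixture with a known common variance. Everything here (the function name gibbs_gaussian_mixture, the choice of priors, and all parameter values) is an illustrative assumption rather than code taken from the chapter, and the sketch deliberately ignores practical issues such as label switching.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_gaussian_mixture(x, K, n_iter=1000, sigma2=1.0, m0=0.0, s02=10.0, alpha=1.0):
    """Minimal Gibbs sampler for a K-component Gaussian mixture with known
    common variance sigma2 and conjugate priors (illustrative assumptions):
        weights ~ Dirichlet(alpha, ..., alpha)
        mu_k    ~ Normal(m0, s02)
    Returns the chains of mixture weights and component means."""
    n = len(x)
    # initialise parameters by drawing from the priors
    w = rng.dirichlet(alpha * np.ones(K))
    mu = rng.normal(m0, np.sqrt(s02), size=K)
    w_chain, mu_chain = [], []
    for _ in range(n_iter):
        # 1) sample allocations z_i given (w, mu): categorical with probabilities
        #    proportional to w_k * N(x_i | mu_k, sigma2) (log scale, up to a constant)
        log_p = np.log(w) - 0.5 * (x[:, None] - mu[None, :]) ** 2 / sigma2
        p = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=p_i) for p_i in p])
        # 2) sample weights given allocations: Dirichlet(alpha + counts)
        counts = np.bincount(z, minlength=K)
        w = rng.dirichlet(alpha + counts)
        # 3) sample each mean given its allocated observations (conjugate normal update)
        for k in range(K):
            xk = x[z == k]
            var_k = 1.0 / (1.0 / s02 + len(xk) / sigma2)
            mean_k = var_k * (m0 / s02 + xk.sum() / sigma2)
            mu[k] = rng.normal(mean_k, np.sqrt(var_k))
        w_chain.append(w.copy())
        mu_chain.append(mu.copy())
    return np.array(w_chain), np.array(mu_chain)

# toy usage: two well-separated components
x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.0, 150)])
w_chain, mu_chain = gibbs_gaussian_mixture(x, K=2, n_iter=500)
print(mu_chain[-100:].mean(axis=0))  # posterior estimates of the component locations
```

With well-separated components the late-iteration averages of mu typically recover the two component locations, though label switching can permute them across runs; handling such identifiability issues is part of what the surveyed methods address.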

Related research

11/24/2018
Amortized Bayesian inference for clustering models
We develop methods for efficient amortized approximate Bayesian inferenc...

04/23/2020
Machine Learning Econometrics: Bayesian algorithms and methods
As the amount of economic and other data generated worldwide increases v...

10/01/2013
Summary Statistics for Partitionings and Feature Allocations
Infinite mixture models are commonly used for clustering. One can sample...

06/13/2020
Faster MCMC for Gaussian Latent Position Network Models
Latent position network models are a versatile tool in network science; ...

11/29/2017
Mixture Models in Astronomy
Mixture models combine multiple components into a single probability den...

05/10/2023
CosmoPower-JAX: high-dimensional Bayesian inference with differentiable cosmological emulators
We present CosmoPower-JAX, a JAX-based implementation of the CosmoPower ...

08/21/2017
Neural Block Sampling
Efficient Monte Carlo inference often requires manual construction of mo...
