Machine Learning and the Future of Bayesian Computation

04/21/2023
by Steven Winter et al.

Bayesian models are a powerful tool for studying complex data, allowing the analyst to encode rich hierarchical dependencies and leverage prior information. Most importantly, they facilitate a complete characterization of uncertainty through the posterior distribution. Practical posterior computation is commonly performed via MCMC, which can be computationally infeasible for high-dimensional models with many observations. In this article we discuss the potential to improve posterior computation using ideas from machine learning. Concrete future directions are explored in vignettes on normalizing flows, Bayesian coresets, distributed Bayesian inference, and variational inference.
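As a rough illustration of the kind of posterior approximation the variational-inference vignette concerns (this sketch is not taken from the article), the snippet below fits a Gaussian variational family to a conjugate toy model by stochastic gradient ascent on the ELBO with reparameterization gradients. The model, step sizes, and all variable names are assumptions chosen for the example; the conjugacy is only there so the fit can be checked against the exact posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(theta, 1) with prior theta ~ N(0, 1).
# The exact posterior is N(sum(y) / (n + 1), 1 / (n + 1)), so we can check the fit.
y = rng.normal(loc=2.0, scale=1.0, size=50)
n = len(y)

# Variational family q(theta) = N(m, s^2), parameterized by m and log s.
m, log_s = 0.0, 0.0
lr, n_mc = 0.01, 32  # step size and Monte Carlo samples per iteration

for _ in range(2000):
    s = np.exp(log_s)
    eps = rng.standard_normal(n_mc)
    theta = m + s * eps                         # reparameterization trick
    # d/dtheta [log p(y | theta) + log p(theta)] at each sampled theta
    score = (y.sum() - n * theta) - theta
    grad_m = score.mean()
    grad_log_s = (score * s * eps).mean() + 1.0  # +1 from the entropy of q
    m += lr * grad_m
    log_s += lr * grad_log_s

print("VI approximation: mean %.3f  sd %.3f" % (m, np.exp(log_s)))
print("Exact posterior:  mean %.3f  sd %.3f" % (y.sum() / (n + 1), (1 / (n + 1)) ** 0.5))
```

In practice the same recipe scales to models where MCMC struggles by swapping the toy likelihood for a minibatched one and the Gaussian family for a richer one (for example, a normalizing flow).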
