Distributed Bayesian Learning with Stochastic Natural-gradient Expectation Propagation and the Posterior Server

12/31/2015
by Leonard Hasenclever, et al.

This paper makes two contributions to Bayesian machine learning algorithms. First, we propose stochastic natural-gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black-box variational algorithm: it requires no simplifying assumptions on the distribution of interest beyond the existence of some Monte Carlo sampler for estimating the moments of the EP tilted distributions. Further, unlike EP, which has no convergence guarantee, SNEP can be shown to be convergent even when using Monte Carlo moment estimates.

Second, we propose a novel architecture for distributed Bayesian learning, which we call the posterior server. The posterior server allows scalable and robust Bayesian learning when a data set is stored in a distributed manner across a cluster, with each compute node holding a disjoint subset of the data. An independent Monte Carlo sampler runs on each compute node, with direct access only to the local data subset, but targets an approximation to the global posterior distribution given all data across the whole cluster. This is achieved through a distributed asynchronous implementation of SNEP that passes messages across the cluster. We demonstrate SNEP and the posterior server on distributed Bayesian learning of logistic regression and neural networks.

Keywords: Distributed Learning, Large Scale Learning, Deep Learning, Bayesian Learning, Variational Inference, Expectation Propagation, Stochastic Approximation, Natural Gradient, Markov chain Monte Carlo, Parameter Server, Posterior Server.
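To make the posterior-server idea concrete, below is a minimal, synchronous sketch of EP-style message passing on a Gaussian toy model. Each "worker" holds a data shard and a local site approximation in natural parameters; it forms the cavity (global approximation minus its own site), samples the tilted distribution, moment-matches, and sends an updated site back. This is only an illustrative simplification under assumed names (`shards`, `sites`): the paper's actual SNEP algorithm uses stochastic natural-gradient updates and asynchronous communication, and a real worker would run MCMC on the tilted distribution rather than sampling it in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x_i ~ N(theta_true, sigma2), split across 3 "workers".
theta_true, sigma2 = 2.0, 1.0
data = rng.normal(theta_true, np.sqrt(sigma2), size=300)
shards = np.array_split(data, 3)

# Prior N(0, 1) in natural parameters (precision, precision * mean).
prior = np.array([1.0, 0.0])
# One Gaussian "site" (local likelihood approximation) per worker.
sites = [np.zeros(2) for _ in shards]

def match_moments(samples):
    """Natural parameters of the Gaussian matching the sample moments."""
    m, v = samples.mean(), samples.var()
    return np.array([1.0 / v, m / v])

for sweep in range(50):
    for k, shard in enumerate(shards):
        # Cavity: global approximation with this worker's site removed.
        cavity = prior + sum(sites) - sites[k]
        # Tilted distribution = cavity * local likelihood. Exact here
        # because everything is Gaussian; in general this is where the
        # worker's Monte Carlo sampler runs on its local data.
        tilt_prec = cavity[0] + len(shard) / sigma2
        tilt_mean = (cavity[1] + shard.sum() / sigma2) / tilt_prec
        samples = rng.normal(tilt_mean, np.sqrt(1.0 / tilt_prec), 2000)
        # Moment-match the tilted samples, update the site with damping.
        new_site = match_moments(samples) - cavity
        sites[k] = 0.5 * sites[k] + 0.5 * new_site

# Server-side global approximation: prior plus all site messages.
post = prior + sum(sites)
post_prec, post_mean = post[0], post[1] / post[0]
```

Because the toy model is conjugate, the fixed point can be checked against the exact posterior (precision 1 + n/sigma2, mean pulled toward the data mean); the Monte Carlo moment estimates introduce only small noise, which the damped updates average out.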


Related research

05/31/2023  Learning to solve Bayesian inverse problems: An amortized variational inference approach
Inverse problems, i.e., estimating parameters of physical models from ex...

07/01/2020  Decentralized Stochastic Gradient Langevin Dynamics and Hamiltonian Monte Carlo
Stochastic gradient Langevin dynamics (SGLD) and stochastic gradient Ham...

05/27/2016  Merging MCMC Subposteriors through Gaussian-Process Approximations
Markov chain Monte Carlo (MCMC) algorithms have become powerful tools fo...

06/14/2015  Bayesian Dark Knowledge
We consider the problem of Bayesian parameter estimation for deep neural...

06/12/2015  Stochastic Expectation Propagation
Expectation propagation (EP) is a deterministic approximation algorithm ...

10/10/2019  Distributed Bayesian Computation for Model Choice
We propose a general method for distributed Bayesian model choice, where...

07/19/2021  Structured Stochastic Gradient MCMC
Stochastic gradient Markov chain Monte Carlo (SGMCMC) is considered the ...
