Bayesian Hierarchical Mixtures of Experts

10/19/2012
by Christopher M. Bishop, et al.

The Hierarchical Mixture of Experts (HME) is a well-known tree-based model for regression and classification, based on soft probabilistic splits. In its original formulation it was trained by maximum likelihood and is therefore prone to over-fitting. Furthermore, the maximum likelihood framework offers no natural metric for optimizing the complexity and structure of the tree. Previous attempts to provide a Bayesian treatment of the HME model have relied either on ad hoc local Gaussian approximations or have dealt with related models representing the joint distribution of both input and output variables. In this paper we describe a fully Bayesian treatment of the HME model based on variational inference. By combining local and global variational methods we obtain a rigorous lower bound on the marginal probability of the data under the model. This bound is optimized during the training phase, and its resulting value can be used for model order selection. We present results using this approach for a data set describing robot arm kinematics.
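To make the tree structure concrete, the following is a minimal sketch of the HME predictive mean for a depth-two tree: a root gating network softly splits the input space between two branches, each branch has its own gating network over two linear experts, and the prediction is the gate-weighted average of the expert outputs. All function and variable names (`hme_predict`, `gate_root`, etc.) are illustrative, and the linear-softmax parameterization is an assumption; the paper's Bayesian treatment places priors over these parameters rather than fitting them by maximum likelihood.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def hme_predict(x, gate_root, gates, experts):
    """Predictive mean of a depth-2 HME at input x.

    gate_root : (2, d) weights for the root gating network
    gates     : list of two (2, d) weight matrices, one per branch
    experts   : experts[i][j] is a (d,) weight vector for the
                linear expert at branch i, leaf j
    """
    g_root = softmax(gate_root @ x)        # P(branch i | x), shape (2,)
    mean = 0.0
    for i in range(2):
        g_child = softmax(gates[i] @ x)    # P(expert j | branch i, x)
        for j in range(2):
            # soft split: every expert contributes, weighted by the
            # product of gating probabilities along its path
            mean += g_root[i] * g_child[j] * (experts[i][j] @ x)
    return mean
```

Because the splits are probabilistic rather than hard, every expert contributes to every prediction, which is what makes the likelihood smooth in the gating parameters and amenable to the variational bounds described above.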

Related research

A Deterministic Global Optimization Method for Variational Inference (03/21/2017)
Variational inference methods for latent variable statistical models hav...

Hierarchical Routing Mixture of Experts (03/18/2019)
In regression tasks the distribution of the data is often too complex to...

Probabilistic PARAFAC2 (06/21/2018)
The PARAFAC2 is a multimodal factor analysis model suitable for analyzin...

Tighter Bounds on the Log Marginal Likelihood of Gaussian Process Regression Using Conjugate Gradients (02/16/2021)
We propose a lower bound on the log marginal likelihood of Gaussian proc...

Clarifying the Hubble constant tension with a Bayesian hierarchical model of the local distance ladder (03/01/2020)
Estimates of the Hubble constant, $H_0$, from the local distance ladder ...

A similarity-based Bayesian mixture-of-experts model (12/03/2020)
We present a new nonparametric mixture-of-experts model for multivariate...

Hierarchical Bayesian Bradley-Terry for Applications in Major League Baseball (12/16/2017)
A common problem faced in statistical inference is drawing conclusions f...
