
Reversible Jump MCMC Simulated Annealing for Neural Networks

by Christophe Andrieu et al.

We propose a novel reversible jump Markov chain Monte Carlo (MCMC) simulated annealing algorithm to optimize radial basis function (RBF) networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby surmounting the problem of local minima. We also show that by calibrating a Bayesian model, we can obtain the classical AIC, BIC and MDL model selection criteria within a penalized likelihood framework. Finally, we show theoretically and empirically that the algorithm converges to the modes of the full posterior distribution in an efficient way.
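The idea described above — a global search over both the RBF parameters and the number of basis functions, with an annealed acceptance rule and a size penalty linking the Bayesian posterior to AIC/BIC/MDL-style criteria — can be illustrated with a minimal sketch. This is not the paper's algorithm: the cooling schedule, the Poisson prior on model size, the Gaussian basis width, and the simplified acceptance ratio (the full reversible-jump ratio carries proposal and Jacobian terms) are all assumptions made for illustration.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

# Toy 1-D regression data; the paper's experiments use different targets.
x = np.linspace(-3.0, 3.0, 80)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(x.size)

def rbf_design(x, centers, width=0.5):
    """Gaussian RBF design matrix with a constant (bias) column."""
    Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * width ** 2))
    return np.column_stack([np.ones_like(x), Phi])

def log_target(centers, lam=2.0, sigma=0.1):
    """Joint log-posterior (up to a constant): Gaussian likelihood with
    least-squares weights, plus an assumed Poisson(lam) prior on the number
    of basis functions that penalizes model size (cf. AIC/BIC/MDL)."""
    k = centers.size
    Phi = rbf_design(x, centers)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    resid = y - Phi @ w
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 + k * np.log(lam) - lgamma(k + 1.0)

centers = rng.uniform(-3.0, 3.0, size=3)  # start with 3 basis functions
lp = log_target(centers)
for it in range(3000):
    T = max(0.01, 0.999 ** it)  # geometric cooling schedule (assumed)
    move = rng.choice(["birth", "death", "update"])
    if move == "birth":          # trans-dimensional move: add a center
        prop = np.append(centers, rng.uniform(-3.0, 3.0))
    elif move == "death" and centers.size > 1:  # remove a random center
        prop = np.delete(centers, rng.integers(centers.size))
    else:                        # fixed-dimension move: perturb a center
        prop = centers.copy()
        prop[rng.integers(prop.size)] += 0.3 * rng.standard_normal()
    lp_prop = log_target(prop)
    # Annealed Metropolis acceptance: raising the posterior ratio to 1/T
    # concentrates the chain on the posterior modes as T -> 0.  The full
    # reversible-jump ratio also carries proposal and Jacobian terms,
    # which largely cancel here because births draw centers uniformly.
    if np.log(rng.random()) < (lp_prop - lp) / T:
        centers, lp = prop, lp_prop

print(centers.size, lp)
```

Because the birth and death moves change the dimension of the parameter vector, the chain explores the joint space of models and parameters rather than being trapped in the local minima of any fixed-size network, which is the point of the global search the abstract describes.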

