Efficient Bayes Inference in Neural Networks through Adaptive Importance Sampling

10/03/2022
by Yunshi Huang, et al.

Bayesian neural networks (BNNs) have received increased interest in recent years. In BNNs, a complete posterior distribution over the unknown weight and bias parameters of the network is produced during the training stage. This probabilistic estimation offers several advantages over point-wise estimates, in particular the ability to provide uncertainty quantification when predicting new data. This feature, inherent to the Bayesian paradigm, is useful in countless machine learning applications. It is particularly appealing in areas where decision-making has a crucial impact, such as medical healthcare or autonomous driving. The main challenge of BNNs is the computational cost of the training procedure, since Bayesian techniques often face a severe curse of dimensionality. Adaptive importance sampling (AIS) is one of the most prominent Monte Carlo methodologies, benefiting from sound convergence guarantees and ease of adaptation. This work aims to show that AIS constitutes a successful approach for designing BNNs. More precisely, we propose a novel algorithm, PMCnet, that includes an efficient adaptation mechanism exploiting geometric information on the complex (often multimodal) posterior distribution. Numerical results illustrate the excellent performance and the improved exploration capabilities of the proposed method for both shallow and deep neural networks.
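The abstract gives no pseudocode, so the snippet below is only a minimal sketch of a generic population Monte Carlo style AIS loop applied to a toy one-hidden-layer regression network. The network size, the Gaussian prior and likelihood, the proposal width, and the simple resampling-based adaptation are all illustrative assumptions; the geometry-aware adaptation mechanism of PMCnet itself is not reproduced here.

```python
# Minimal sketch of population-Monte-Carlo-style adaptive importance sampling
# for a toy Bayesian neural network posterior (illustrative assumptions only,
# not the authors' PMCnet algorithm).
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regression network: y = w2 . tanh(x * w1) + noise
n_hidden = 5
dim = 2 * n_hidden                      # w1 and w2 stacked into a single vector
x = np.linspace(-2.0, 2.0, 40)
true_w = rng.normal(size=dim)

def predict(w, x):
    w1, w2 = w[:n_hidden], w[n_hidden:]
    return np.tanh(np.outer(x, w1)) @ w2

y = predict(true_w, x) + 0.1 * rng.normal(size=x.size)

def log_posterior(w):
    # Standard normal prior on the weights plus Gaussian likelihood (sigma = 0.1)
    log_prior = -0.5 * np.sum(w ** 2)
    resid = y - predict(w, x)
    log_lik = -0.5 * np.sum(resid ** 2) / 0.1 ** 2
    return log_prior + log_lik

# Population Monte Carlo loop: N Gaussian proposals adapted over T iterations
N, T, sigma_prop = 200, 30, 0.3
means = rng.normal(size=(N, dim))       # initial proposal locations

for t in range(T):
    samples = means + sigma_prop * rng.normal(size=(N, dim))
    # Self-normalised importance weights: log target minus log proposal density
    # (additive constants of the Gaussian kernel cancel after normalisation)
    log_target = np.array([log_posterior(s) for s in samples])
    log_prop = -0.5 * np.sum((samples - means) ** 2, axis=1) / sigma_prop ** 2
    log_w = log_target - log_prop
    log_w -= log_w.max()                # stabilise before exponentiation
    weights = np.exp(log_w)
    weights /= weights.sum()
    # Adaptation step: resample proposal locations in proportion to the weights
    means = samples[rng.choice(N, size=N, p=weights)]

# Self-normalised importance-sampling estimate of the posterior mean
posterior_mean = weights @ samples
print("Estimated posterior mean of the weights:", np.round(posterior_mean, 2))
```

The resampling step above is the simplest possible adaptation rule; the paper's contribution lies precisely in replacing such naive adaptation with one that exploits geometric information about the (often multimodal) posterior.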

