Statistical Foundation of Variational Bayes Neural Networks

06/29/2020
by Shrijita Bhattacharya, et al.

Despite the popularity of Bayesian neural networks in recent years, their use remains somewhat limited in complex and big-data settings because of the computational cost of full posterior evaluation. Variational Bayes (VB) provides a useful alternative that circumvents the computational cost and time complexity of generating samples from the true posterior with Markov chain Monte Carlo (MCMC) techniques. The efficacy of VB methods is well established in the machine learning literature, but their broader impact is hindered by a lack of theoretical validity from a statistical perspective, and only a few results address the theoretical properties of VB, especially in nonparametric problems. In this paper, we establish the fundamental result of posterior consistency for the mean-field variational posterior (VP) of a feed-forward artificial neural network model. The paper lays out the conditions needed to guarantee that the VP concentrates around Hellinger neighborhoods of the true density function. Additionally, the role of the scale parameter and its influence on the convergence rates is discussed. The argument relies mainly on two results: (1) the rate at which the true posterior grows, and (2) the rate at which the KL distance between the posterior and the variational posterior grows. The theory provides a guideline for building prior distributions for Bayesian NN models, along with an assessment of the accuracy of the corresponding VB implementation.
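As a schematic sketch of the two objects the abstract refers to, the displays below write out the mean-field variational posterior and the Hellinger-neighborhood consistency statement. The notation (D_n for the data, f_theta for the density induced by the network parameters, f_0 for the true density, d_H for the Hellinger distance) is assumed here for illustration and is not necessarily the paper's own.

```latex
% Mean-field variational posterior: the product-form distribution closest,
% in Kullback--Leibler divergence, to the true posterior \pi(. | D_n).
\[
  \widehat{q}_n
    = \arg\min_{q \in \mathcal{Q}}
      \mathrm{KL}\bigl(q \,\|\, \pi(\cdot \mid D_n)\bigr),
  \qquad
  \mathcal{Q} = \Bigl\{\, q : q(\theta) = \textstyle\prod_{j} q_j(\theta_j) \,\Bigr\}.
\]

% Posterior consistency in Hellinger neighborhoods of the true density f_0:
% the variational posterior mass outside an \varepsilon-Hellinger ball vanishes
% as the sample size n grows.
\[
  \widehat{q}_n\bigl(\{\theta : d_H(f_\theta, f_0) > \varepsilon\}\bigr)
    \;\longrightarrow\; 0
  \quad \text{in } P_{f_0}\text{-probability, for every } \varepsilon > 0.
\]
```

In this notation, the two driving results mentioned in the abstract control, respectively, how fast the true posterior accumulates mass and how fast the KL distance between the variational posterior and the true posterior can grow with n.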
