Posterior Inference on Infinitely Wide Bayesian Neural Networks under Weights with Unbounded Variance

05/18/2023
by Jorge Loría, et al.

From the classical and influential work of Neal (1996), it is known that the infinite-width scaling limit of a Bayesian neural network with one hidden layer is a Gaussian process when the network weights have bounded prior variance. Neal's result has been extended to networks with multiple hidden layers and to convolutional neural networks, also with Gaussian process scaling limits. The tractable properties of Gaussian processes then allow straightforward posterior inference and uncertainty quantification, considerably simplifying the study of the limit process compared to a network of finite width. Neural network weights with unbounded variance, however, pose unique challenges. In this case, the classical central limit theorem breaks down, and under suitable conditions the scaling limit is instead an α-stable process. The current literature, however, is primarily limited to forward simulation under these processes, and the problem of posterior inference under such a scaling limit remains largely unaddressed, unlike in the Gaussian process case. Our contribution is an interpretable and computationally efficient procedure for posterior inference based on a conditionally Gaussian representation, which allows full use of the Gaussian process machinery for tractable posterior inference and uncertainty quantification in this non-Gaussian regime.
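The "conditionally Gaussian representation" the abstract invokes echoes the classical sub-Gaussian construction of symmetric α-stable laws (Samorodnitsky and Taqqu, 1994, Proposition 1.3.1): if A is a positive (α/2)-stable random variable, totally skewed to the right with scale (cos(πα/4))^(2/α), and Z ~ N(0, 2) independently of A, then X = √A · Z is standard symmetric α-stable, so X is Gaussian conditionally on the random scale A. The following Python sketch, which assumes NumPy and SciPy and illustrates this classical representation rather than the authors' inference procedure, checks the construction by simulation:

    import numpy as np
    from scipy.stats import levy_stable

    rng = np.random.default_rng(0)
    alpha = 1.5   # stability index in (0, 2)
    n = 100_000

    # Positive (alpha/2)-stable mixing scale, totally skewed (beta = 1), with
    # the scale constant that makes the resulting mixture exactly standard
    # symmetric alpha-stable (Samorodnitsky & Taqqu, Prop. 1.3.1).
    scale_A = np.cos(np.pi * alpha / 4) ** (2 / alpha)
    A = levy_stable.rvs(alpha / 2, 1, loc=0, scale=scale_A,
                        size=n, random_state=rng)

    # Conditionally Gaussian draw: given A, X | A ~ N(0, 2 * A).
    x_mixture = np.sqrt(A) * rng.normal(0.0, np.sqrt(2.0), size=n)

    # Direct draw from the standard symmetric alpha-stable law for comparison.
    x_direct = levy_stable.rvs(alpha, 0, size=n, random_state=rng)

    # Central quantiles of the two samples should agree up to Monte Carlo error.
    qs = [0.10, 0.25, 0.50, 0.75, 0.90]
    print(np.quantile(x_mixture, qs))
    print(np.quantile(x_direct, qs))

Conditionally on the latent scale A, everything is Gaussian; this is what lets the familiar Gaussian process formulas for posterior means and predictive variances be reused in the heavy-tailed regime, which is the idea the paper's posterior-inference procedure builds on.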
