Dependence between Bayesian neural network units

11/29/2021
by Mariia Vladimirova et al.

The connection between Bayesian neural networks and Gaussian processes has gained a lot of attention in the last few years, with the flagship result that hidden units converge to a Gaussian process limit as the layer widths tend to infinity. Underpinning this result is the fact that hidden units become independent in the infinite-width limit. Our aim is to shed some light on the dependence properties of hidden units in practical finite-width Bayesian neural networks. In addition to theoretical results, we assess empirically the impact of depth and width on the dependence properties of hidden units.
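The abstract's central point, that hidden units are dependent at finite width but become independent as width grows, can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's actual experiment: it draws prior samples of a two-layer MLP with iid Gaussian weights and a fixed input, and measures the correlation between the *squared* pre-activations of two second-layer units. These units are uncorrelated under the prior, but their magnitudes co-vary through the shared first-layer activations, and that dependence fades as the hidden layer widens.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_layer2_units(width, n_samples=20000, d_in=5):
    """Draw prior samples of two second-layer pre-activations of a
    2-layer ReLU MLP with iid N(0, 1/fan_in) weights and a fixed input.
    (Hypothetical setup chosen for illustration.)"""
    x = np.ones(d_in)  # fixed input; randomness comes from the weight prior
    # Layer 1: (n_samples, width) ReLU activations
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d_in), size=(n_samples, width, d_in))
    h = np.maximum(W1 @ x, 0.0)
    # Layer 2: two units, each mixing the SAME hidden vector h
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, 2, width))
    g = np.einsum('sjw,sw->sj', W2, h)
    return g

for width in [2, 8, 32, 128]:
    g = sample_layer2_units(width)
    # g[:, 0] and g[:, 1] are uncorrelated but not independent:
    # conditionally on h, both are N(0, ||h||^2 / width), so their
    # squared values correlate through the shared norm of h.
    c = np.corrcoef(g[:, 0] ** 2, g[:, 1] ** 2)[0, 1]
    print(f"width={width:4d}  corr(g1^2, g2^2) = {c:.3f}")
```

As width increases, `||h||^2 / width` concentrates around its mean, so the measured correlation between the squared units shrinks toward zero, consistent with the independence that underpins the Gaussian process limit.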


Related research

- Bayesian neural network unit priors and generalized Weibull-tail property (10/06/2021)
- Wide Neural Networks with Bottlenecks are Deep Gaussian Processes (01/03/2020)
- The Limitations of Large Width in Neural Networks: A Deep Gaussian Process Perspective (06/11/2021)
- Stochastic Neural Networks with Infinite Width are Deterministic (01/30/2022)
- The Poisson Gamma Belief Network (11/06/2015)
- Using Multitask Gaussian Processes to estimate the effect of a targeted effort to remove firearms (10/13/2021)
- Finite size corrections for neural network Gaussian processes (08/27/2019)
