An Empirical Analysis of the Advantages of Finite- vs. Infinite-Width Bayesian Neural Networks

11/16/2022
by Jiayu Yao et al.

Comparing Bayesian neural networks (BNNs) of different widths is challenging: as width increases, multiple model properties change simultaneously, and inference in the finite-width case is intractable. In this work, we empirically compare finite- and infinite-width BNNs and provide quantitative and qualitative explanations for their performance difference. We find that when the model is misspecified, increasing the width can hurt BNN performance. In these cases, we provide evidence that finite-width BNNs generalize better, in part because of properties of their frequency spectrum that allow them to adapt under model mismatch.
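A useful reference point for the comparison in the abstract is the NNGP correspondence: as width grows, a BNN's prior over functions converges to a Gaussian process. The sketch below is not the paper's code; the one-hidden-layer ReLU architecture, the prior scales sigma_w and sigma_b, and the NTK-style 1/sqrt(width) output scaling are illustrative assumptions. It draws prior functions from a finite-width BNN at several widths and checks them against the closed-form infinite-width NNGP kernel (the order-1 arc-cosine kernel of Cho & Saul, 2009).

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 201).reshape(-1, 1)  # 1-D input grid

def sample_finite_bnn_prior(x, width, n_samples=4000, sigma_w=1.0, sigma_b=0.1):
    """Draw functions from the prior of a one-hidden-layer ReLU BNN.

    The 1/sqrt(width) output scaling is the standard parameterization
    under which the infinite-width limit is the NNGP computed below.
    """
    fs = np.empty((n_samples, x.shape[0]))
    for s in range(n_samples):
        W1 = rng.normal(0.0, sigma_w, size=(1, width))
        b1 = rng.normal(0.0, sigma_b, size=width)
        W2 = rng.normal(0.0, sigma_w, size=(width, 1))
        b2 = rng.normal(0.0, sigma_b)
        h = np.maximum(x @ W1 + b1, 0.0)                # hidden ReLU features
        fs[s] = (h @ W2 / np.sqrt(width) + b2).ravel()  # network output
    return fs

def nngp_kernel(x, sigma_w=1.0, sigma_b=0.1):
    """Closed-form NNGP kernel of the same architecture at infinite width
    (order-1 arc-cosine kernel for the ReLU layer; Cho & Saul, 2009)."""
    k0 = sigma_w**2 * (x @ x.T) + sigma_b**2            # pre-activation kernel
    d = np.sqrt(np.diag(k0))
    cos_t = np.clip(k0 / np.outer(d, d), -1.0, 1.0)
    theta = np.arccos(cos_t)
    k_relu = np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)
    return sigma_w**2 * k_relu + sigma_b**2             # output-layer kernel

# Under this parameterization the marginal prior variance matches the NNGP at
# every width; the finite-width signature is non-Gaussianity, visible in the
# excess kurtosis of f(x), which decays roughly like 1/width.
idx = -1  # evaluate at the grid edge, x = 3.0
for width in (3, 30, 300):
    f = sample_finite_bnn_prior(x, width)[:, idx]
    kurt = ((f - f.mean()) ** 4).mean() / f.var() ** 2 - 3.0
    print(f"width={width:4d}: std={f.std():.3f}  excess kurtosis={kurt:+.3f}")

K = nngp_kernel(x)
print(f"NNGP limit: std={np.sqrt(K[idx, idx]):.3f}  excess kurtosis=+0.000 (Gaussian)")
```

Under this scaling the pointwise prior variance agrees with the NNGP at every width, so the widths differ through higher moments: small widths show heavy-tailed, non-Gaussian prior marginals (positive excess kurtosis) that vanish as width grows. This is one concrete instance of the abstract's point that several model properties change simultaneously with width.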

Related research:

- Les Houches Lectures on Deep Learning at Large and Infinite Width (09/04/2023)
- Asymptotics of Wide Convolutional Neural Networks (08/19/2020)
- Any-Width Networks (12/06/2020)
- Approximation power of random neural networks (06/18/2019)
- Are wider nets better given the same number of parameters? (10/27/2020)
- Eigenspace Restructuring: a Principle of Space and Frequency in Neural Networks (12/10/2021)
- Are Bayesian neural networks intrinsically good at out-of-distribution detection? (07/26/2021)
