An Overview of Uncertainty Quantification Methods for Infinite Neural Networks

01/13/2022
by Florian Juengermann, et al.

To better understand the theoretical behavior of large neural networks, several works have analyzed the case where a network's width tends to infinity. In this regime, the effect of random initialization and the process of training a neural network can be formally expressed with analytical tools like Gaussian processes and neural tangent kernels. In this paper, we review methods for quantifying uncertainty in such infinite-width neural networks and examine their relationship to Gaussian processes in the Bayesian inference framework. We make use of several equivalence results along the way to obtain exact closed-form solutions for predictive uncertainty.
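
The closed-form predictive uncertainty mentioned above comes from exact Gaussian process regression with a kernel induced by the infinite-width network (the NNGP kernel at initialization, or the neural tangent kernel after training). As a minimal illustration of that computation, the NumPy sketch below applies the standard GP posterior formulas with the one-hidden-layer ReLU NNGP (arc-cosine) kernel. The kernel choice, the noise level, and all function names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def relu_nngp_kernel(X1, X2):
    # NNGP kernel of a one-hidden-layer ReLU network at infinite width,
    # i.e. E_w[relu(w.x) relu(w.x')] for w ~ N(0, I) (arc-cosine kernel,
    # Cho & Saul 2009). Assumes nonzero inputs; an illustrative choice.
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip(X1 @ X2.T / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(n1, n2) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def gp_posterior(X_train, y_train, X_test, kernel, noise_var=1e-2):
    # Exact closed-form GP posterior: mean = k*^T (K + s^2 I)^{-1} y,
    # variance = k(x*, x*) - k*^T (K + s^2 I)^{-1} k*.
    K = kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    Ks = kernel(X_test, X_train)
    Kss = kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                      # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ks.T)
    mean = Ks @ alpha
    var = np.diag(Kss) - np.sum(v**2, axis=0)      # pointwise predictive variance
    return mean, var

# Toy usage: 1-D regression with predictive error bars.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 1))
y_train = np.sin(2.0 * X_train[:, 0])
X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
mean, var = gp_posterior(X_train, y_train, X_test, relu_nngp_kernel)
```

Loosely speaking, training the infinite-width network with gradient descent swaps the NNGP kernel for the neural tangent kernel in the mean prediction; the precise correspondences, including how the predictive covariance changes, are among the equivalence results the paper reviews.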

Related research

- 09/04/2023: Les Houches Lectures on Deep Learning at Large & Infinite Width
  These lectures, presented at the 2022 Les Houches Summer School on Stati...
- 01/30/2022: Stochastic Neural Networks with Infinite Width are Deterministic
  This work theoretically studies stochastic neural networks, a main type ...
- 09/30/2019: Non-Gaussian processes and neural networks at finite widths
  Gaussian processes are ubiquitous in nature and engineering. A case in p...
- 02/19/2023: Guided Deep Kernel Learning
  Combining Gaussian processes with the expressive power of deep neural ne...
- 10/12/2021: Uncertainty-based out-of-distribution detection requires suitable function space priors
  The need to avoid confident predictions on unfamiliar data has sparked i...
- 12/19/2019: Optimization for deep learning: theory and algorithms
  When and why can a neural network be successfully trained? This article ...
- 08/20/2015: Steps Toward Deep Kernel Methods from Infinite Neural Networks
  Contemporary deep neural networks exhibit impressive results on practica...
