A brief note on understanding neural networks as Gaussian processes

07/25/2021
by Mengwu Guo et al.

As a generalization of the work in [Lee et al., 2017], this note briefly discusses when the prior over a neural network's output follows a Gaussian process, and how such a neural-network-induced Gaussian process is formulated. The posterior mean functions of the corresponding Gaussian process regression lie in the reproducing kernel Hilbert space (RKHS) defined by the neural-network-induced kernel. In the case of two-layer neural networks, the induced Gaussian processes provide an interpretation of the RKHSs whose union forms a Barron space.
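
For a concrete illustration of the neural-network-induced kernel, the sketch below (ours, not taken from the note) works out the simplest case: a two-layer ReLU network with i.i.d. standard-normal weights and no biases. In the infinite-width limit, the kernel induced by this architecture is known in closed form as the order-1 arc-cosine kernel (Cho and Saul, 2009); the names nngp_kernel_relu and gp_posterior_mean are hypothetical, chosen only for this sketch.

import numpy as np

def nngp_kernel_relu(X1, X2, sigma_a2=1.0):
    # Kernel induced by an infinitely wide two-layer ReLU network
    #   f(x) = (1/sqrt(m)) * sum_j a_j * max(w_j . x, 0),
    # with a_j ~ N(0, sigma_a2) and w_j ~ N(0, I). Its closed form is the
    # order-1 arc-cosine kernel (Cho and Saul, 2009):
    #   K(x, x') = sigma_a2/(2*pi) * |x||x'| * (sin t + (pi - t) * cos t),
    # where t is the angle between x and x' (nonzero inputs assumed).
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos_t = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    t = np.arccos(cos_t)
    return sigma_a2 / (2 * np.pi) * np.outer(n1, n2) * (np.sin(t) + (np.pi - t) * cos_t)

def gp_posterior_mean(X_train, y_train, X_test, noise_var=1e-4):
    # GP regression posterior mean with the induced kernel; being a finite
    # kernel expansion, this mean function lies in the kernel's RKHS.
    K = nngp_kernel_relu(X_train, X_train)
    alpha = np.linalg.solve(K + noise_var * np.eye(len(X_train)), y_train)
    return nngp_kernel_relu(X_test, X_train) @ alpha

# Monte Carlo check: the empirical output covariance of random finite-width
# networks should approach the analytic kernel as the width m grows.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 2))        # three inputs in R^2
m, reps = 1000, 2000                   # width, number of sampled networks
W = rng.standard_normal((reps, m, 2))  # hidden weights, one set per network
A = rng.standard_normal((reps, m))     # output weights
F = np.einsum('rj,rjn->rn', A, np.maximum(W @ X.T, 0.0)) / np.sqrt(m)
print(np.cov(F.T, bias=True))          # empirical 3x3 output covariance
print(nngp_kernel_relu(X, X))          # analytic NN-induced kernel

The Monte Carlo block at the end samples finite-width networks and compares their empirical output covariance with the analytic kernel; the agreement tightens as the width m grows, which is exactly the finite-to-infinite-width correspondence the note builds on.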

Related research

08/29/2021: Neural Network Gaussian Processes by Increasing Depth
Recent years have witnessed an increasing interest in the correspondence...

10/28/2019: Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
Wide neural networks with random weights and biases are Gaussian process...

01/26/2022: A Kernel-Based Approach for Modelling Gaussian Processes with Functional Information
Gaussian processes are among the most useful tools in modeling continuou...

07/06/2018: Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences
This paper is an attempt to bridge the conceptual gaps between researche...

03/04/2021: Small Sample Spaces for Gaussian Processes
It is known that the membership in a given reproducing kernel Hilbert sp...

06/05/2023: Global universal approximation of functional input maps on weighted spaces
We introduce so-called functional input neural networks defined on a pos...

08/11/2022: Gaussian process surrogate models for neural networks
The lack of insight into deep learning systems hinders their systematic ...
