Mixed neural network Gaussian processes

12/01/2021
by Alexey Lindo, et al.

This paper makes two contributions. Firstly, it introduces mixed compositional kernels and mixed neural network Gaussian processes (NNGPs). Mixed compositional kernels are generated by composing probability generating functions (PGFs). A mixed NNGP is a Gaussian process (GP) with a mixed compositional kernel, arising in the infinite-width limit of multilayer perceptrons (MLPs) whose layers use different activation functions. Secondly, θ activation functions for neural networks and θ compositional kernels are introduced by building upon the theory of branching processes, and more specifically upon θ PGFs. Although θ compositional kernels are defined recursively, each recursion step is available in closed form, and it is shown that θ compositional kernels have non-degenerate asymptotic properties under certain conditions. Consequently, GPs with θ compositional kernels avoid kernel evaluations that lack an explicit form, and their infinite-depth asymptotic behaviour is controllable. An open research question is whether GPs with θ compositional kernels are limits of infinitely-wide MLPs with θ activation functions.
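The abstract does not spell out the kernel recursion, but the idea of a "mixed" compositional kernel can be illustrated with the standard NNGP layer recursion in which each layer contributes a different closed-form Gaussian expectation. The sketch below is a minimal numpy illustration under those assumptions: it uses the arc-cosine step for ReLU layers and Williams' closed form for erf layers, with illustrative hyperparameters `sigma_w2` and `sigma_b2`. It is not the paper's θ/PGF construction, and all function names are hypothetical.

```python
import numpy as np

def relu_expectation(kxx, kxy, kyy):
    """Closed-form E[relu(u) relu(v)] for (u, v) ~ N(0, [[kxx, kxy], [kxy, kyy]])
    (arc-cosine kernel of order 1)."""
    c = np.clip(kxy / np.sqrt(kxx * kyy), -1.0, 1.0)
    theta = np.arccos(c)
    return np.sqrt(kxx * kyy) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def erf_expectation(kxx, kxy, kyy):
    """Closed-form E[erf(u) erf(v)] for the same bivariate Gaussian (Williams, 1997)."""
    return (2 / np.pi) * np.arcsin(2 * kxy / np.sqrt((1 + 2 * kxx) * (1 + 2 * kyy)))

def mixed_nngp_kernel(x, y, layer_expectations, sigma_w2=1.0, sigma_b2=0.1):
    """Recursive NNGP kernel for an MLP whose l-th hidden layer uses the
    activation whose Gaussian expectation is layer_expectations[l]."""
    d = x.shape[0]
    kxx = sigma_b2 + sigma_w2 * x @ x / d
    kxy = sigma_b2 + sigma_w2 * x @ y / d
    kyy = sigma_b2 + sigma_w2 * y @ y / d
    for expectation in layer_expectations:  # one closed-form step per layer
        kxx, kxy, kyy = (
            sigma_b2 + sigma_w2 * expectation(kxx, kxx, kxx),
            sigma_b2 + sigma_w2 * expectation(kxx, kxy, kyy),
            sigma_b2 + sigma_w2 * expectation(kyy, kyy, kyy),
        )
    return kxy

rng = np.random.default_rng(0)
x, y = rng.standard_normal(5), rng.standard_normal(5)
# Alternating ReLU and erf layers yields a "mixed" compositional kernel.
print(mixed_nngp_kernel(x, y, [relu_expectation, erf_expectation, relu_expectation]))
```

Because every layer's expectation is explicit, the depth-L kernel is obtained by composing L closed-form maps, which mirrors the abstract's point that recursive kernels need not require non-explicit evaluations.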


