Infinitely Wide Tensor Networks as Gaussian Process

01/07/2021
by   Erdong Guo, et al.

A Gaussian Process is a non-parametric prior that can be understood, intuitively, as a distribution over function space. It is known that, from a Bayesian perspective, a Gaussian Process arises as the infinite-width limit of a Bayesian neural network when an appropriate prior is placed on its weights. In this paper, we explore infinitely wide Tensor Networks and show their equivalence to Gaussian Processes. We study the pure Tensor Network and two extended structures, the Neural Kernel Tensor Network and the Tensor Network hidden-layer Neural Network, and prove that each converges to a Gaussian Process as the width of the model goes to infinity. (We note that a Gaussian Process can also be obtained by taking the infinite limit of at least one of the bond dimensions α_i in the product of tensor nodes; the proofs follow the same ideas as in the infinite-width cases.) In a general set-up, we calculate the mean function (mean vector) and the covariance function (covariance matrix) of the finite-dimensional distributions of the Gaussian Process induced by the infinite-width tensor network. We study the properties of the covariance function and derive an approximation to it when the integral in the expectation operator is intractable. In the numerical experiments, we implement the Gaussian Processes corresponding to the infinite-limit tensor networks and plot their sample paths. We study the hyperparameters by varying the standard deviations of the prior distributions and plotting the resulting families of sample paths in the induced Gaussian Process. As expected, the parameters of the prior distribution, namely the hyperparameters of the induced Gaussian Process, control its characteristic lengthscales.
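The sketch below is not the paper's construction; it only illustrates the two mechanisms the abstract describes. A generic wide random-feature model (a stand-in for the tensor network, with tanh features chosen arbitrarily) shows the central-limit effect behind the infinite-width Gaussian Process limit, and an RBF kernel (a placeholder for the covariance function derived in the paper) shows how a prior standard deviation acting as a hyperparameter controls the lengthscale of sample paths. All function and variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_model_output(x, width, sigma_w=1.0, n_draws=5000):
    """f(x) = (1/sqrt(width)) * sum_i w_i * tanh(v_i . x) for random w, v.
    A generic wide random-feature model, used only to illustrate the CLT
    mechanism behind the infinite-width Gaussian Process limit."""
    d = x.shape[0]
    v = rng.normal(0.0, 1.0, size=(n_draws, width, d))    # input weights
    w = rng.normal(0.0, sigma_w, size=(n_draws, width))   # output weights
    features = np.tanh(v @ x)                              # (n_draws, width)
    return (w * features).sum(axis=1) / np.sqrt(width)

x0 = np.array([0.3, -0.7])
for width in (4, 64, 1024):
    f = random_model_output(x0, width)
    z = (f - f.mean()) / f.std()
    # skewness and excess kurtosis should shrink toward 0 as width grows
    print(f"width={width:5d}  skew={np.mean(z**3):+.3f}  ex.kurt={np.mean(z**4) - 3:+.3f}")

def sample_gp_paths(xs, lengthscale, sigma=1.0, n_paths=3, jitter=1e-8):
    """Draw sample paths from a zero-mean GP with an RBF covariance
    (a placeholder kernel, not the covariance derived in the paper)."""
    K = sigma**2 * np.exp(-0.5 * (xs[:, None] - xs[None, :])**2 / lengthscale**2)
    L = np.linalg.cholesky(K + jitter * np.eye(len(xs)))
    return L @ rng.normal(size=(len(xs), n_paths))

xs = np.linspace(-3, 3, 200)
paths_rough = sample_gp_paths(xs, lengthscale=0.3)   # quickly varying paths
paths_smooth = sample_gp_paths(xs, lengthscale=2.0)  # slowly varying paths
```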
