Replica theory for learning curves for Gaussian processes on random graphs

by Matthew J. Urry, et al.

Statistical physics approaches can be used to derive accurate predictions for the performance of inference methods learning from potentially noisy data, as quantified by the learning curve: the average error as a function of the number of training examples. We analyse a challenging problem in the area of non-parametric inference, where an effectively infinite number of parameters has to be learned, specifically Gaussian process regression. When the inputs are vertices on a random graph and the outputs are noisy function values, we show that replica techniques can be used to obtain exact performance predictions in the limit of large graphs. The covariance of the Gaussian process prior is defined by a random walk kernel, the discrete analogue of squared exponential kernels on continuous spaces. Conventionally this kernel is normalised only globally, so that the prior variance can differ between vertices; as a more principled alternative we consider local normalisation, where the prior variance is uniform.
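To make the contrast between the two normalisation schemes concrete, the following is a minimal sketch (not the authors' code) of a random walk kernel on a graph, built from the normalised graph Laplacian as C ∝ (I − L/a)^p with a ≥ 2, together with the two normalisations the abstract describes. The function names, the choice a = 2, p = 10, and the example star graph are illustrative assumptions.

```python
import numpy as np

def random_walk_kernel(A, a=2.0, p=10):
    """Random walk kernel (I - L/a)^p from adjacency matrix A,
    where L = I - D^{-1/2} A D^{-1/2} is the normalised Laplacian.
    For a >= 2 the eigenvalues of I - L/a lie in [0, 1], so the
    kernel is positive semi-definite."""
    n = len(A)
    d = A.sum(axis=1)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(n) - Dinv_sqrt @ A @ Dinv_sqrt
    return np.linalg.matrix_power(np.eye(n) - L / a, p)

def normalise_global(K):
    """Global normalisation: one overall scale factor, chosen so the
    *average* prior variance is 1. Individual vertices can still have
    prior variances above or below 1."""
    return K / np.mean(np.diag(K))

def normalise_local(K):
    """Local normalisation: rescale per vertex, K_ij / sqrt(K_ii K_jj),
    so that *every* prior variance is exactly 1."""
    s = 1.0 / np.sqrt(np.diag(K))
    return K * np.outer(s, s)

# Example: a star graph, where vertex degrees (and hence globally
# normalised prior variances) differ between the hub and the leaves.
n = 6
A = np.zeros((n, n))
A[0, 1:] = 1.0
A[1:, 0] = 1.0

K = random_walk_kernel(A)
Kg = normalise_global(K)   # mean of diag(Kg) is 1, entries vary
Kl = normalise_local(K)    # every entry of diag(Kl) is exactly 1
```

On the star graph the globally normalised kernel gives the hub a different prior variance than the leaves, which is exactly the non-uniformity that local normalisation removes.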






