Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces

by Alexander Terenin

Bayesian learning using Gaussian processes provides a foundational framework for making decisions in a manner that balances what is known with what could be learned by gathering data. In this dissertation, we develop techniques for broadening the applicability of Gaussian processes. This is done in two ways. Firstly, we develop pathwise conditioning techniques for Gaussian processes, which allow one to express posterior random functions as prior random functions plus a dependent update term. We introduce a wide class of efficient approximations built from this viewpoint, which can be randomly sampled once in advance, and evaluated at arbitrary locations without any subsequent stochasticity. This key property improves efficiency and makes it simpler to deploy Gaussian process models in decision-making settings. Secondly, we develop a collection of Gaussian process models over non-Euclidean spaces, including Riemannian manifolds and graphs. We derive fully constructive expressions for the covariance kernels of scalar-valued Gaussian processes on Riemannian manifolds and graphs. Building on these ideas, we describe a formalism for defining vector-valued Gaussian processes on Riemannian manifolds. The introduced techniques allow all of these models to be trained using standard computational methods. In total, these contributions make Gaussian processes easier to work with and allow them to be used within a wider class of domains in an effective and principled manner. This, in turn, opens the door to applying Gaussian processes in novel decision-making settings.
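As a concrete illustration of the pathwise conditioning idea described above (a minimal sketch, not code from the dissertation), the snippet below draws one prior function via random Fourier features and turns it into a posterior sample with a Matheron-style update term; the posterior sample can then be evaluated at arbitrary locations with no further random draws. The kernel choice, data, and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)

# Illustrative training data: noisy observations of a toy target function.
X = rng.uniform(-3, 3, size=(8, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(8)
noise = 0.1**2

# One prior sample via random Fourier features for the RBF kernel:
# f_prior(x) ~= sqrt(2/M) * sum_i w_i * cos(omega_i . x + b_i), w_i ~ N(0, 1).
M = 2000
omega = rng.standard_normal((M, 1))      # spectral frequencies of the RBF kernel
b = rng.uniform(0, 2 * np.pi, M)         # random phases
w = rng.standard_normal(M)               # feature weights

def f_prior(x):
    return np.sqrt(2.0 / M) * np.cos(x @ omega.T + b) @ w

# Pathwise (Matheron) update: the posterior sample is the prior sample plus a
# data-dependent correction, f_post(.) = f_prior(.) + k(., X) v, where
# v = (K + sigma^2 I)^{-1} (y - f_prior(X) - eps) and eps is simulated noise.
eps = np.sqrt(noise) * rng.standard_normal(len(y))
v = np.linalg.solve(rbf_kernel(X, X) + noise * np.eye(len(y)),
                    y - f_prior(X) - eps)

def f_post(x):
    """One posterior sample; deterministic once the randomness above is fixed."""
    return f_prior(x) + rbf_kernel(x, X) @ v

# Evaluate the fixed posterior sample anywhere, with no subsequent stochasticity.
x_test = np.linspace(-3, 3, 5)[:, None]
print(f_post(x_test))
```

Because all randomness is drawn once up front (`omega`, `b`, `w`, `eps`), repeated evaluations of `f_post` are deterministic, which is the property that makes such samples convenient inside decision-making loops.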





Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Independent Projected Kernels

Gaussian processes are machine learning models capable of learning unkno...

Matérn Gaussian processes on Riemannian manifolds

Gaussian processes are an effective model class for learning unknown fun...

Fixed-Domain Inference for Gaussian Processes with Matérn Covariogram on Compact Riemannian Manifolds

Gaussian processes are widely employed as versatile modeling and predict...

Conditioning Sparse Variational Gaussian Processes for Online Decision-making

With a principled representation of uncertainty and closed form posterio...

Pathwise Conditioning of Gaussian Processes

As Gaussian processes are integrated into increasingly complex problem s...

Variational Gaussian Processes with Signature Covariances

We introduce a Bayesian approach to learn from stream-valued data by usi...

Intrinsic Gaussian processes on complex constrained domains

We propose a class of intrinsic Gaussian processes (in-GPs) for interpol...