Deep Bayesian Supervised Learning given Hypercuboidally-shaped, Discontinuous Data, using Compound Tensor-Variate & Scalar-Variate Gaussian Processes

03/13/2018
by   Kangrui Wang, et al.

We undertake learning of the high-dimensional functional relationship between a system parameter vector and a tensor-valued observable, where the observable is affected by the system parameter. The final aim is Bayesian inverse prediction of the system parameter values at which test data on the observable are reported. We attempt such learning given hypercuboidally-shaped data comprising multiple measurements of the observable, where the data display strong discontinuities. We model the sought functional relationship with a tensor-variate Gaussian Process (GP) and use three independent ways of learning the covariance matrices of the resulting Tensor-Normal likelihood. We demonstrate that when the covariance matrix is kernel-parametrised, the discontinuous nature of the data demands that the implemented kernel be non-stationary, which we achieve by modelling each kernel hyperparameter as a function of the sample function of the invoked tensor-variate GP. Within our Bayesian inference scheme this translates to a dynamically varying function, which we treat as a realisation from a scalar-variate GP whose covariance structure is described adaptively by collating information from a historical set of samples. Parameters of the scalar-variate GPs are updated first in our Metropolis-within-Gibbs scheme, allowing the kernel hyperparameters to be updated; the remaining parameters of the tensor-variate GP are then updated. The functional relation between the system parameter and the observable is thus learnt, and we subsequently perform inverse Bayesian prediction. We apply our method to a cuboidally-shaped, discontinuous, real dataset, and undertake forward prediction to generate data from our model given our results, so as to perform model-checking.
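The abstract's key idea, that discontinuous data call for a non-stationary kernel whose hyperparameters vary as functions of the input, can be illustrated with a standard construction. The sketch below is not the paper's exact method (which models hyperparameters as realisations of scalar-variate GPs within a Metropolis-within-Gibbs scheme); it uses the well-known Gibbs (1997) non-stationary squared-exponential kernel in one dimension, with a hypothetical, illustrative length-scale function `ell` standing in for a learnt hyperparameter function.

```python
import numpy as np

def gibbs_kernel(x1, x2, ell):
    """Gibbs (1997) non-stationary squared-exponential kernel in 1-D.

    `ell` maps an input to a positive, input-dependent length-scale;
    it stands in for a kernel hyperparameter that is itself a function
    of the input (in the paper, such a function is modelled as a
    realisation of a scalar-variate GP)."""
    l1, l2 = ell(x1), ell(x2)
    s = l1 ** 2 + l2 ** 2
    return np.sqrt(2.0 * l1 * l2 / s) * np.exp(-((x1 - x2) ** 2) / s)

def cov_matrix(xs, ell):
    """Covariance matrix over the inputs `xs` under the non-stationary kernel."""
    return np.array([[gibbs_kernel(a, b, ell) for b in xs] for a in xs])

# Hypothetical length-scale function: short correlation length near a
# presumed discontinuity at x = 0, longer length-scales away from it.
ell = lambda x: 0.1 + np.tanh(np.abs(x))

xs = np.linspace(-1.0, 1.0, 5)
K = cov_matrix(xs, ell)
# K is symmetric with unit diagonal, and the Gibbs construction
# guarantees it is a valid (positive semi-definite) covariance matrix.
```

Because the length-scale shrinks near the discontinuity, the resulting GP allows rapid local variation there while remaining smooth elsewhere, which is the qualitative behaviour a stationary kernel cannot provide.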


Related research

- Bayesian Learning of Random Graphs & Correlation Structure of Multivariate Data, with Distance between Graphs (10/31/2017)
- Learning in the Absence of Training Data -- a Galactic Application (11/22/2018)
- Fast increased fidelity approximate Gibbs samplers for Bayesian Gaussian process regression (06/11/2020)
- Modelling Function-Valued Processes with Nonseparable Covariance Structure (03/24/2019)
- Tensor Regression Meets Gaussian Processes (10/31/2017)
- A conditional one-output likelihood formulation for multitask Gaussian processes (06/05/2020)
- Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition (07/06/2022)
