Conditional Deep Gaussian Processes: empirical Bayes hyperdata learning

10/01/2021
by Chi-Ken Lu et al.

It is desirable to combine the expressive power of deep learning with the Gaussian process (GP) in one expressive Bayesian learning model. Deep kernel learning, proposed in [1], showed success in adopting a deep network for feature extraction followed by a GP as the function model. Recently, [2] suggested that the deterministic nature of the feature extractor may lead to overfitting, while replacing it with a Bayesian network appeared to cure the problem. Here, we propose the conditional Deep Gaussian Process (DGP), in which the intermediate GPs in the hierarchical composition are supported by hyperdata and the exposed GP remains zero mean. Motivated by the inducing points in sparse GPs, the hyperdata also play the role of function supports, but they are hyperparameters rather than random variables. We use the moment-matching method [3] to approximate the marginal prior of the conditional DGP with a GP carrying an effective kernel. Thus, as in empirical Bayes, the hyperdata are learned by optimizing the approximate marginal likelihood, which depends on the hyperdata implicitly through the kernel. We show equivalence with deep kernel learning in the limit of dense hyperdata in latent space; however, the conditional DGP and the corresponding approximate inference enjoy the benefit of being more Bayesian than deep kernel learning. Preliminary extrapolation results demonstrate the expressive power of the proposed model compared with GP kernel composition, DGP variational inference, and deep kernel learning. We also address the non-Gaussian aspect of our model as well as a way of upgrading to full Bayesian inference.
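
To make the empirical-Bayes idea concrete, the following is a minimal, self-contained Python (NumPy/SciPy) sketch, not the authors' implementation: a one-dimensional inner GP is conditioned on hyperdata (z, u) treated as plain hyperparameters, a moment-matched squared-exponential kernel (ignoring cross-covariances between inputs, a deliberate simplification) plays the role of the effective kernel of the exposed zero-mean GP, and the hyperdata together with the kernel hyperparameters are learned by maximizing the approximate log marginal likelihood. All variable names, the toy data, and the optimizer settings below are illustrative assumptions.

    # Sketch of empirical-Bayes hyperdata learning for a two-layer conditional DGP.
    # Assumptions: 1-D inputs, unit-amplitude SE kernels, independent-input moment
    # matching for the effective kernel. Illustrative only.
    import numpy as np
    from scipy.optimize import minimize

    def rbf(a, b, ell):
        """Squared-exponential kernel between column vectors a (n,1) and b (m,1)."""
        return np.exp(-0.5 * (a - b.T) ** 2 / ell ** 2)

    def intermediate_moments(x, z, u, ell_in, jitter=1e-6):
        """Mean/variance of the inner GP h(x) conditioned on hyperdata (z, u)."""
        Kzz = rbf(z, z, ell_in) + jitter * np.eye(len(z))
        Kxz = rbf(x, z, ell_in)
        L = np.linalg.cholesky(Kzz)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, u))
        mean = Kxz @ alpha
        v = np.linalg.solve(L, Kxz.T)
        var = 1.0 - np.sum(v ** 2, axis=0)[:, None]   # prior amplitude fixed to 1
        return mean, np.maximum(var, 0.0)

    def effective_kernel(mean, var, ell_out):
        """Moment-matched SE kernel E[k(h(x), h(x'))], cross-covariances ignored."""
        s2 = var + var.T                               # s_x^2 + s_{x'}^2
        denom = ell_out ** 2 + s2
        scale = ell_out / np.sqrt(denom)
        return scale * np.exp(-0.5 * (mean - mean.T) ** 2 / denom)

    def neg_log_marglik(theta, x, y, m):
        """Negative approximate log marginal likelihood (constant term dropped)."""
        z = theta[:m, None]
        u = theta[m:2 * m, None]
        ell_in, ell_out, noise = np.exp(theta[2 * m:])
        mean, var = intermediate_moments(x, z, u, ell_in)
        K = effective_kernel(mean, var, ell_out) + noise * np.eye(len(x))
        L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))
        a = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return 0.5 * (y.T @ a).item() + np.sum(np.log(np.diag(L)))

    # Toy demo: learn hyperdata for a step-like function.
    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 40)[:, None]
    y = np.tanh(8 * x) + 0.05 * rng.standard_normal(x.shape)
    m = 5
    theta0 = np.concatenate([np.linspace(-1, 1, m),        # hyperdata locations z
                             np.zeros(m),                  # hyperdata values u
                             np.log([0.3, 1.0, 0.01])])    # ell_in, ell_out, noise
    res = minimize(neg_log_marglik, theta0, args=(x, y, m), method="L-BFGS-B")
    print("optimized approximate NLML:", res.fun)

Treating (z, u) as hyperparameters rather than random variables is what distinguishes the hyperdata from the inducing points of sparse GPs: no posterior over them is maintained, and they enter the approximate marginal likelihood only through the effective kernel, which is exactly the empirical-Bayes route described in the abstract.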
