Deep Latent-Variable Kernel Learning

05/18/2020
by Haitao Liu, et al.

Deep kernel learning (DKL) leverages the connection between Gaussian processes (GPs) and neural networks (NNs) to build an end-to-end, hybrid model. It combines the capability of NNs to learn rich representations from massive data with the non-parametric property of GPs, which enables automatic calibration. However, the deterministic encoder may weaken the calibration of the subsequent GP, especially on small datasets, because the latent representation is left unconstrained. We therefore present a complete deep latent-variable kernel learning (DLVKL) model wherein the latent variables perform stochastic encoding to regularize the representation. Theoretical analysis, however, indicates that DLVKL with an i.i.d. prior over the latent variables suffers from posterior collapse and degenerates to a constant predictor. Hence, we enhance DLVKL in two respects: (i) a flexible variational posterior constructed through a neural stochastic differential equation (NSDE) to reduce the divergence gap, and (ii) a hybrid prior that draws on both the SDE prior and the posterior to achieve a flexible trade-off. Extensive experiments show that DLVKL-NSDE performs similarly to a well-calibrated GP on small datasets and outperforms existing deep GPs on large datasets.
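To make the idea concrete, below is a minimal sketch, not the authors' implementation, of the core DLVKL ingredients in PyTorch: a stochastic encoder producing a Gaussian q(z|x), an exact GP marginal likelihood evaluated on the sampled latent inputs, and a KL regularizer against the i.i.d. standard-normal prior (the very prior whose collapse the paper analyzes; DLVKL-NSDE replaces both the posterior and the prior). All class names, architectures, and hyperparameters here are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

class StochasticEncoder(nn.Module):
    """Encodes x into a Gaussian q(z|x) = N(mu(x), diag(sigma^2(x)))."""
    def __init__(self, d_in, d_latent, d_hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(d_in, d_hidden), nn.Tanh())
        self.mu = nn.Linear(d_hidden, d_latent)
        self.log_var = nn.Linear(d_hidden, d_latent)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)

def rbf_kernel(z1, z2, lengthscale, variance):
    """Squared-exponential kernel on the latent space."""
    sq_dist = torch.cdist(z1, z2).pow(2)
    return variance * torch.exp(-0.5 * sq_dist / lengthscale ** 2)

def gp_marginal_log_lik(z, y, lengthscale, variance, noise):
    """Exact GP marginal log-likelihood log N(y | 0, K_zz + noise * I)."""
    n = z.shape[0]
    K = rbf_kernel(z, z, lengthscale, variance) + noise * torch.eye(n)
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y, L)  # (K + noise*I)^{-1} y
    return (-0.5 * (y * alpha).sum()
            - L.diagonal().log().sum()
            - 0.5 * n * math.log(2.0 * math.pi))

def dlvkl_loss(encoder, x, y, gp_params, beta=1.0):
    mu, log_var = encoder(x)
    # Reparameterized sample from q(z|x): the stochastic encoding.
    z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
    # KL(q(z|x) || N(0, I)): the i.i.d. prior the paper shows can cause
    # posterior collapse; DLVKL-NSDE swaps in an SDE-informed hybrid prior.
    kl = 0.5 * (mu.pow(2) + log_var.exp() - 1.0 - log_var).sum()
    return -gp_marginal_log_lik(z, y, *gp_params) + beta * kl

# Toy usage: 32 points in 5-D, scalar targets.
x, y = torch.randn(32, 5), torch.randn(32, 1)
enc = StochasticEncoder(d_in=5, d_latent=2)
gp_params = (torch.tensor(1.0), torch.tensor(1.0), torch.tensor(0.1))
loss = dlvkl_loss(enc, x, y, gp_params)
loss.backward()
```

In this sketch, beta controls how strongly the latent representation is pulled toward the prior; the paper's hybrid prior achieves a related but more flexible trade-off by mixing the SDE prior with the posterior itself.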


Related research

07/01/2013 - Dimensionality Detection and Integration of Multiple Data Sources via the GP-LVM
The Gaussian Process Latent Variable Model (GP-LVM) is a non-linear prob...

08/29/2020 - Modulating Scalable Gaussian Processes for Expressive Statistical Learning
For a learning task, Gaussian process (GP) is interested in learning the...

10/23/2020 - Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference
Transformation-based methods have been an attractive approach in non-par...

06/08/2020 - Physics Regularized Gaussian Processes
We consider incorporating incomplete physics knowledge, expressed as dif...

05/15/2022 - Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel
It is challenging to guide neural network (NN) learning with prior knowl...

06/01/2023 - Linear Time GPs for Inferring Latent Trajectories from Neural Spike Trains
Latent Gaussian process (GP) models are widely used in neuroscience to u...

12/10/2014 - GP-select: Accelerating EM using adaptive subspace preselection
We propose a nonparametric procedure to achieve fast inference in genera...
