Amortized Inference for Gaussian Process Hyperparameters of Structured Kernels

06/16/2023
by Matthias Bitzer et al.

Learning the kernel parameters for Gaussian processes is often the computational bottleneck in applications such as online learning, Bayesian optimization, or active learning. Amortizing parameter inference over different datasets is a promising approach to dramatically speed up training time. However, existing methods restrict the amortized inference procedure to a fixed kernel structure. The amortization network must be redesigned manually and retrained whenever a different kernel is employed, which leads to a large overhead in design and training time. We propose amortizing kernel parameter inference over a complete kernel-structure family rather than a fixed kernel structure. We do this by defining an amortization network over pairs of datasets and kernel structures. This enables fast kernel inference for each element of the kernel family without retraining the amortization network. As a by-product, our amortization network can perform fast ensembling over kernel structures. In our experiments, we show drastically reduced inference time combined with competitive test performance for a large set of kernels and datasets.
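The core idea of the abstract, a single network that maps a (dataset, kernel structure) pair to kernel hyperparameters so that no per-dataset optimization is needed, can be illustrated with a minimal PyTorch-style sketch. Everything below (the DeepSets-style dataset encoder, the bag-of-base-kernels structure encoding, and the three RBF hyperparameters) is an illustrative assumption, not the architecture or kernel grammar used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical base-kernel vocabulary; the paper's actual kernel grammar may differ.
BASE_KERNELS = ["RBF", "LINEAR", "PERIODIC"]

class AmortizationNet(nn.Module):
    """Maps a (dataset, kernel-structure) pair to kernel hyperparameters.

    Dataset encoder: a DeepSets-style mean over per-point embeddings.
    Structure encoder: a bag-of-base-kernels vector.
    (Illustrative sketch only, not the paper's architecture.)
    """
    def __init__(self, x_dim=1, hidden=64, n_hypers=3):
        super().__init__()
        self.point_enc = nn.Sequential(
            nn.Linear(x_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        self.head = nn.Sequential(
            nn.Linear(hidden + len(BASE_KERNELS), hidden), nn.ReLU(),
            nn.Linear(hidden, n_hypers),
        )

    def forward(self, X, y, structure_vec):
        # X: (n, x_dim), y: (n,), structure_vec: (len(BASE_KERNELS),)
        data_emb = self.point_enc(torch.cat([X, y[:, None]], dim=-1)).mean(0)
        raw = self.head(torch.cat([data_emb, structure_vec], dim=-1))
        # Positive hyperparameters, here (lengthscale, signal variance, noise variance).
        return F.softplus(raw)

def rbf_gp_predict(X, y, Xs, lengthscale, sig_var, noise_var):
    """Standard GP regression predictive mean with an RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sig_var * torch.exp(-0.5 * d2 / lengthscale ** 2)
    K = k(X, X) + noise_var * torch.eye(len(X))
    return k(Xs, X) @ torch.linalg.solve(K, y)

# Usage: one forward pass replaces iterative marginal-likelihood optimization.
# In practice the network would be pre-trained over many (dataset, structure)
# pairs; here it is untrained and only illustrates the interface.
net = AmortizationNet()
X = torch.linspace(0, 1, 20)[:, None]
y = torch.sin(6 * X).squeeze(-1) + 0.1 * torch.randn(20)
structure = torch.tensor([1.0, 0.0, 0.0])   # "RBF" entry of the hypothetical vocabulary
ls, sv, nv = net(X, y, structure)           # amortized hyperparameters
mean = rbf_gp_predict(X, y, torch.linspace(0, 1, 50)[:, None], ls, sv, nv)
```

At prediction time the only cost is a forward pass through the network plus the usual GP solve, which is what yields the speed-up over per-dataset hyperparameter optimization described in the abstract.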

Related research

06/12/2018
Differentiable Compositional Kernel Learning for Gaussian Processes
The generalization properties of Gaussian processes depend heavily on th...

11/06/2015
Deep Kernel Learning
We introduce scalable deep kernels, which combine the structural propert...

02/24/2018
Product Kernel Interpolation for Scalable Gaussian Processes
Recent work shows that inference for Gaussian processes can be performed...

05/24/2021
Adaptive Local Kernels Formulation of Mutual Information with Application to Active Post-Seismic Building Damage Inference
The abundance of training data is not guaranteed in various supervised l...

08/15/2023
Graph-Structured Kernel Design for Power Flow Learning using Gaussian Processes
This paper presents a physics-inspired graph-structured kernel designed ...

10/16/2018
Prediction of Atomization Energy Using Graph Kernel and Active Learning
Data-driven prediction of molecular properties presents unique challenge...

04/06/2020
On Negative Transfer and Structure of Latent Functions in Multi-output Gaussian Processes
The multi-output Gaussian process (MGP) is based on the assumption that ...
