Fast variable selection makes Karhunen-Loève decomposed Gaussian process BSS-ANOVA a speedy and accurate choice for dynamic systems identification

05/26/2022
by David S. Mebane, et al.

Many approaches to scalable Gaussian processes (GPs) focus on using a subset of the data as inducing points. Another promising approach is the Karhunen-Loève (KL) decomposition, in which the GP kernel is represented by a set of basis functions that are the eigenfunctions of the kernel operator. Such kernels have the potential to be very fast and do not depend on the selection of a reduced set of inducing points. However, KL decompositions lead to high dimensionality, and variable selection thus becomes paramount. This paper reports a new method of forward variable selection, enabled by the ordered nature of the basis functions in the KL expansion of the Bayesian Smoothing Spline ANOVA (BSS-ANOVA) kernel, coupled with fast Gibbs sampling in a fully Bayesian approach. The new algorithm determines how high the orders of included terms should reach, balancing model fidelity against model complexity using the L^0 penalties inherent in the Bayesian and Akaike information criteria. The inference speed and accuracy make the method especially useful for modeling dynamic systems: the derivative in a dynamic system is modeled as a static problem, and the learned dynamics are then integrated using a high-order scheme. The methods are demonstrated on two dynamic datasets: a `Susceptible, Infected, Recovered' (SIR) toy problem, with the transmissibility used as the forcing function, and the experimental `Cascaded Tanks' benchmark dataset. Comparisons on the static prediction of derivatives are made with a random forest (RF), a residual neural network (ResNet), and the Orthogonal Additive Kernel (OAK) inducing-points scalable GP, while for the time-series prediction comparisons are made with LSTM and GRU recurrent neural networks (RNNs).
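The dynamic-systems workflow described above — learn the derivative as a static regression problem, then integrate the learned dynamics with a high-order scheme — can be sketched as follows. This is a minimal illustration only: it uses an ordinary least-squares fit on hand-picked basis features as a stand-in for the BSS-ANOVA GP (the paper's actual regressor), and the SIR system with a fixed transmissibility `beta` as the toy dynamics; all function names and parameter values here are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.integrate import solve_ivp

# --- Step 1: generate static training data (state, forcing) -> derivative.
# True SIR derivative for S is dS/dt = -beta * S * I; we sample random
# states and forcings rather than integrating a trajectory.
rng = np.random.default_rng(0)
n = 500
S = rng.uniform(0.0, 1.0, n)
I = rng.uniform(0.0, 1.0, n)
beta = rng.uniform(0.1, 0.5, n)          # transmissibility as forcing
dS = -beta * S * I                        # static regression target

# --- Step 2: fit a static regressor for the derivative.
# Simple interaction features stand in for the KL/BSS-ANOVA basis terms.
Phi = np.column_stack([np.ones(n), S, I, beta, S * I, beta * S * I])
w, *_ = np.linalg.lstsq(Phi, dS, rcond=None)

# --- Step 3: integrate the learned dynamics with a high-order scheme
# (RK45 here; the paper uses a high-order integrator on the learned model).
def learned_rhs(t, y, b=0.3, gamma=0.1):
    s, i, r = y
    phi = np.array([1.0, s, i, b, s * i, b * s * i])
    ds = phi @ w                          # learned dS/dt
    di = -ds - gamma * i                  # dI/dt from dS + dI + dR = 0
    return [ds, di, gamma * i]            # dR/dt = gamma * I

sol = solve_ivp(learned_rhs, (0.0, 50.0), [0.99, 0.01, 0.0], method="RK45")
```

Because the derivative model is trained on scattered (state, forcing) points rather than on trajectories, a single static fit can be rolled out from any initial condition — the property the abstract exploits for fast time-series prediction.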


