State Space representation of non-stationary Gaussian Processes

01/07/2016
by Alessio Benavoli et al.

The state-space (SS) representation of Gaussian processes (GPs) has recently gained a lot of interest. The main reason is that it allows GP-based inference to be computed in O(n) time, where n is the number of observations, which makes GPs suitable for big data. For this reason, it is important to provide an SS representation of the most important kernels used in machine learning. The aim of this paper is to show how to exploit the transient behaviour of SS models to map non-stationary kernels to SS models.
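To make the O(n) claim concrete, below is a minimal sketch (not the authors' code) of state-space GP regression via Kalman filtering, using the Ornstein-Uhlenbeck (Matern-1/2) kernel k(t, t') = s2 * exp(-|t - t'| / ell), whose SS form is a first-order linear SDE. The parameter names (ell, s2, noise_var) and the stationary_init switch are illustrative assumptions; the transient flavour (zero initial variance) is one simple instance of the kind of non-stationary kernel the paper obtains from SS transients.

import numpy as np

def kalman_gp_loglik(t, y, ell=1.0, s2=1.0, noise_var=0.1,
                     stationary_init=True):
    """Log marginal likelihood of a 1-D GP in O(n) via Kalman filtering.

    With stationary_init=True the filter starts from the stationary
    variance s2, recovering the stationary OU kernel. With
    stationary_init=False (zero variance at t = 0, where the process is
    assumed to start) the *transient* of the same SS model yields a
    non-stationary kernel,
        k(s, t) = s2 * (exp(-|t - s| / ell) - exp(-(t + s) / ell)).
    """
    order = np.argsort(t)
    t, y = np.asarray(t)[order], np.asarray(y)[order]
    m = 0.0
    P = s2 if stationary_init else 0.0
    ll, t_prev = 0.0, 0.0
    for ti, yi in zip(t, y):
        # Predict: exact discretisation of the OU SDE over dt = ti - t_prev.
        a = np.exp(-(ti - t_prev) / ell)
        m = a * m
        P = a * a * P + s2 * (1.0 - a * a)
        # Update: scalar Kalman step with Gaussian observation noise.
        S = P + noise_var                    # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (yi - m) ** 2 / S)
        K = P / S                            # Kalman gain
        m, P = m + K * (yi - m), (1.0 - K) * P
        t_prev = ti
    return ll

# Example usage (synthetic data):
# t = np.sort(np.random.rand(1000))
# y = np.sin(6 * t) + 0.3 * np.random.randn(1000)
# print(kalman_gp_loglik(t, y))

Each observation costs a constant amount of work, so the total cost is linear in n; the equivalent dense-GP computation would need an O(n^3) Cholesky factorisation of the n x n kernel matrix.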


Related research

05/30/2023 | Non-stationary Gaussian Process Surrogates
We provide a survey of non-stationary surrogate models which utilize Gau...

05/24/2019 | Sequential Gaussian Processes for Online Learning of Nonstationary Functions
Many machine learning problems can be framed in the context of estimatin...

03/18/2021 | Data-Driven Wireless Communication Using Gaussian Processes
Data-driven paradigms are well-known and salient demands of future wirel...

03/11/2020 | General linear-time inference for Gaussian Processes on one dimension
Gaussian Processes (GPs) provide a powerful probabilistic framework for ...

07/17/2018 | Mixed-Stationary Gaussian Process for Flexible Non-Stationary Modeling of Spatial Outcomes
Gaussian processes (GPs) are commonplace in spatial statistics. Although...

05/27/2019 | Interpretable deep Gaussian processes
We propose interpretable deep Gaussian Processes (GPs) that combine the ...

11/18/2022 | Active Learning with Convolutional Gaussian Neural Processes for Environmental Sensor Placement
Deploying environmental measurement stations can be a costly and time-co...
