Hierarchical Non-Stationary Temporal Gaussian Processes With L^1-Regularization

05/20/2021
by   Zheng Zhao, et al.

This paper is concerned with regularized extensions of hierarchical non-stationary temporal Gaussian processes (NSGPs), in which the parameters (e.g., the length-scale) are themselves modeled as GPs. In particular, we consider two commonly used NSGP constructions, based on explicitly constructed non-stationary covariance functions and on stochastic differential equations, respectively. We extend these NSGPs by placing L^1-regularization on the processes in order to induce sparseness. To solve the resulting regularized NSGP (R-NSGP) regression problem, we develop a method based on the alternating direction method of multipliers (ADMM) and analyze its convergence properties theoretically. We also evaluate the performance of the proposed methods on simulated and real-world datasets.
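To give a feel for the ADMM machinery the abstract refers to, the sketch below applies ADMM with an L^1 penalty to a plain least-squares problem (a lasso). This is only an illustration of the splitting-plus-soft-thresholding pattern, not the paper's R-NSGP objective; the function name `admm_lasso` and all parameter choices are ours.

```python
import numpy as np

def admm_lasso(X, y, lam=0.1, rho=1.0, n_iter=200):
    """Solve min_w 0.5*||X w - y||^2 + lam*||w||_1 via ADMM.

    Illustrative only: the paper applies ADMM to the full R-NSGP
    regression objective, not to this plain lasso problem.
    """
    n, d = X.shape
    w = np.zeros(d)   # primal variable (smooth part)
    z = np.zeros(d)   # split copy carrying the L1 term
    u = np.zeros(d)   # scaled dual variable
    # Cache the matrix used in every quadratic (w-) update.
    A = X.T @ X + rho * np.eye(d)
    Xty = X.T @ y
    for _ in range(n_iter):
        # w-update: a ridge-like linear solve
        w = np.linalg.solve(A, Xty + rho * (z - u))
        # z-update: soft-thresholding, the prox of lam*||.||_1
        v = w + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent on the consensus constraint w = z
        u = u + w - z
    return z
```

The soft-thresholding step is what produces exact zeros in the estimate, which is the sparseness-inducing effect the L^1 penalty is introduced for.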

