Linear Multiple Low-Rank Kernel Based Stationary Gaussian Processes Regression for Time Series

04/21/2019
by   Feng Yin, et al.

Gaussian processes (GP) for machine learning have been studied systematically over the past two decades, and they are by now widely used in a number of diverse applications. However, GP kernel design and the associated hyper-parameter optimization are still hard and, to a large extent, open problems. In this paper, we consider the task of GP regression for time series modeling and analysis. The underlying stationary kernel can be approximated arbitrarily closely by a newly proposed grid spectral mixture (GSM) kernel, which turns out to be a linear combination of low-rank sub-kernels. In the case where a large number of sub-kernels are used, either the Nyström or the random Fourier feature approximation can be adopted to deal efficiently with the computational demands. The unknown GP hyper-parameters consist of the non-negative weights of all sub-kernels as well as the noise variance; their estimation is performed within the maximum-likelihood (ML) framework. Two efficient numerical optimization methods for estimating the unknown hyper-parameters are derived: a sequential majorization-minimization (MM) method and a non-linearly constrained alternating direction method of multipliers (ADMM). The MM method matches perfectly with the proven low-rank property of the proposed GSM sub-kernels and turns out to be an efficient and numerically stable solver, while the ADMM has the potential to reach a better local minimum in terms of the test MSE. Experimental results, based on various classic time series data sets, corroborate that the proposed GSM kernel-based GP regression model outperforms several salient competitors of similar kind in terms of prediction mean squared error and numerical stability.
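To illustrate the general idea of a kernel built as a non-negative weighted sum of fixed stationary sub-kernels on a frequency grid, with the weights and noise variance fitted by maximum likelihood, here is a minimal toy sketch. It is not the paper's GSM construction or its MM/ADMM solvers; the sub-kernel form (a Gaussian envelope times a cosine, i.e., a standard spectral-mixture component), the grid, and all names are illustrative assumptions, and the positivity constraints are handled naively via a log-space parameterization with a generic L-BFGS-B optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def sub_kernel(tau, mu, sigma2=0.5):
    # Stationary spectral-mixture component: Gaussian envelope times cosine
    # at grid frequency mu (a valid PSD kernel for any fixed mu, sigma2 > 0).
    return np.exp(-2.0 * np.pi**2 * sigma2 * tau**2) * np.cos(2.0 * np.pi * mu * tau)

def grid_kernel(x1, x2, weights, freqs):
    # Linear combination of fixed sub-kernels; non-negative weights keep it PSD.
    tau = x1[:, None] - x2[None, :]
    K = np.zeros((len(x1), len(x2)))
    for w, mu in zip(weights, freqs):
        K += w * sub_kernel(tau, mu)
    return K

def neg_log_marginal_likelihood(log_params, x, y, freqs):
    # Hyper-parameters: sub-kernel weights and noise variance, optimized in
    # log space so that positivity holds by construction.
    params = np.exp(log_params)
    weights, noise_var = params[:-1], params[-1]
    n = len(x)
    K = grid_kernel(x, x, weights, freqs) + (noise_var + 1e-8) * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 60)
y = np.sin(2.0 * np.pi * 1.0 * x) + 0.1 * rng.standard_normal(len(x))

freqs = np.linspace(0.1, 3.0, 8)             # fixed frequency grid
init = np.log(np.full(len(freqs) + 1, 0.1))  # initial weights + noise variance
res = minimize(neg_log_marginal_likelihood, init, args=(x, y, freqs),
               method="L-BFGS-B")
weights = np.exp(res.x[:-1])
print("dominant grid frequency:", freqs[np.argmax(weights)])
```

With a sinusoidal signal, the fitted weight vector should concentrate near the grid frequency closest to the true one, which is the intuition behind learning a spectral representation through the kernel weights alone.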

