Nyström Regularization for Time Series Forecasting

11/13/2021
by Zirui Sun, et al.

This paper studies the learning rates of Nyström regularization with sequential sub-sampling for τ-mixing time series. Using a recently developed Banach-valued Bernstein inequality for τ-mixing sequences and an integral-operator approach based on second-order decomposition, we derive almost optimal learning rates for Nyström regularization with sequential sub-sampling on τ-mixing time series. A series of numerical experiments verifies these theoretical results and demonstrates the excellent performance of Nyström regularization with sequential sub-sampling in learning massive time series data. Together, these results extend the applicable range of Nyström regularization from i.i.d. samples to non-i.i.d. sequences.
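Since the abstract only names the estimator, a minimal sketch may help fix ideas. The snippet below implements Nyström-regularized kernel ridge regression in which the m landmark points are the first m observations of the series (sequential sub-sampling) rather than a uniform random subset. The Gaussian kernel, the regularization value, the AR(1) toy series, and the helper names gaussian_kernel and nystrom_krr_sequential are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    # (Illustrative kernel choice; the paper's kernel may differ.)
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_krr_sequential(X, y, m, lam, gamma=1.0):
    """Sketch of Nystrom-regularized kernel ridge regression.

    Sequential sub-sampling: the m landmarks are the first m points in
    temporal order, so the sub-sample is a contiguous block of the
    dependent sequence instead of a uniform random subset.
    """
    X_m = X[:m]                              # sequential landmarks
    K_nm = gaussian_kernel(X, X_m, gamma)    # n x m cross-kernel
    K_mm = gaussian_kernel(X_m, X_m, gamma)  # m x m landmark kernel
    n = len(X)
    # Standard Nystrom KRR normal equations:
    # (K_nm^T K_nm + lam * n * K_mm) alpha = K_nm^T y.
    A = K_nm.T @ K_nm + lam * n * K_mm
    alpha = np.linalg.solve(A + 1e-10 * np.eye(m), K_nm.T @ y)
    return lambda X_new: gaussian_kernel(X_new, X_m, gamma) @ alpha

# Toy one-step-ahead forecasting on an AR(1)-like series (synthetic data).
rng = np.random.default_rng(0)
s = np.zeros(500)
for t in range(1, 500):
    s[t] = 0.8 * s[t - 1] + 0.1 * rng.standard_normal()
X, y = s[:-1, None], s[1:]
predict = nystrom_krr_sequential(X, y, m=50, lam=1e-3)
print("train MSE:", np.mean((predict(X) - y) ** 2))
```

Plausibly, taking a contiguous block of the series as landmarks is what distinguishes sequential sub-sampling from the uniform sub-sampling used in the i.i.d. setting; the paper's τ-mixing analysis concerns this sequential variant.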

Related research

02/10/2020 · Distributed Learning with Dependent Samples
This paper focuses on learning rate analysis of distributed kernel ridge...

02/12/2016 · Lasso Guarantees for Time Series Estimation Under Subgaussian Tails and β-Mixing
Many theoretical results on estimation of high dimensional time series r...

11/25/2019 · A Note on Mixing in High Dimensional Time Series
Various mixing conditions have been imposed on high dimensional time ser...

05/25/2017 · Neural Decomposition of Time-Series Data for Effective Generalization
We present a neural network technique for the analysis and extrapolation...

05/09/2014 · Training Deep Fourier Neural Networks To Fit Time-Series Data
We present a method for training a deep neural network containing sinuso...

05/03/2015 · Optimal Time-Series Motifs
Motifs are the most repetitive/frequent patterns of a time-series. The d...

01/29/2016 · Kernels for sequentially ordered data
We present a novel framework for kernel learning with sequential data of...
