Rademacher complexity of stationary sequences

06/03/2011
by Daniel J. McDonald, et al.

We show how to control the generalization error of time series models in which past values of the outcome are used to predict future values. The results are based on a generalization of standard i.i.d. concentration inequalities to dependent data, without the mixing assumptions common in the time series setting. Both the proof and the resulting bounds are simpler than previous analyses of dependent data or stochastic adversaries, which rely on sequential Rademacher complexities rather than the expected Rademacher complexity used for i.i.d. processes. We also derive empirical Rademacher bounds without mixing assumptions, resulting in fully calculable upper bounds.
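For reference, the central quantity is the empirical Rademacher complexity. The following is its standard textbook definition, not notation lifted from the paper: given a sample z_1, ..., z_n and a function class F,

```latex
% Standard definition: sigma_1, ..., sigma_n are i.i.d. uniform {-1, +1} signs.
\[
  \widehat{\mathfrak{R}}_n(\mathcal{F})
  = \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}}
      \frac{1}{n}\sum_{i=1}^{n} \sigma_i\, f(z_i)\right]
\]
```

Because this quantity depends only on the observed sample, it can be estimated by Monte Carlo even when the data are dependent, which is what makes such bounds "fully calculable." The sketch below is illustrative only: the finite grid of linear autoregressive predictors, the AR(1) data, and every name in it are assumptions chosen for demonstration, not the paper's construction.

```python
import numpy as np

def empirical_rademacher(values_per_hypothesis, n_draws=1000, seed=0):
    """Monte Carlo estimate of empirical Rademacher complexity.

    values_per_hypothesis: (H, n) array; row h holds f_h(z_1), ..., f_h(z_n)
    for a finite grid of hypotheses (an assumption made so the supremum
    is computable by a max over rows).
    """
    rng = np.random.default_rng(seed)
    H, n = values_per_hypothesis.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)      # i.i.d. Rademacher signs
        # sup over the (finite) hypothesis grid of the signed empirical mean
        total += np.max(values_per_hypothesis @ sigma) / n
    return total / n_draws

# Illustrative usage: a stationary AR(1) sample and a grid of linear predictors.
rng = np.random.default_rng(1)
n = 200
y = np.zeros(n + 1)
for t in range(n):
    y[t + 1] = 0.6 * y[t] + rng.normal()             # stationary AR(1) draw
coefs = np.linspace(-1.0, 1.0, 41)                   # bounded coefficient grid
preds = np.outer(coefs, y[:-1])                      # f_h(z_t) = a_h * y_{t-1}
print(empirical_rademacher(preds))
```

Averaging over many sign draws approximates the expectation over sigma; note that no mixing coefficients enter the computation, only the realized sample.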

