Generalization bounds for nonparametric regression with β-mixing samples

08/02/2021
by David Barrera, et al.

In this paper we present a series of results that permit extending, in a direct manner, uniform deviation inequalities for the empirical process from the independent to the dependent case, characterizing the additional error in terms of the β-mixing coefficients associated with the training sample. We then apply these results to previously obtained inequalities for independent samples concerning the deviation of the least-squares error in nonparametric regression, deriving corresponding generalization bounds for regression schemes in which the training sample may not be independent. These results provide a framework for analyzing the error of regression schemes whose training sample comes from a large class of β-mixing sequences, including geometrically ergodic Markov samples, using only the independent case. More generally, they permit a meaningful extension of the Vapnik-Chervonenkis and similar theories for independent training samples to this class of β-mixing samples.
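The general strategy of carrying independent-sample deviation inequalities over to β-mixing data rests on a blocking construction (in the spirit of Yu's independent-block technique): the dependent sample is cut into consecutive blocks, and the odd-numbered blocks behave like an i.i.d. sample up to a total-variation error controlled by the β-mixing coefficient of the block length. The sketch below is purely illustrative, assuming a toy AR(1) sampler and a fixed block length; it is not the paper's construction.

```python
import numpy as np

def ar1_sample(n, phi=0.5, seed=0):
    """Simulate an AR(1) chain X_t = phi * X_{t-1} + eps_t.
    For |phi| < 1 the chain is geometrically ergodic, hence
    geometrically beta-mixing."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def odd_blocks(n, a):
    """Partition {0, ..., n-1} into consecutive blocks of length a
    and return the odd-numbered (1st, 3rd, ...) blocks.  In the
    blocking argument these blocks can be compared with independent
    copies at a total-variation cost of order (n / a) * beta(a)."""
    blocks = [np.arange(i, min(i + a, n)) for i in range(0, n, a)]
    return [b for k, b in enumerate(blocks) if k % 2 == 0]

x = ar1_sample(1000)
blocks = odd_blocks(len(x), a=50)   # 20 blocks in total, 10 odd-numbered
sub = np.concatenate(blocks)        # surrogate "independent" sample
```

Uniform deviation bounds proved for independent samples can then be applied to the surrogate sample, with β(a) quantifying the additional error incurred by the dependence.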


Related research:

- Bernstein-type Inequalities and Nonparametric Estimation under Near-Epoch Dependence (08/24/2022): The major contributions of this paper lie in two aspects. Firstly, we fo...
- Rademacher complexity of stationary sequences (06/03/2011): We show how to control the generalization error of time series models wh...
- The Bahadur representation for sample quantiles under dependent sequence (01/14/2019): On the one hand, we investigate the Bahadur representation for sample qu...
- Novel Deviation Bounds for Mixture of Independent Bernoulli Variables with Application to the Missing Mass (02/25/2014): In this paper, we are concerned with obtaining distribution-free concent...
- Confidence intervals for nonparametric regression (03/20/2022): We demonstrate and discuss nonasymptotic bounds in probability for the c...
- Rademacher Generalization Bounds for Classifier Chains (07/26/2018): In this paper, we propose a new framework to study the generalization pr...
- Data Interpolating Prediction: Alternative Interpretation of Mixup (06/20/2019): Data augmentation by mixing samples, such as Mixup, has widely been used...
