A maximal inequality for local empirical processes under weak dependence

07/03/2023
by Luis Alvarez, et al.

We introduce a maximal inequality for a local empirical process under strongly mixing data. Local empirical processes are defined as the (local) averages 1/(nh) ∑_{i=1}^{n} 1{x - h ≤ X_i ≤ x + h} f(Z_i), where f belongs to a class of functions, x ∈ ℝ is an evaluation point, and h > 0 is a bandwidth. Our nonasymptotic bounds control the estimation error uniformly over the function class, the evaluation point x, and the bandwidth h. They are also general enough to accommodate function classes whose complexity increases with n. As an application, we consider function classes that exhibit polynomial decay in their uniform covering numbers. When specialized to the problem of kernel density estimation, our bounds reveal that, under weak dependence with exponential decay, these estimators achieve the same (up to a logarithmic factor) sharp uniform-in-bandwidth rates derived in the iid setting by <cit.>.
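To make the definition concrete, the following is a minimal numerical sketch (illustrative only, not the authors' code) that evaluates the local average over a grid of evaluation points x and bandwidths h. The simulated data, the choice f(z) = z, and the helper name local_empirical_process are assumptions made for illustration; with the box kernel, f ≡ 1/2, the same quantity reduces to the kernel density estimator mentioned in the application.

```python
# A minimal sketch of the local empirical process
#     (1/(n*h)) * sum_{i=1}^{n} 1{x - h <= X_i <= x + h} * f(Z_i),
# evaluated over a grid of points x and bandwidths h. The data-generating
# process and the choice of f below are illustrative assumptions.
import numpy as np

def local_empirical_process(X, Z, f, x, h):
    """Local average (1/(n*h)) * sum_i 1{x - h <= X_i <= x + h} * f(Z_i)."""
    n = len(X)
    in_window = (X >= x - h) & (X <= x + h)   # indicator 1{x - h <= X_i <= x + h}
    return f(Z)[in_window].sum() / (n * h)

rng = np.random.default_rng(0)
n = 1_000
X = rng.standard_normal(n)                    # variable that is localized around x
Z = X + 0.5 * rng.standard_normal(n)          # auxiliary variable fed to f

# Uniform-in-bandwidth evaluation: sweep both x and h, the two quantities
# over which the paper's bounds are uniform.
for h in (0.1, 0.2, 0.4):
    for x in np.linspace(-2.0, 2.0, 5):
        val = local_empirical_process(X, Z, lambda z: z, x, h)
        print(f"h={h:.1f}, x={x:+.1f}: {val:+.4f}")

# Kernel density estimation is recovered with the box kernel, i.e. f ≡ 1/2:
#     p_hat_h(x) = (1/(n*h)) * sum_i (1/2) * 1{|x - X_i| <= h}
kde_at_zero = local_empirical_process(X, Z, lambda z: np.full_like(z, 0.5), 0.0, 0.2)
print("box-kernel KDE at x = 0:", kde_at_zero)
```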


