Strongly consistent autoregressive predictors in abstract Banach spaces

08/05/2018
by   MD Ruiz-Medina, et al.
University of Granada

This work derives new results on strongly consistent estimation and prediction for autoregressive processes of order 1 in a separable Banach space $B$. The consistency results are obtained for the component-wise estimator of the autocorrelation operator in the norm of the space $\mathcal{L}(B)$ of bounded linear operators on $B$. The strong consistency of the associated plug-in predictor then follows in the $B$-norm. A Gelfand triple is defined through the Hilbert space constructed in the lemma of Kuelbs (1970). A Hilbert--Schmidt embedding introduces the Reproducing Kernel Hilbert Space (RKHS), generated by the autocovariance operator, into the Hilbert space forming the rigged Hilbert space structure. This paper extends the work of Bosq (2000) and Labbas and Mourid (2002).


1 Introduction

In the last few decades, there has been a growing interest in the statistical analysis of high-dimensional data from the Functional Data Analysis (FDA) perspective. The book by Ramsay and Silverman RamsaySilverman05 provides an overview of FDA techniques, adapted from the multivariate data context or specifically formulated for the FDA framework. The monograph by Hsing and Eubank HsingEubank15 introduces functional analytical tools useful for the estimation of random elements in function spaces. The book by Horváth and Kokoszka HorvathandKokoszka is mainly concerned with inference based on second-order statistics; a central topic in this book is the analysis of functional data exhibiting dependent structures in time and space. The methodological survey paper by Cuevas Cuevas14 discusses central topics in FDA. Recent advances in the statistical analysis of high-dimensional data from the parametric, semiparametric and nonparametric FDA viewpoints are collected in the JMVA Special Issue by Goia and Vieu GoiaVieu16.

Linear time series models traditionally arise for processing linearly correlated temporal data. In the FDA context, Bosq's monograph Bosq00 introduces linear functional time series theory. The RKHS generated by the autocovariance operator plays a crucial role in the estimation approach presented therein. In particular, the eigenvectors of the autocovariance operator are considered for projection; see also Alvarez17. When these eigenvectors are unknown, their empirical versions are computed. The resulting plug-in predictor is obtained as a linear functional of the observations, based on the empirical approximation of the autocorrelation operator. This approach exploits the Hilbert space structure; its extension to the metric space context, and in particular to the Banach space context, relies on a relationship (continuous embeddings) between the Banach space norm and the RKHS norm induced by the extended autocovariance operator. This is in contrast with the nonparametric regression approach, where semi-metric spaces are usually considered; see, e.g., FerratyKeilegom12, where asymptotic normality is derived in the regression model with functional response and covariates. In particular, a linear combination of the observed response values is considered in the nonparametric local-weighting-based approach, where the weights are defined from an isotropic kernel depending on the metric or semi-metric of the space in which the regressors take their values; see, e.g., Ferraty06, and, in particular, Ferraty02 in the functional time series framework. However, the more flexible nonparametric approach also presents some computational drawbacks, requiring the resolution of several selection problems: for instance, the smoothing parameter and the kernel involved in the definition of the weights must be chosen. Real-valued covariates were incorporated in the semiparametric kernel-based proposal by Aneiros-Pérez and Vieu Aneiros08, which involves an extension to the functional partial linear time series framework; see also AneirosVieu06 on semi-functional partial linear regression. Goia and Vieu GoiaVieu15 also adopted a semiparametric approach in their formulation of a two-term Partitioned Functional Single Index Model. Geenens Geenens11 exploited the alternative provided by semi-metrics to avoid the so-called curse of dimensionality.

In a parametric linear framework, functional time series models in Banach spaces were introduced in MasPumo10. Strong mixing conditions and the absolute regularity of Banach-valued autoregressive processes were studied in Allam11. Empirical estimators for Banach-valued autoregressive processes were discussed in Bosq02 where, under some regularity conditions, and for the case of orthogonal innovations, the empirical mean was shown to be asymptotically optimal with respect to almost sure convergence and convergence of order. The empirical autocovariance operator was also interpreted as a sample mean of an autoregressive process in a suitable space of linear operators. The extension of these results to the case of weakly dependent innovations was obtained in Dehling05. A strongly consistent sieve estimator of the autocorrelation operator of a Banach-valued autoregressive process was considered in RachediMourid03. Limit theorems for a seasonality estimator were given in Mourid02 in the case of Banach autoregressive perturbations; confidence regions were also obtained for the seasonality function in the Banach space of continuous functions. An approximation of Parzen's optimal predictor in the RKHS framework was used in MokhtariMouri to predict temporal stochastic processes in Banach spaces. The existence and uniqueness of an almost surely strictly periodically correlated solution to the first-order autoregressive model in Banach spaces was derived in Parvardeha. Under some regularity conditions, limit results were obtained in Hajj11 for AR(1) processes in the Skorokhod space $D([0,1])$ of right-continuous functions on $[0,1]$ having left limits at all points. Conditions for the existence of strictly stationary solutions of ARMA equations in Banach spaces with independent and identically distributed noise innovations can be found in Spangenberg13.

In deriving strong consistency results for ARB(1) component-wise estimators and predictors, Bosq Bosq00 restricted his attention to the case of the Banach space $C([0,1])$ of continuous functions on $[0,1]$ with the supremum norm. Labbas and Mourid Labbas02 considered an ARB(1) context, for an arbitrary real separable Banach space $B$, under the construction of a Hilbert space $H$ where $B$ is continuously embedded, as in Kuelbs' lemma Kuelbs70. Assuming the existence of a continuous extension to $H$ of the autocorrelation operator $\rho$, they proved the strong consistency of the formulated component-wise estimator of $\rho$ and of its associated plug-in predictor.

The linear time series framework in Banach spaces studied here is motivated by the statistical analysis of temporally correlated functional data in nuclear spaces, arising notably in the observation of the solution to stochastic differential or fractional pseudodifferential equations; see, e.g., Anh16a; Anh16b. The scales of Banach spaces constituted by fractional Sobolev and Besov spaces play a central role in the context of nuclear spaces. Continuous (nuclear) embeddings usually connect the elements of these scales; see, e.g., Triebel83. In this paper, a rigged-Hilbert-space structure is defined, involving the separable Hilbert space $H$ appearing in the construction of Kuelbs' lemma Kuelbs70. A key assumption here is the existence of a continuous (Hilbert–Schmidt class) embedding introducing the RKHS, associated with the extended autocovariance operator of the ARB(1) process, into the Hilbert space generating the Gelfand triple, equipped with a finer topology than the $B$-topology. Under this scenario, strong consistency results are derived in the norm of the space $\mathcal{L}(B)$ of bounded linear operators on $B$, considering an abstract separable Banach space framework.

This paper is structured as follows. Background material and notation are first given in Section 2. Section 3 states the basic assumptions and key lemmas which are then proved in Section 4. This paper’s main strong consistency result is derived in Section 5, and examples are presented in Section 6. Closing comments are in Section 7. The results are illustrated numerically in an Online Supplement under the scenario described in Section 6.

2 Preliminaries

Let $B$ be a real separable Banach space, equipped with the norm $\|\cdot\|_B$, and let $L^2_B(\Omega, \mathcal{A}, P)$ be the space of zero-mean $B$-valued random variables $X$ such that

$E\left[\|X\|_B^2\right] < \infty.$

Let $X = \{X_n,\ n \in \mathbb{Z}\}$ be a zero-mean $B$-valued stochastic process on the probability space $(\Omega, \mathcal{A}, P)$ such that, for all $n \in \mathbb{Z}$,

(1) $X_n = \rho(X_{n-1}) + \varepsilon_n,$

where $\rho \in \mathcal{L}(B)$ denotes the autocorrelation operator of $X$; see Bosq00. In Eq. (1), the $B$-valued innovation process $\varepsilon = \{\varepsilon_n,\ n \in \mathbb{Z}\}$ on $(\Omega, \mathcal{A}, P)$ is assumed to be a strong white noise, uncorrelated with the random initial condition. Thus, $\varepsilon$ is a zero-mean Banach-valued stationary process with iid components and $E\left[\|\varepsilon_n\|_B^2\right] = \sigma^2 < \infty$ for all $n \in \mathbb{Z}$.

Assume that there exists an integer $k_0 \geq 1$ such that

(2) $\|\rho^{k_0}\|_{\mathcal{L}(B)} < 1.$

Then Eq. (1) admits a unique strictly stationary solution with $E[\|X_n\|_B^2] < \infty$, i.e., belonging to $L^2_B(\Omega, \mathcal{A}, P)$, given by $X_n = \sum_{k=0}^{\infty} \rho^k(\varepsilon_{n-k})$ for each $n \in \mathbb{Z}$; see Bosq00. Under (2), the autocovariance operator $C$ of an ARB(1) process is defined from the autocovariance operator of $X_0$ as $C(x^{*}) = E\left[x^{*}(X_0)\, X_0\right]$ for all $x^{*} \in B^{*}$, and the cross-covariance operator $D$ is given by $D(x^{*}) = E\left[x^{*}(X_0)\, X_1\right]$ for all $x^{*} \in B^{*}$. Since $C$ is assumed to be a nuclear operator, then, as per Eq. (6.24) on p. 156 of Bosq00, there exists a sequence $(x_j)_{j \geq 1}$ in $B$, with $\sum_{j \geq 1} \|x_j\|_B^2 < \infty$, such that, for every $x^{*} \in B^{*}$, $C(x^{*}) = \sum_{j \geq 1} x^{*}(x_j)\, x_j$.
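The existence argument behind this stationary solution is standard and worth recalling; the following sketch uses the notation of Bosq (2000) ($\rho$, $k_0$, $\varepsilon_n$) and is a reconstruction under that convention, not a verbatim excerpt:

```latex
% Why \|\rho^{k_0}\|_{\mathcal{L}(B)} < 1 yields a stationary solution:
% writing k = q k_0 + r with 0 \le r < k_0, submultiplicativity gives
\[
  \|\rho^{k}\|_{\mathcal{L}(B)}
  \;\le\; \|\rho^{k_0}\|_{\mathcal{L}(B)}^{\,q}
  \max_{0 \le r < k_0} \|\rho^{r}\|_{\mathcal{L}(B)} ,
\]
% so \|\rho^{k}\|_{\mathcal{L}(B)} decays geometrically in k, and the series
\[
  X_n \;=\; \sum_{k=0}^{\infty} \rho^{k}(\varepsilon_{n-k})
\]
% converges in L^2_B and almost surely, giving the unique strictly
% stationary solution of X_n = \rho(X_{n-1}) + \varepsilon_n.
```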

If $D$ is also assumed to be a nuclear operator, then, as per Eq. (6.23) on p. 156 of Bosq00, there exist sequences $(y_j)_{j \geq 1}$ and $(z_j)_{j \geq 1}$ in $B$, with $\sum_{j \geq 1} \|y_j\|_B \|z_j\|_B < \infty$, such that, for every $x^{*} \in B^{*}$, $D(x^{*}) = \sum_{j \geq 1} x^{*}(y_j)\, z_j$.

From Eqs. (6.45) and (6.58) on pp. 164–168 of Bosq00, empirical estimators of $C$ and $D$ are respectively given, for all $x^{*} \in B^{*}$ and any integer $n \geq 2$, by

$C_n(x^{*}) = \frac{1}{n} \sum_{i=0}^{n-1} x^{*}(X_i)\, X_i, \qquad D_n(x^{*}) = \frac{1}{n-1} \sum_{i=0}^{n-2} x^{*}(X_i)\, X_{i+1}.$
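Although the paper works in an infinite-dimensional Banach space, the estimation scheme above can be illustrated with a finite-dimensional toy analogue. The sketch below is an assumption-laden illustration, not the paper's method: it uses a diagonal contraction on $\mathbb{R}^5$ as the autocorrelation operator and Gaussian innovations, simulates an AR(1) process, forms the empirical covariance and cross-covariance as sample means of tensor products, and computes the component-wise plug-in estimator $\hat{\rho}_n = D_n C_n^{-1}$ on the eigenbasis of $C_n$:

```python
import numpy as np

# Toy finite-dimensional analogue of ARB(1) estimation (illustrative only):
# X_n = rho(X_{n-1}) + eps_n in R^d. The dimension, spectral decay and
# noise level below are assumptions of this sketch, not from the paper.

rng = np.random.default_rng(0)
d, n = 5, 20000

# Diagonal contraction: operator norm 0.5 < 1, so condition (2) holds.
rho = 0.5 * np.diag(1.0 / np.arange(1, d + 1))

X = np.zeros((n, d))
for i in range(1, n):
    X[i] = rho @ X[i - 1] + rng.normal(scale=0.1, size=d)

# Empirical covariance and cross-covariance operators: sample means of the
# tensor products X_i (x) X_i and X_i (x) X_{i+1}. The start-up transient
# from X_0 = 0 is negligible at this sample size.
C_n = X[:-1].T @ X[:-1] / (n - 1)
D_n = X[1:].T @ X[:-1] / (n - 1)

# Component-wise inversion of C_n on its eigenbasis, truncating small
# eigenvalues (the analogue of projecting onto finitely many eigenvectors).
vals, vecs = np.linalg.eigh(C_n)
keep = vals > 1e-8
C_inv = (vecs[:, keep] / vals[keep]) @ vecs[:, keep].T
rho_hat = D_n @ C_inv

# Plug-in predictor of the next value given the last observation.
x_pred = rho_hat @ X[-1]

err = np.linalg.norm(rho_hat - rho, ord=2)
print("spectral-norm estimation error:", round(err, 3))
```

With 20,000 observations, the spectral-norm error of $\hat{\rho}_n$ is typically well below 0.1, a finite-dimensional caricature of the strong-consistency behaviour studied in the paper.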

Lemma 2.1 in Kuelbs70 , recalled just below, plays a key role in our approach.

Lemma 1.

If $B$ is a real separable Banach space with norm $\|\cdot\|_B$, then there exists an inner product $\langle \cdot, \cdot \rangle$ on $B$ such that the norm generated by $\langle \cdot, \cdot \rangle$ is weaker than $\|\cdot\|_B$. The completion of $B$ under this norm defines the Hilbert space $H$, where $B$ is continuously embedded.

Denote by $(x_n)_{n \geq 1}$ a dense sequence in $B$, and by $(F_n)_{n \geq 1}$ a sequence of bounded linear functionals on $B$ satisfying

(3) $F_n(x_n) = \|x_n\|_B, \qquad \|F_n\| = 1, \quad n \geq 1,$

such that, for all $x \in B$,

(4) $\|x\|_B = \sup_{n \geq 1} |F_n(x)|.$

The inner product and associated norm in Lemma 1 are defined, for all $x, y \in B$, by

$\langle x, y \rangle = \sum_{n=1}^{\infty} t_n F_n(x) F_n(y),$

while, for all $x \in B$,

(5) $\|x\|_H^2 = \sum_{n=1}^{\infty} t_n \left[F_n(x)\right]^2,$

where $(t_n)_{n \geq 1}$ is a sequence of positive numbers summing up to one.
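A concrete instance of Lemma 1 may help fix ideas; the following choice of $B$ and of the functionals $F_n$ is an illustration assumed here, not one used in the paper. Take $B = c_0$, the space of real sequences converging to zero with the supremum norm, and let $F_n$ be the coordinate functionals, $F_n(x) = x_n$, so that $\|x\|_{c_0} = \sup_n |F_n(x)|$:

```latex
% Kuelbs' construction for B = c_0 with coordinate functionals F_n:
\[
  \langle x, y \rangle \;=\; \sum_{n=1}^{\infty} t_n\, x_n y_n ,
  \qquad t_n > 0, \quad \sum_{n=1}^{\infty} t_n = 1 ,
\]
% so that the induced norm is weaker than the c_0-norm:
\[
  \|x\|_H^{2} \;=\; \sum_{n=1}^{\infty} t_n x_n^{2}
  \;\le\; \sup_{n \ge 1} |x_n|^{2} \;=\; \|x\|_{c_0}^{2} .
\]
% The completion H is the weighted sequence space
% \{ x : \sum_n t_n x_n^2 < \infty \}, into which c_0 is
% continuously embedded.
```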

3 Main assumptions and preliminary results

In view of Lemma 1, for every $n \in \mathbb{Z}$, $X_n$ admits almost surely the expansion

$X_n = \sum_{k=1}^{\infty} \langle X_n, e_k \rangle\, e_k,$

where $(e_k)_{k \geq 1}$ is any orthonormal basis of $H$. The trace autocovariance operator of the extended ARB(1) process is a trace operator on $H$, admitting a diagonal spectral representation in terms of its eigenvalues $(\lambda_k)_{k \geq 1}$ and eigenvectors $(\phi_k)_{k \geq 1}$, which provide an orthonormal system in $H$. In what follows, the following identities in $H$ will be considered for the extended version of the ARB(1) process. For arbitrary $n$,

(6)
(7)
(8)

where, for each integer $k \geq 1$, $(\phi_k)_{k \geq 1}$ is a complete orthonormal system in $H$, and

The following assumption plays a crucial role in the derivation of the main results in this paper.

Assumption A1.

$\|X_0\|_B$ is almost surely bounded, and the eigenspace associated with each eigenvalue $\lambda_k$ in (6) is one-dimensional for every integer $k \geq 1$.


Under Assumption A1, we can define the following quantities:

(9)
Remark 1.

This assumption can be relaxed to allow for multidimensional eigenspaces by redefining the quantities in (9) as in Lemma 4.4 of Bosq00.

Assumption A2. Let be such that a.s., and both and as .

Remark 2.

Consider

(10)

Then for sufficiently large , we have

Assumption A3. As ,


Assumption A4. The constants are such that the inclusion of into is continuous, i.e., where denotes the continuous embedding.

Let us consider the closed subspace of with the norm induced by the inner product defined as follows:

(11)

Then this subspace is continuously embedded into $H$, and the following remark provides the isometric isomorphism, established by the Riesz representation theorem, between this space and its dual.

Remark 3.

Let and such that, for every integer , consider and for certain Then the following identities hold:

Lemma 2.

Under Assumption A4, the following continuous embeddings hold:

(12)

where

Proof. Let us consider the following inequalities, valid for all such that :

(13)

Under Assumption A4 (see also Remark 3), for every , we have

(14)

From Eqs. (13) and (14), the inclusions in (12) are continuous. Thus the proof is complete.


It is well known that the set is also an orthogonal system in . Furthermore, under Assumption A4, from Lemma 2, . Therefore, from (11), for every

(15)

The following assumption is now considered on the norm (15).


Assumption A5. The continuous embedding belongs to the Hilbert–Schmidt class, i.e., . Let be defined as in Lemma 1. Assumption A5 leads to

(16)

where, in particular, from (16),

(17)
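For reference, the Hilbert–Schmidt requirement on a continuous embedding between separable Hilbert spaces can be stated as follows; this is a standard characterization, written in generic notation ($H_1$, $H_2$, $\psi_k$) assumed here rather than taken from the paper:

```latex
% An embedding i : H_1 \hookrightarrow H_2 is of Hilbert--Schmidt class
% when, for some (equivalently, any) orthonormal basis (\psi_k) of H_1,
\[
  \|i\|_{\mathcal{S}(H_1, H_2)}^{2}
  \;=\; \sum_{k=1}^{\infty} \|i(\psi_k)\|_{H_2}^{2} \;<\; \infty .
\]
```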

The following preliminary results are deduced from Theorem 4.1 (pp. 98–99), Corollary 4.1 (pp. 100–101), and Theorem 4.8 (pp. 116–117) in Bosq00 .

Lemma 3.

Under Assumption A1, the following identities hold for any standard AR(1) process, e.g., the extension to $H$ of the ARB(1) process satisfying Eq. (1), as $n \to \infty$:

(18)

Here, $\stackrel{a.s.}{\longrightarrow}$ denotes almost sure convergence, and $\|\cdot\|_{\mathcal{S}(H)}$ is the norm in the Hilbert space $\mathcal{S}(H)$ of Hilbert–Schmidt operators on $H$, i.e., the subspace of compact operators $A$ such that $\sum_{k \geq 1} \|A(\varphi_k)\|_H^2 < \infty$ for any orthonormal basis $(\varphi_k)_{k \geq 1}$ of $H$.

Lemma 4.

Under Assumption A1, let and in (6) and (7), respectively. Then, as ,

The following lemma is Corollary 4.3 on p. 107 of Bosq00 .

Lemma 5.

Under Assumption A1, consider in (10) satisfying as Then, as ,

where, for every integer and , with denoting an indicator function.

An upper bound for is obtained next.

Lemma 6.

Under Assumption A5, the following inequality holds:

where has been introduced in (17), denotes the space of bounded linear operators on and the usual uniform norm on such a space.

Let us consider the following notation:

(19)
Remark 4.

From Lemma 3, for sufficiently large , there exist positive constants and such that, for all ,

In particular, for every considering sufficiently large, we find

(20)

Eq. (20) means that, for sufficiently large , the norm of the RKHS of is equivalent to the norm of the RKHS generated by with spectral kernel given in (19).

Lemma 7.

Under Assumptions A1–A5, consider in (10) satisfying

(21)

as , where has been introduced in Assumption A2. The following almost sure inequality then holds:

Therefore, as

Lemma 8.

For a standard ARB process satisfying Eq. (1), under Assumptions A1–A5, consider in (10) such that

(22)

as $n \to \infty$, where has been introduced in Assumption A2. The following inequality is then established:

(23)

Therefore, under (22), as

Lemma 9.

Under Assumption A3, if as then also

(24)
Remark 5.

Under the conditions of Lemma 8, Eq. (24) holds as soon as

Let us now consider the projection operators defined, for all , by

(25)
Remark 6.

Under the conditions of Remark 5, let

Then, as ,

4 Proofs of the lemmas

4.1 Proof of Lemma 6

Applying the Cauchy–Schwarz inequality, we have, for all integers ,

(26)

where has been introduced in Eq. (3), and satisfies (4)–(5). Under Assumption A5, from Eq. (17),

This completes the proof.

4.2 Proof of Lemma 7

First observe that