A note on strong-consistency of componentwise ARH(1) predictors

08/15/2018
by M. D. Ruiz-Medina, et al.
University of Granada

New results on strong-consistency, in the Hilbert-Schmidt and trace operator norms, are obtained in the parameter estimation of an autoregressive Hilbertian process of order one (ARH(1) process). In particular, a strongly-consistent diagonal componentwise estimator of the autocorrelation operator is derived, based on its empirical singular value decomposition.


1 Introduction.

There exists an extensive literature on Functional Data Analysis (FDA) techniques. In the past few years, the primary focus of FDA has mainly been on independent and identically distributed (i.i.d.) functional observations. The classical book by Ramsay and Silverman RamsaySilverman05 provides a wide overview of FDA techniques (regression, principal component analysis, linear modeling, canonical correlation analysis, curve registration, principal differential analysis, etc.). An introduction to nonparametric statistical approaches for FDA can be found in Ferraty and Vieu Ferraty06 . We also refer to the recent monograph by Hsing and Eubank HsingEubank15 , where the usual functional-analytical tools of FDA are introduced, addressing several statistical estimation problems for random elements in function spaces. Special attention is paid to the monograph by Horváth and Kokoszka HorvathandKokoszka , covering functional inference based on second-order statistics.

We refer the reader to the methodological survey paper by Cuevas Cuevas14 , covering nonparametric techniques and discussing central topics in FDA. Recent advances on statistics in high/infinite dimensional spaces are collected in the IWFOS'14 Special Issue published in the Journal of Multivariate Analysis (see Goia and Vieu GoiaVieu16 , who summarize its contributions and provide a brief discussion of the current literature).

A central issue in FDA is to take into account the temporal dependence of the observations. Although the literature on scalar and vector time series is huge, there are relatively few contributions dealing with functional time series and, in general, with dependent functional data. For instance, Part III (Chapters 13–18) of the monograph by Horváth and Kokoszka HorvathandKokoszka is devoted to this issue, including topics related to functional time series (in particular, the functional autoregressive model) and the statistical analysis of spatially distributed functional data. The moment-based notion of weak dependence introduced in Hörmann and Kokoszka HormannKokoszka10 is also accommodated to the statistical analysis of functional time series. This notion does not require the specification of a data model, and can be used to study the properties of many nonlinear sequences (see, e.g., Hörmann H2008 ; Berkes et al. Berkes , for recent applications).

This paper adopts the methodological approach presented in Bosq Bosq00 for functional time series. That monograph studies the theory of linear functional time series, both in Hilbert and Banach spaces, focusing on the functional autoregressive model. Several authors have studied the asymptotic properties of componentwise estimators of the autocorrelation operator of an ARH(1) process, and of the associated plug-in predictors. We refer to Guillas01 ; Mas99 ; Mas04 ; Mas07 , where the efficiency, consistency and asymptotic normality of these estimators are addressed in a parametric framework (see also Álvarez-Liébana, Bosq and Ruiz-Medina Alvarez1600 , on estimation of Ornstein–Uhlenbeck processes in Banach spaces, and Alvarez16 , on weak consistency, in the Hilbert–Schmidt operator norm, of componentwise estimators). In particular, strong-consistency in the norm of the space of bounded linear operators was derived in Bosq00 . In the derivation of these results, the autocorrelation operator is usually assumed to be a Hilbert–Schmidt operator when the eigenvectors of the autocovariance operator are unknown. This paper proves that, under essentially the same setting of conditions as in the cited papers, the componentwise estimator of the autocorrelation operator proposed in Bosq00 , based on the empirical eigenvectors of the autocovariance operator, is also strongly-consistent in the Hilbert–Schmidt and trace operator norms.

The dimension reduction problem also constitutes a central topic in the parametric, nonparametric and semiparametric FDA statistical frameworks. Special attention has been paid to this topic, for instance, in the context of functional regression with functional response and functional predictors (see, for example, Ferraty et al. FerratyKeilegom12 , where asymptotic normality is derived, and Ferraty et al. Ferraty02 , in the functional time series framework). In the semiparametric and nonparametric estimation techniques, a kernel-based formulation is usually adopted. Real-valued covariates were incorporated in the novel semiparametric kernel-based proposal by Aneiros-Pérez and Vieu Aneiros08 , providing an extension to the functional partial linear time series framework (see also Aneiros-Pérez and Vieu AneirosVieu06 ). Motivated by spectrometry applications, a two-term Partitioned Functional Single Index Model is introduced, in a semiparametric framework, in Goia and Vieu GoiaVieu15 . In the ARH(1) process framework, the present paper provides a new diagonal componentwise estimator of the autocorrelation operator, based on its empirical singular value decomposition. Its strong-consistency is proved as well. The diagonal design leads to an important dimension reduction, going beyond the usual isotropic restriction on the kernels involved in the approximation of the regression operator (respectively, the autocorrelation operator) in the nonparametric framework. Recently, Petrovich and Reimherr PetrovichReimherr17 addressed the dimension reduction provided by functional principal component projections in the general case where eigenvalues can be repeated, instead of the classical assumption that their multiplicity is one.

The outline of the paper is the following. Section 2 introduces basic definitions and preliminary results. Section 3 derives the strong-consistency, in the trace norm, of the estimator introduced in Bosq Bosq00 . Section 4 formulates a strongly-consistent diagonal componentwise estimator of the autocorrelation operator. The proofs of the results are given in Section 5.

2 Preliminaries.

Let $H$ be a real separable Hilbert space, and let $X = \{X_n,\ n \in \mathbb{Z}\}$ be a zero-mean ARH(1) process on the basic probability space $(\Omega, \mathcal{A}, P)$ satisfying:

$$X_n = \rho(X_{n-1}) + \varepsilon_n, \quad n \in \mathbb{Z}, \qquad (1)$$

where $\rho \in \mathcal{L}(H)$, with $\mathcal{L}(H)$ being the space of bounded linear operators on $H$, endowed with the uniform norm $\|\rho\|_{\mathcal{L}(H)} = \sup_{\|h\|_H \leq 1} \|\rho(h)\|_H$, for every $\rho \in \mathcal{L}(H)$. In our case, $\rho$ satisfies $\|\rho^{k}\|_{\mathcal{L}(H)} < 1$, for $k \geq k_0$ and for some integer $k_0 \geq 1$, where $\rho^{k}$ denotes the $k$th power of $\rho$, i.e., the composition operator $\rho \circ \overset{(k)}{\cdots} \circ \rho$. The $H$-valued innovation process $\varepsilon = \{\varepsilon_n,\ n \in \mathbb{Z}\}$ is assumed to be a strong white noise, and to be uncorrelated with the random initial condition.

$X$ then admits the MAH($\infty$) representation $X_n = \sum_{k=0}^{\infty} \rho^{k}(\varepsilon_{n-k})$, for $n \in \mathbb{Z}$, providing the unique stationary solution to equation (1) (see Bosq00 ).
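To make the model concrete, the following minimal sketch simulates an ARH(1) process after truncating each functional observation to its first $J$ coefficients in some orthonormal basis of $H$. The function name, the diagonal choice of the operator and all numerical values are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def simulate_arh1(n, J=20, decay=0.4, noise_scale=1.0, burn_in=200, seed=0):
    """Simulate a zero-mean ARH(1) process X_t = rho(X_{t-1}) + eps_t,
    with each curve represented by its first J basis coefficients."""
    rng = np.random.default_rng(seed)
    # Illustrative diagonal autocorrelation operator with norm decay < 1,
    # so the MAH(infinity) expansion converges and the process is stationary.
    rho_diag = decay ** np.arange(1, J + 1)
    x = np.zeros(J)
    coeffs = []
    for t in range(n + burn_in):
        # Strong white noise with summable component variances.
        eps = noise_scale * rng.normal(size=J) / np.arange(1, J + 1)
        x = rho_diag * x + eps
        if t >= burn_in:
            coeffs.append(x.copy())
    return np.asarray(coeffs)   # shape (n, J); row t holds the coefficients of X_t

X = simulate_arh1(n=500)
```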

The trace autocovariance operator of $X$ is given by $C = E[X_0 \otimes X_0]$, that is, $C(h) = E[\langle X_0, h\rangle_H X_0]$, for every $h \in H$, and its empirical version is defined as

$$C_n = \frac{1}{n} \sum_{i=0}^{n-1} X_i \otimes X_i, \qquad (2)$$

where, for $x, y, h \in H$, the random operator $x \otimes y$ is given by $(x \otimes y)(h) = \langle x, h\rangle_H\, y$. In the following, $\{C_j,\ j \geq 1\}$ and $\{\phi_j,\ j \geq 1\}$ denote the respective sequences of eigenvalues and eigenvectors of the autocovariance operator $C$, satisfying $C(\phi_j) = C_j \phi_j$, with $C_1 \geq C_2 \geq \dots \geq 0$, for $j \geq 1$. Also, by $\{C_{n,j},\ j \geq 1\}$ and $\{\phi_{n,j},\ j \geq 1\}$ we respectively denote the empirical eigenvalues and eigenvectors of $C_n$ (see Bosq00 , pp. 102–103),

$$C_n(\phi_{n,j}) = C_{n,j}\, \phi_{n,j}, \quad C_{n,1} \geq \dots \geq C_{n,n} \geq 0 = C_{n,n+1} = C_{n,n+2} = \dots \qquad (3)$$

Consider now the nuclear cross-covariance operator $D = E[X_0 \otimes X_1]$ and its empirical version $D_n = \frac{1}{n-1} \sum_{i=0}^{n-2} X_i \otimes X_{i+1}$.
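Continuing the finite-dimensional sketch above, the empirical operators $C_n$ and $D_n$ reduce to sample (cross-)covariance matrices of the basis coefficients, and the empirical eigenpairs $(C_{n,j}, \phi_{n,j})$ are obtained from a symmetric eigendecomposition. The helper below is an illustrative sketch under that truncation, not part of the paper.

```python
import numpy as np

def empirical_operators(X):
    """X has shape (n, J). Returns the matrices of C_n and D_n in the chosen
    basis, together with the empirical eigenpairs (C_{n,j}, phi_{n,j}) of C_n."""
    n = X.shape[0]
    # C_n = (1/n) sum_i X_i (x) X_i  ->  (1/n) sum_i X_i X_i^T.
    C_n = X.T @ X / n
    # D_n = (1/(n-1)) sum_i X_i (x) X_{i+1}  ->  (1/(n-1)) sum_i X_{i+1} X_i^T,
    # using the convention (x (x) y)(h) = <x, h> y.
    D_n = X[1:].T @ X[:-1] / (n - 1)
    eigval, eigvec = np.linalg.eigh(C_n)
    order = np.argsort(eigval)[::-1]          # decreasing empirical eigenvalues
    return C_n, D_n, eigval[order], eigvec[:, order]

# X is the coefficient array from the simulation sketch after equation (1).
C_n, D_n, C_nj, phi_nj = empirical_operators(X)
```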

The following assumption will appear in the subsequent development.

Assumption A1. The random initial condition $X_0$ of $X$ in (1) satisfies $\|X_0\|_H \leq M$ a.s., for some constant $M > 0$. Here, a.s. denotes almost surely.

Theorem 1

(see Theorem 4.1, pp. 98–99, Corollary 4.1, pp. 100–101, and Theorem 4.8, pp. 116–117, in Bosq00 ). If $E\|X_0\|_H^4 < \infty$, then, for any $\beta > 1/2$, as $n \to \infty$,

$$n^{1/4} (\ln n)^{-\beta}\, \|C_n - C\|_{\mathcal{S}(H)} \to_{a.s.} 0, \qquad n^{1/4} (\ln n)^{-\beta}\, \|D_n - D\|_{\mathcal{S}(H)} \to_{a.s.} 0, \qquad (4)$$

where $\to_{a.s.}$ means almost surely convergence. Under Assumption A1,

$$\|C_n - C\|_{\mathcal{S}(H)} = O\!\left(\left(\frac{\ln n}{n}\right)^{1/2}\right) \text{ a.s.}, \qquad \|D_n - D\|_{\mathcal{S}(H)} = O\!\left(\left(\frac{\ln n}{n}\right)^{1/2}\right) \text{ a.s.}, \qquad (5)$$

where $\|\cdot\|_{\mathcal{S}(H)}$ is the Hilbert–Schmidt operator norm.

Let $k_n$ be a truncation parameter such that $k_n \to \infty$, as $n \to \infty$, and

(6)

3 Strong-consistency in the trace operator norm

This section derives the strong-consistency of the componentwise estimator $\widehat{\rho}_{k_n}$ (see equation (9) below) in the trace norm, which also implies its strong-consistency in the Hilbert–Schmidt operator norm. As is well known, for a trace operator $T$ on $H$, its trace norm $\|T\|_1$ is finite and, for an orthonormal basis $\{e_j,\ j \geq 1\}$ of $H$, such a norm is given by

$$\|T\|_1 = \sum_{j \geq 1} \left\langle (T^{\star} T)^{1/2}(e_j), e_j \right\rangle_H. \qquad (7)$$
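As a quick numerical illustration of the trace (nuclear) norm in (7): for a finite-dimensional operator it equals the sum of the singular values, which is how it is computed below (the random matrix is just a stand-in for a trace-class operator).

```python
import numpy as np

def trace_norm(T):
    """Nuclear norm ||T||_1 = trace((T^* T)^{1/2}) = sum of singular values."""
    return np.linalg.svd(T, compute_uv=False).sum()

T = np.random.default_rng(1).normal(size=(5, 5))
print(trace_norm(T), np.linalg.norm(T, ord='nuc'))  # two equivalent computations
```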

In Theorem 2 below, the following lemma will be applied:

Lemma 1

Under Assumption A1, if, as $n \to \infty$,

(8)

The proof of this lemma is given in Section 5.

The following condition is assumed in the remainder of this section:

Assumption A2. The empirical eigenvalue $C_{n,k_n}$ is strictly positive a.s., where $k_n$ denotes the truncation parameter introduced in the previous section.

Under Assumption A2, from the observations $X_0, \dots, X_{n-1}$ of $X$, consider the componentwise estimator of $\rho$ (see (8.59), p. 218, in Bosq00 )

$$\widehat{\rho}_{k_n} = \Pi^{k_n}\, D_n\, \widetilde{C}_n^{-1}\, (\Pi^{k_n})^{\star}, \qquad (9)$$

where $\widetilde{C}_n^{-1}$ is the inverse of the restriction of $C_n$ to its principal eigenspace of dimension $k_n$, which is bounded under Assumption A2. Here, $\Pi^{k_n}$ denotes the projection operator onto the principal eigenspace of dimension $k_n$, generated by the empirical eigenvectors $\{\phi_{n,1}, \dots, \phi_{n,k_n}\}$, and $(\Pi^{k_n})^{\star}$ is its adjoint.
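In the truncated setting of the earlier sketches, an estimator of the componentwise form described above can be assembled by projecting $D_n$ onto the first $k_n$ empirical eigenvectors of $C_n$ and inverting $C_n$ on that eigenspace. This is only a finite-dimensional illustration of the construction, with illustrative names, and is not claimed to reproduce (8.59) in Bosq00 verbatim.

```python
import numpy as np

def componentwise_estimator(C_nj, phi_nj, D_n, k_n):
    """Finite-dimensional componentwise estimate of rho:
    <rho_hat(phi_{n,j}), phi_{n,l}> = <D_n(phi_{n,j}), phi_{n,l}> / C_{n,j},
    for j, l = 1, ..., k_n, and zero outside the principal eigenspace."""
    P = phi_nj[:, :k_n]                    # basis of the principal eigenspace
    D_proj = P.T @ D_n @ P                 # D_n restricted to that eigenspace
    rho_hat = P @ (D_proj / C_nj[:k_n]) @ P.T   # column j divided by C_{n,j}
    return rho_hat

# C_nj, phi_nj, D_n come from the sketch after equation (3).
rho_hat = componentwise_estimator(C_nj, phi_nj, D_n, k_n=5)
```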

Theorem 2

Let $\rho$ be the autocorrelation operator defined as before. Assume that $k_n$ in (6) satisfies the conditions of Section 2, as $n \to \infty$. Then, for $\widehat{\rho}_{k_n}$ in (9), the following assertions hold:

(i) If $\rho$ is a Hilbert–Schmidt operator, under Assumption A2,

$$\left\|\widehat{\rho}_{k_n} - \rho\right\|_{\mathcal{S}(H)} \to_{a.s.} 0, \quad n \to \infty. \qquad (10)$$

(ii) Under Assumptions A1–A2, if $\rho$ is a trace operator, then,

$$\left\|\widehat{\rho}_{k_n} - \rho\right\|_{1} \to_{a.s.} 0, \quad n \to \infty. \qquad (11)$$

The proof of this result is given in Section 5.

The strong consistency, in the norm of $H$, of the associated ARH(1) plug-in predictor $\widehat{X}_n = \widehat{\rho}_{k_n}(X_{n-1})$ of $X_n$ then follows (see also Bosq00 and Section 5).
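In the coefficient representation used in the sketches above, the plug-in prediction is simply the estimated operator applied to the last observed curve (again purely illustrative):

```python
# Plug-in predictor X_hat_n = rho_hat(X_{n-1}) in the truncated
# coefficient representation (rho_hat and X come from the sketches above).
X_hat_next = rho_hat @ X[-1]
```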

4 A strongly-consistent diagonal componentwise estimator

In this section, we consider the following assumption:

Assumption A3. Assume that $C$ is strictly positive, i.e., $C_j > 0$ for every $j \geq 1$, and that $D$ is a nuclear operator such that $\rho = D C^{-1}$ is compact.

Under Assumption A3, $\rho$ admits the singular value decomposition (svd)

$$\rho = \sum_{j \geq 1} \lambda_j(\rho)\, \psi_j \otimes \widetilde{\psi}_j, \qquad (12)$$

where, for every $j \geq 1$, $\lambda_j(\rho)$ is the $j$th singular value, and $\psi_j$ and $\widetilde{\psi}_j$ are the right and left eigenvectors, respectively. Since $D$ is a nuclear operator, it admits the svd $D = \sum_{j \geq 1} \lambda_j(D)\, v_j \otimes \widetilde{v}_j$, where $\{v_j\}$ and $\{\widetilde{v}_j\}$ are the respective right and left eigenvectors of $D$, and $\{\lambda_j(D)\}$ are the singular values. $D_n$ is also nuclear, and $D_n = \sum_{j \geq 1} \lambda_j(D_n)\, v_{n,j} \otimes \widetilde{v}_{n,j}$, with $\{v_{n,j}\}$ and $\{\widetilde{v}_{n,j}\}$ being the right and left eigenvectors, respectively, and $\{\lambda_j(D_n)\}$ the singular values. Applying Lemma 4.2, on p. 103, in Bosq00 ,

(13)

From Theorem 1 (see equation (13)), under the conditions assumed in such a theorem, for $n$ sufficiently large, in view of Assumption A3, the composition operator $D_n \widetilde{C}_n^{-1}$ is compact on $H$, admitting the svd

(14)

where, for every $j \geq 1$, $\lambda_j\big(D_n \widetilde{C}_n^{-1}\big)$ denotes the $j$th empirical singular value, with $\psi_{n,j}$ and $\widetilde{\psi}_{n,j}$ being the empirical right and left eigenvectors of $D_n \widetilde{C}_n^{-1}$.

Proposition 1

Under the conditions of Theorem 2(ii), and Assumption A3,

(15)

The proof of this proposition directly follows from

(16)

Under Assumption A3, equation (15) holds if the conditions assumed in Bosq00 for the strong-consistency of the estimator in the norm of $\mathcal{L}(H)$ hold. From Proposition 1, and equations (12) and (14), applying Lemma 4.2, on p. 103, in Bosq00 ,

(17)

Let us define the following quantity:

(18)

where $k_n$ denotes the truncation parameter introduced in Section 2. We now apply the methodology of the proofs of Lemma 4.3, on p. 104, and Corollary 4.3, on p. 107, in Bosq00 , to obtain the strong-consistency of the empirical right and left eigenvectors, and of the corresponding empirical singular values, under the following additional assumption:

Assumption A4. Consider

Lemma 2

Under Assumptions A3–A4, and the conditions of Theorem 2(ii), if in (18) is such that, as with a.s., then,

(19)

where, for with and .

The proof of this lemma is given in Section 5.

The following diagonal componentwise estimator of $\rho$ is formulated:

(20)
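Since the exact form of (20) is not reproduced here, the snippet below is only a hypothetical reading of a diagonal, svd-based construction: it forms a finite-dimensional proxy of $\rho$ from $D_n$ and a pseudo-inverse of $C_n$, and keeps the $k_n$ leading empirical singular triplets, so that the resulting operator is diagonal with respect to its own empirical right and left singular vectors. All names and the pseudo-inverse regularization are assumptions for illustration, not the estimator defined in (20).

```python
import numpy as np

def diagonal_svd_estimator(C_n, D_n, k_n, rcond=1e-10):
    """Hypothetical diagonal-type estimator: truncated svd of D_n C_n^+,
    keeping only the k_n leading empirical singular triplets."""
    rho_proxy = D_n @ np.linalg.pinv(C_n, rcond=rcond)
    U, s, Vt = np.linalg.svd(rho_proxy)
    # Rank-k_n operator, diagonal in its own empirical singular basis.
    return (U[:, :k_n] * s[:k_n]) @ Vt[:k_n]

# C_n and D_n come from the sketch after equation (3).
rho_diag_hat = diagonal_svd_estimator(C_n, D_n, k_n=5)
```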

The next result derives the strong-consistency of this diagonal componentwise estimator.

Theorem 3

Under the conditions of Lemma 2, the diagonal componentwise estimator in (20) of $\rho$ is strongly consistent, as $n \to \infty$.

The proof of this result is given in Section 5.

5 Proofs of the results

Proof of Lemma 1

Let us denote where Applying the triangle and Cauchy–Schwarz inequalities, we obtain, as

(21)

since, from Corollary 4.3, on p. 107, in Bosq00 ,

(22)

From (21), under the condition assumed in Lemma 1, applying Theorem 1, we obtain (8).

Proof of Theorem 2

(i) Applying the Hölder and triangle inequalities, since $\rho$ is bounded, from Theorem 1, as $n \to \infty$,

(23)

Then, assertion (i) in (10) holds.

(ii) Under Assumptions A1–A2, from Theorem 1,

Hence, from equation (23), as $n \to \infty$,

(24)

Let us now consider

(25)

From equation (24), the first term at the right-hand side of inequality (25) converges a.s. to zero. From Lemma 1, the second term at the right-hand side of (25) also converges a.s., as $n \to \infty$. Since $\rho$ is a trace operator, the Dominated Convergence Theorem leads to (11).

Strong-consistency of the plug-in predictor

Corollary 1

Under the conditions of Theorem 2(ii),

(26)

Proof. Note that, under Assumption A1, $\|X_{n-1}\|_H \leq M$ a.s. From Theorem 2(ii), we then have

(27)

Proof of Lemma 2

Under Assumption A3, $\rho^{\star}\rho$ and $\rho\,\rho^{\star}$ are self-adjoint compact operators, admitting the following diagonal spectral series representations in $H$:

(28)

From (28), applying the triangle inequality,

(30)

On the other hand,

(31)

Furthermore,