
The Berry-Esséen Upper Bounds of Vasicek Model Estimators

The Berry-Esséen upper bounds of the moment estimators and least squares estimators of the mean and drift coefficients in Vasicek models driven by general Gaussian processes are studied. In the parameter estimation problem for the Ornstein-Uhlenbeck (OU) process driven by fractional Brownian motion, the commonly used method is due to Kim and Park, who give an upper bound on the Kolmogorov distance between the distribution of a ratio of two double Wiener-Itô stochastic integrals and the Normal distribution. The main innovation of this paper is to extend this ratio process so that the numerator and the denominator each contain Wiener-Itô stochastic integrals of order at most three. As far as we know, the upper bounds between the distributions of the above estimators and the Normal distribution are novel.


1. Introduction

The Vasicek model is a one-dimensional stochastic process used in various fields such as economics, finance, and the environmental sciences. It was originally used to describe short-term interest rate fluctuations influenced by a single market factor. Proposed by O. Vasicek [19], it is the first stochastic process model to describe the “mean reversion” characteristic of short-term interest rates. In the financial field, it can also be used as a random investment model, as in Wu et al. [20] and Han et al. [9].

Definition 1.

Consider the Vasicek model driven by a general Gaussian process; it satisfies the following stochastic differential equation (SDE):

(1)

where and is a general one-dimensional centered Gaussian process that satisfies Hypothesis 1.
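
Since the display (1) is not reproduced in this extraction, the following is a minimal simulation sketch assuming the standard Vasicek form dX_t = (mu - theta * X_t) dt + sigma dG_t, with ordinary Brownian increments standing in for the general Gaussian driver G; the parameter names and the Euler scheme are illustrative assumptions, not the authors' notation.

import numpy as np

def simulate_vasicek(mu=1.0, theta=0.5, sigma=1.0, x0=0.0, T=100.0, n=10_000, seed=0):
    # Euler scheme for the assumed form dX_t = (mu - theta * X_t) dt + sigma dG_t,
    # with Brownian increments as a stand-in for the general Gaussian noise G.
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    x = np.empty(n + 1)
    x[0] = x0
    dG = rng.normal(scale=np.sqrt(dt), size=n)
    for i in range(n):
        x[i + 1] = x[i] + (mu - theta * x[i]) * dt + sigma * dG[i]
    return t, x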

This paper mainly focuses on the convergence rate of the estimators of the coefficients. Without loss of generality, we assume , so that the Vasicek model can be represented in the following form:

When the coefficients in the drift function are unknown, an important problem is to estimate them from the observations. For the Vasicek model driven by Brownian motion, Fergusson and Platen [8] present the maximum likelihood estimators of the coefficients. When the Vasicek model is driven by fractional Brownian motion, Xiao and Yu [21] consider the least squares estimators and their asymptotic behavior. When , Hu and Nualart [10] study the moment estimation problem.

Since the Gaussian process mainly determines the trajectory properties of the Vasicek model, we follow the assumptions in Chen and Zhou [7] and make the following hypothesis about .

Hypothesis 1 ([7] Hypothesis 1.1).

Let and . The covariance function of the Gaussian process satisfies the following condition:

(2)

where

are constants independent of . Besides, for any .

Remark.

The covariance functions of Gaussian processes such as fractional Brownian motion, sub-fractional Brownian motion and bifractional Brownian motion satisfy the above Hypothesis [7, Examples 1.5-1.8].
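
As a concrete illustration (the constants below are the standard ones for fractional Brownian motion, not the elided constants of (2)): for fractional Brownian motion with Hurst index H ∈ (1/2, 1),

\[
R(t,s)=\frac{1}{2}\bigl(t^{2H}+s^{2H}-|t-s|^{2H}\bigr),
\qquad
\frac{\partial^{2}R}{\partial t\,\partial s}(t,s)=H(2H-1)\,|t-s|^{2H-2},\quad t\neq s,
\]

so the mixed derivative of the covariance grows like a constant multiple of |t-s|^{2H-2} off the diagonal, which is the type of behavior controlled by Hypothesis 1.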

Assuming that only a single trajectory is observed, we can construct the least squares estimators (LSE) and the moment estimators (ME) (see [23, 22, 21, 3] for more details).

Proposition 1 ([6] (4) and (5)).

The estimator of is the continuous-time sample mean:

(3)

The second moment estimator of is given by

(4)
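
Since the displays (3) and (4) are not reproduced here, the sketch below only illustrates how the continuous-time sample mean (1/T) \int_0^T X_t \, dt in (3) can be approximated from a discretely observed trajectory; the trapezoidal rule and the helper names are illustrative assumptions.

import numpy as np

def sample_mean_estimator(t, x):
    # Trapezoidal approximation of (1/T) * \int_0^T X_t dt from observations (t_i, X_{t_i}).
    T = t[-1] - t[0]
    return np.trapz(x, t) / T

# Example usage with the hypothetical simulator above:
# t, x = simulate_vasicek()
# mu_hat = sample_mean_estimator(t, x)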

Following Xiao and Yu [21], we present the LSE in the Vasicek model.

Proposition 2 ([6] (7) and (8)).

The LSE is motivated by minimizing a quadratic function of and :

Solving these equations, we obtain the LSE of and , denoted by and , respectively:

(5)
(6)

where the integral is an Itô-Skorohod integral.
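
For reference, the quadratic objective referred to above is usually of the form (stated here as an assumption, since the display is elided in this extraction)

\[
(\hat{\theta},\hat{\mu})=\operatorname*{arg\,min}_{\theta,\mu}\int_{0}^{T}\bigl(\dot{X}_{t}-(\mu-\theta X_{t})\bigr)^{2}\,dt,
\]

and setting the partial derivatives with respect to \mu and \theta to zero yields the closed-form expressions (5) and (6).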

Pei et al. [6] prove the following consistency results and central limit theorems (CLT) for the estimators.

Theorem 3 ([6], Theorem 1.2).

When Hypothesis 1 is satisfied, both the ME and the LSE of are strongly consistent, that is,

Theorem 4 ([6], Theorem 1.3).

Assume Hypothesis 1 is satisfied. When is self-similar and , and are asymptotically normal as , that is,

When ,

where

(7)

Similarly, is also asymptotically normal as :

We now present the main theorems of the paper; their details are given in the following sections.

Theorem 5.

Let

be a standard Normal random variable, and

be the constant defined by (7). Assume and Hypothesis 1 is satisfied. When is large enough, there exists a constant such that

(8)
(9)

where .

Next, we show the convergence rate of the mean coefficient estimators and .

Theorem 6.

Assume , and is a self-similar Gaussian process satisfying Hypothesis 1 and . Then there exists a constant such that

(10)
(11)

2. Preliminary

In this section, we recall some basic facts about Malliavin calculus with respect to a Gaussian process. The reader is referred to [15, 18, 17] for a more detailed explanation. Let be a continuous centered Gaussian process with and covariance function

(12)

defined on a complete probability space

, where is generated by the Gaussian family . Denote as the space of all real-valued step functions on . The Hilbert space is defined as the closure of endowed with the inner product:

(13)

We denote as the isonormal Gaussian process on the probability space, indexed by the elements in , which satisfies the following isometry relationship:

(14)
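
For example, identifying G_t with W(1_{[0,t]}) (notation assumed here), the isometry (14) recovers the covariance (12):

\[
\mathbb{E}\bigl[W(\mathbf{1}_{[0,t]})\,W(\mathbf{1}_{[0,s]})\bigr]
=\bigl\langle\mathbf{1}_{[0,t]},\mathbf{1}_{[0,s]}\bigr\rangle_{\mathfrak{H}}
=R(t,s).
\]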

The following proposition shows the inner product representation of the Hilbert space [11].

Proposition 7 ([7] Proposition 2.1).

Denote by the set of functions of bounded variation on . Then is dense in and

where is the Lebesgue-Stieltjes signed measure associated with , defined as

When the covariance function satisfies Hypothesis 1,

(15)

Furthermore, the norm of the elements in can be induced naturally:

Remark ([7] Notation 1).

Let and be the constants given in Hypothesis 1. For any , we define two norms as

For any in , define an operator from to to be

(16)
Proposition 8 ([7] Proposition 3.2).

Suppose that Hypothesis 1 holds. Then for any ,

(17)

and for any ,

Let and be the -th tensor product and the -th symmetric tensor product of . For every , denote as the -th Wiener chaos of . It is defined as the closed linear subspace of generated by , where is the -th Hermite polynomial. Let such that , then for every and ,

where is the -th Wiener-Itô stochastic integral.
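
For instance, when q = 2 and h has unit norm in the Hilbert space, the identification above reads

\[
I_{2}\bigl(h^{\otimes 2}\bigr)=H_{2}\bigl(W(h)\bigr)=W(h)^{2}-1,
\]

a centered element of the second Wiener chaos.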

Denote as a complete orthonormal system in . The -th contraction between and is an element in :

The following proposition shows the product formula for the multiple integrals.

Proposition 9 ([15] Theorem 2.7.10).

Let and be two symmetric functions. Then

(18)

where is the symmetrization of .
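
In the simplest case p = q = 1, the product formula (18) reduces to

\[
I_{1}(f)\,I_{1}(g)=I_{2}\bigl(f\,\widetilde{\otimes}\,g\bigr)+\langle f,g\rangle_{\mathfrak{H}},
\]

which is the elementary identity behind second-moment computations for multiple integrals.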

We then introduce the derivative operator and the divergence operator; for details, see Sections 2.3-2.5 of [15]. Let be the class of smooth random variables of the form:

where , whose partial derivatives have at most polynomial growth, and for , . Then the Malliavin derivative of (with respect to ) is the element of defined by

Given and an integer , let denote the closure of with respect to the norm

Denote by (the divergence operator) the adjoint of . The domain of consists of those elements satisfying:

and is denoted by . If , then is the unique element of characterized by the duality formula:
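
In the standard notation (restated here as an assumption, since the display is elided in this extraction), the duality formula reads

\[
\mathbb{E}\bigl[F\,\delta(u)\bigr]=\mathbb{E}\bigl[\langle DF,u\rangle_{\mathfrak{H}}\bigr]
\qquad\text{for every }F\in\mathbb{D}^{1,2}.
\]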

We now introduce the infinitesimal generator of the Ornstein-Uhlenbeck semigroup. Let be a square-integrable random variable. Denote as the orthogonal projection onto the -th Wiener chaos . The operator is defined by . The domain of is

For any , define . is called the pseudo-inverse of . Note that and hold for any .
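
In the standard notation (again restated as an assumption, since the displays are elided above), the generator and its pseudo-inverse act on the chaos decomposition F = \sum_{q \ge 0} J_q F as

\[
LF=-\sum_{q\ge 1}q\,J_{q}F,
\qquad
L^{-1}F=-\sum_{q\ge 1}\frac{1}{q}\,J_{q}F,
\]

so that L L^{-1} F = F - \mathbb{E}[F] for every F \in L^{2}(\Omega).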

The following Lemma 10 provides a Berry-Esséen upper bound for the sum of two random variables.

Lemma 10 ([4] Lemma 2).

For any random variables and , the following inequality holds:

(19)

where is the standard Normal distribution function.
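
An inequality of this type, stated here only to illustrate the flavor of (19) and not as a quotation of [4], is: for any random variables \xi, \eta and any \varepsilon > 0,

\[
\sup_{z\in\mathbb{R}}\bigl|P(\xi+\eta\le z)-\Phi(z)\bigr|
\le\sup_{z\in\mathbb{R}}\bigl|P(\xi\le z)-\Phi(z)\bigr|
+\frac{\varepsilon}{\sqrt{2\pi}}
+P\bigl(|\eta|>\varepsilon\bigr),
\]

which follows from the fact that \Phi is Lipschitz with constant 1/\sqrt{2\pi}.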

Using Malliavin calculus, Kim and Park [12] provide the Berry-Esséen upper bound of the quotient of two random variables.

Let be a zero-mean process satisfying a.s. For simplicity, we define the following four functions:

Theorem 11 ([12] Theorem 2 and Corollary 1).

Let be a standard Normal variable. Assume that for every , has an absolutely continuous law with respect to Lebesgue measure, and that , , as . Then there exists a constant such that for large enough,

3. Berry-Esséen upper bounds of moment estimators

In this section, we prove the Berry-Esséen upper bounds for the Vasicek model moment estimators and . For convenience in the following discussion, we first define :

(20)

where is a standard Normal variable. Next, we introduce the CLT of .

Theorem 12 ([6] Proposition 4.19).

Assume , and is a self-similar Gaussian process satisfying Hypothesis 1 and . Then is asymptotically normal as :

(21)

where

is a stochastic integral with respect to .

From the above theorem, we can obtain the expanded form of (20):

Then we can prove the convergence rate of .

Proof of formula (10).

Let . According to Lemma 10, we have

Since is self-similar, is a standard Normal variable,

By Chebyshev's inequality, we obtain

where

Proposition 3.10 of [6] ensures that is bounded. Combining the above results, we have

(22)

When is sufficiently large, there exists a constant such that formula (10) holds. ∎

Similarly, we review the central limit theorem of .

Theorem 13 ([6] Proposition 4.18).

Assume and is a Gaussian process satisfying Hypothesis 1. Then is asymptotically normal as :

The following Lemma shows the upper bound of the expectation of .

Lemma 14.

Let be the process defined by

When , there exists a constant independent of such that

(23)
Proof.

According to (14) and (17), we can obtain

where .

It is easy to see that

(24)

where is a constant. Also, we have

(25)

Combining the above two formulas, we obtain (23). ∎

Denote as

Then we can obtain the Berry-Esséen upper bound of ME .

Proof of formula (8).

According to Proposition 4.18 of [6], we have

We denote as the tail probability and

Then we can obtain

(26)

Denote as

where , . Lemma 5.4 of [7] ensures that

Combining with Lemma 15, we obtain the desired result. ∎

The following Lemma provides the upper bound of .

Lemma 15.

When is large enough, there exists a constant such that

where .

Proof.

Since the Normal distribution is symmetric, we have

Consider the following processes:

(27)

where is an OU process driven by . According to formula (63) of [6], we can obtain

where