# Nonparametric Estimation for I.I.D. Paths of a Martingale Driven Model with Application to Non-Autonomous Fractional SDE

This paper deals with a projection least square estimator of the function J_0 computed from multiple independent observations on [0,T] of the process Z defined by dZ_t = J_0(t)d⟨M⟩_t + dM_t, where M is a centered, continuous and square integrable martingale vanishing at 0. Risk bounds are established on this estimator and on an associated adaptive estimator. An appropriate transformation allows us to rewrite the differential equation dX_t = V(X_t)(b_0(t)dt + σ(t)dB_t), where B is a fractional Brownian motion of Hurst parameter H ∈ (1/2,1), as a model of the previous type. The second part of the paper thus deals with risk bounds on a nonparametric estimator of b_0 derived from the results on the projection least square estimator of J_0. In particular, our results apply to the estimation of the drift function in a non-autonomous extension of the fractional Black-Scholes model introduced in Hu et al. (2003).


## 1. Introduction

Since the 1980s, statistical inference for stochastic differential equations (SDE) driven by a Brownian motion has been widely investigated by many authors, in both the parametric and the nonparametric frameworks. Classically (see Kutoyants [18]), the estimators of the drift function are computed from one path of the solution to the SDE and converge when the time horizon goes to infinity. The existence and uniqueness of the stationary solution to the SDE are then required, and are obtained under restrictive conditions on the drift function.
In recent years, a new type of parametric and nonparametric estimators has been investigated: those computed from multiple independent observations, on [0,T], of the SDE solution. Indeed, this functional data analysis problem has already been studied in the parametric framework (see Ditlevsen and De Gaetano [14], Overgaard et al. [21], Picchini, De Gaetano and Ditlevsen [22], Picchini and Ditlevsen [23], Comte, Genon-Catalot and Samson [6], Delattre and Lavielle [10], Delattre, Genon-Catalot and Samson [9], Dion and Genon-Catalot [13], Delattre, Genon-Catalot and Larédo [8], etc.) and more recently in the nonparametric framework (see Comte and Genon-Catalot [4, 5], Della Maestra and Hoffmann [11], and Marie and Rosier [19]). In [4, 5], the authors extend to the diffusion processes framework the projection least square estimators already well studied in the regression framework (see Cohen et al. [2] and Comte and Genon-Catalot [3]). Our paper deals with a nonparametric estimation problem close to the latter.

Consider the stochastic process Z = (Z_t)_{t\in[0,T]} defined by

  (1) Z_t = \int_0^t J_0(s)\,d\langle M\rangle_s + M_t ; \forall t\in[0,T],

where M is a centered, continuous and square integrable martingale vanishing at 0, and J_0 is an unknown function belonging to L^2([0,T], d\langle M\rangle_t). By assuming that the quadratic variation \langle M\rangle of M is deterministic, our paper deals with the estimator \widehat J_{m,N} of J_0 minimizing the objective function

  J \longmapsto \gamma_{m,N}(J) := \frac{1}{N}\sum_{i=1}^N\left(\int_0^T J(s)^2\,d\langle M^i\rangle_s - 2\int_0^T J(s)\,dZ_s^i\right)

on an m-dimensional function space S_m, where M^1,\dots,M^N (resp. Z^1,\dots,Z^N) are independent copies of M (resp. Z) and m,N \in \mathbb N^*. Precisely, risk bounds are established on \widehat J_{m,N} and on the adaptive estimator \widehat J_{\widehat m,N}, where

  \widehat m = \arg\min_{m\in\mathcal M_N}\{\gamma_{m,N}(\widehat J_{m,N}) + \mathrm{pen}(m)\} \quad\text{with}\quad \mathrm{pen}(m) := \frac{c_{\mathrm{cal}}\,m}{N} \text{ and } \mathcal M_N \subset \{1,\dots,N\}.

Now, consider the differential equation

  (2) X_t = x_0 + \int_0^t V(X_s)(b_0(s)\,ds + \sigma(s)\,dB_s) ; t\in[0,T],

where x_0 \in \mathbb R, B is a fractional Brownian motion of Hurst parameter H \in (1/2,1), the stochastic integral with respect to B is taken pathwise (in Young's sense), and V, b_0 and \sigma are regular enough. An appropriate transformation (see Subsection 4.1) allows us to rewrite Equation (2) as a model of type (1) driven by the Molchan martingale, whose quadratic variation is proportional to t^{2-2H}. Our paper also deals with a risk bound on an estimator of b_0 derived from \widehat J_{m,N}.

Finally, let us consider a financial market model in which the prices of the risky asset are modeled by the following equation of type (2):

  (3) S_t = S_0 + \int_0^t S_u\left(\left(\overline b_0(u) - \frac{\sigma^2}{2}u^{2H-1}\right)du + \sigma\,dB_u\right) ; t\in[0,T],

where \sigma > 0 and \overline b_0 is a continuous function from [0,T] into \mathbb R. This is a non-autonomous extension of the fractional Black-Scholes model defined in Hu et al. [15]. An estimator of \overline b_0 is derived from \widehat J_{m,N} in Subsection 4.3.

To the best of our knowledge, only Comte and Marie [7] deals with a nonparametric estimator of the drift function computed from multiple independent observations on [0,T] of the solution to a fractional SDE.

In Section 2, a detailed definition of the projection least square estimator of J_0 is provided. Section 3 deals with risk bounds on \widehat J_{m,N} and on the adaptive estimator \widehat J_{\widehat m,N}. In Section 4, the results of Section 3 on the estimator of J_0 are applied to the estimation of b_0 in Equation (2), and then of \overline b_0 in Equation (3). Finally, in Section 5, some numerical experiments on Model (1) are provided when M is the Molchan martingale.

## 2. A projection least square estimator of the map J0

In the sequel, the quadratic variation \langle M\rangle of M fulfills the following assumption.

###### Assumption 2.1.

The (nonnegative, increasing and continuous) process \langle M\rangle is a deterministic function.

For some results, \langle M\rangle fulfills the following stronger assumption.

###### Assumption 2.2.

There exists a nonnegative function \mu, continuous from (0,T] into \mathbb R_+, such that

  \langle M\rangle_t = \int_0^t \mu(s)\,ds ; \forall t\in[0,T].

### 2.1. The objective function

In order to define a projection least square estimator of J_0, let us consider N independent copies Z^1,\dots,Z^N (resp. M^1,\dots,M^N) of Z (resp. M), and the objective function \gamma_{m,N} defined by

  \gamma_{m,N}(J) := \frac{1}{N}\sum_{i=1}^N\left(\int_0^T J(s)^2\,d\langle M^i\rangle_s - 2\int_0^T J(s)\,dZ_s^i\right)

for every J \in S_m := \mathrm{span}\{\varphi_1,\dots,\varphi_m\}, where \varphi_1,\dots,\varphi_m are continuous functions from [0,T] into \mathbb R such that (\varphi_1,\dots,\varphi_m) is an orthonormal family in L^2([0,T], d\langle M\rangle_t).

Remark. Note that since \langle M\rangle is nonnegative, increasing and continuous, and since the \varphi_j's are continuous from [0,T] into \mathbb R, the objective function \gamma_{m,N} is well-defined.

For any J \in S_m,

  \mathbb E(\gamma_{m,N}(J)) = \int_0^T J(s)^2\,d\langle M\rangle_s - 2\int_0^T J(s)J_0(s)\,d\langle M\rangle_s - 2\,\mathbb E\left(\int_0^T J(s)\,dM_s\right) = \int_0^T (J(s)-J_0(s))^2\,d\langle M\rangle_s - \int_0^T J_0(s)^2\,d\langle M\rangle_s.

Then, the closer J is to J_0, the smaller \mathbb E(\gamma_{m,N}(J)) is. For this reason, the estimator of J_0 minimizing \gamma_{m,N} is studied in this paper.

### 2.2. The projection least square estimator

Consider the estimator

  (4) \widehat J_{m,N} := \arg\min_{J\in S_m}\gamma_{m,N}(J)

of J_0. Since \widehat J_{m,N} \in S_m, there exist square integrable random variables \widehat\theta_1,\dots,\widehat\theta_m such that

  \widehat J_{m,N} = \sum_{j=1}^m \widehat\theta_j\varphi_j.

Then,

  \nabla\gamma_{m,N}(\widehat J_{m,N}) = \left(\frac{1}{N}\sum_{i=1}^N\left(2\sum_{k=1}^m\widehat\theta_k\int_0^T\varphi_j(s)\varphi_k(s)\,d\langle M^i\rangle_s - 2\int_0^T\varphi_j(s)\,dZ_s^i\right)\right)_{j\in\{1,\dots,m\}} = 0.

Therefore, by (4), necessarily

  \widehat\theta_{m,N} := (\widehat\theta_1,\dots,\widehat\theta_m)^* = \Psi_m^{-1}z_{m,N},

where

  \Psi_m := \left(\int_0^T\varphi_j(s)\varphi_k(s)\,d\langle M\rangle_s\right)_{j,k\in\{1,\dots,m\}} \quad\text{and}\quad z_{m,N} := \left(\frac{1}{N}\sum_{i=1}^N\int_0^T\varphi_j(s)\,dZ_s^i\right)_{j\in\{1,\dots,m\}}.
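On the practical side, \Psi_m and z_{m,N} can be approximated from discretised paths by Riemann sums. The following sketch is ours, not from the paper: it assumes \langle M\rangle_t = t (the standard Brownian case), a regular time grid, and the trigonometric basis of L^2([0,T],dt); the names `trig_basis` and `fit_J` are illustrative.

```python
import numpy as np

# Sketch (ours) of theta_hat = Psi_m^{-1} z_{m,N} from N discretised paths,
# assuming <M>_t = t and a trigonometric basis, orthonormal in L^2([0,T], dt).

def trig_basis(m, t, T=1.0):
    """First m trigonometric functions, orthonormal in L^2([0,T], dt)."""
    phi = [np.full_like(t, 1.0 / np.sqrt(T))]
    j = 1
    while len(phi) < m:
        phi.append(np.sqrt(2.0 / T) * np.cos(2.0 * np.pi * j * t / T))
        if len(phi) < m:
            phi.append(np.sqrt(2.0 / T) * np.sin(2.0 * np.pi * j * t / T))
        j += 1
    return np.array(phi)  # shape (m, len(t))

def fit_J(Z, t, m):
    """Return theta_hat (shape (m,)) and the basis evaluated on the grid."""
    dt = np.diff(t)
    phi = trig_basis(m, t)
    # Psi_m[j, k] = int_0^T phi_j phi_k d<M>_s, with d<M>_s = ds here.
    Psi = (phi[:, :-1] * dt) @ phi[:, :-1].T
    # z_{m,N}[j] = (1/N) sum_i int_0^T phi_j dZ^i (Ito-type Riemann sums).
    z = phi[:, :-1] @ np.diff(Z, axis=1).mean(axis=0)
    return np.linalg.solve(Psi, z), phi

# Toy model: dZ_t = J_0(t) dt + dW_t with J_0(t) = t, so Z_t = t^2/2 + W_t.
rng = np.random.default_rng(0)
N, n = 200, 500
t = np.linspace(0.0, 1.0, n)
dW = rng.normal(0.0, np.sqrt(t[1]), size=(N, n - 1))
Z = 0.5 * t ** 2 + np.hstack([np.zeros((N, 1)), np.cumsum(dW, axis=1)])
theta, phi = fit_J(Z, t, m=5)
J_hat = theta @ phi  # estimated J_0 on the grid
```

With these parameters, the integrated squared error of `J_hat` against J_0(t) = t is small, in line with the m/N variance bound of the next section.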

## 3. Risk bound and model selection

In the sequel, the space L^2([0,T], d\langle M\rangle_t) is equipped with the scalar product \langle\cdot,\cdot\rangle_{\langle M\rangle} defined by

  \langle\varphi,\psi\rangle_{\langle M\rangle} := \int_0^T\varphi(s)\psi(s)\,d\langle M\rangle_s

for every \varphi,\psi \in L^2([0,T], d\langle M\rangle_t). The associated norm is denoted by \|\cdot\|_{\langle M\rangle}.

First, the following proposition provides a risk bound on \widehat J_{m,N} for a fixed m \in \{1,\dots,N\}.

###### Proposition 3.1.

Under Assumption 2.1,

  (5) \mathbb E(\|\widehat J_{m,N}-J_0\|_{\langle M\rangle}^2) \leqslant \min_{J\in S_m}\|J-J_0\|_{\langle M\rangle}^2 + \frac{2m}{N}.

Note that Inequality (5) says first that the bound on the variance of our least square estimator of J_0 is of order m/N, as in the usual nonparametric regression framework. Under Assumption 2.2, the following corollary provides a more understandable expression of the bound on the bias in Inequality (5).

###### Corollary 3.2.

Under Assumption 2.2,

  \mathbb E(\|\widehat J_{m,N}-J_0\|_2^2) \leqslant \|\mu^{-1}\|_{\infty,T}\|p_{S_m(\mu)}^{\perp}(\mu^{1/2}J_0)-\mu^{1/2}J_0\|_2^2 + 2\|\mu^{-1}\|_{\infty,T}\frac{m}{N},

where

  S_m(\mu) := \{\iota\in L^2([0,T],dt) : \exists\varphi\in S_m,\ \forall t\in(0,T],\ \iota(t)=\mu(t)^{1/2}\varphi(t)\}.

For instance, assume that S_m = \mathrm{span}\{\overline\varphi_1,\dots,\overline\varphi_m\}, where

  \overline\varphi_1(t) := \sqrt{\frac{1}{\mu(t)T}},\quad \overline\varphi_{2j}(t) := \sqrt{\frac{2}{\mu(t)T}}\cos\left(\frac{2\pi jt}{T}\right) \quad\text{and}\quad \overline\varphi_{2j+1}(t) := \sqrt{\frac{2}{\mu(t)T}}\sin\left(\frac{2\pi jt}{T}\right)

for every t \in (0,T] and j \in \mathbb N^* satisfying 2j+1 \leqslant m. The basis (\varphi_1,\dots,\varphi_m) of S_m, orthonormal in L^2([0,T], d\langle M\rangle_t), is obtained from (\overline\varphi_1,\dots,\overline\varphi_m) via the Gram-Schmidt process. Consider also the Sobolev space

  W_2^{\beta}([0,T]) := \{\iota\in C^{\beta-1}([0,T];\mathbb R) : \int_0^T\iota^{(\beta)}(t)^2\,dt < \infty\} ; \beta\in\mathbb N^*,

and assume that there exists \iota_0 \in W_2^{\beta}([0,T]) such that \mu^{1/2}J_0 = \iota_0 on (0,T]. Then, by DeVore and Lorentz [12], Theorem 2.3 p. 205, there exists a constant c_{\beta,T} > 0, not depending on m, such that

  \|p_{S_m(\mu)}^{\perp}(\mu^{1/2}J_0)-\mu^{1/2}J_0\|_2^2 = \|p_{S_m(\mu)}^{\perp}(\iota_0)-\iota_0\|_2^2 \leqslant c_{\beta,T}m^{-2\beta}.

Therefore, by Corollary 3.2,

  \mathbb E(\|\widehat J_{m,N}-J_0\|_2^2) \leqslant \|\mu^{-1}\|_{\infty,T}\left(c_{\beta,T}m^{-2\beta} + \frac{2m}{N}\right).
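Balancing the bias term c_{\beta,T}m^{-2\beta} against the variance term 2m/N gives the familiar nonparametric rate; the following elementary computation is ours and is not taken from the paper.

```latex
f(m) := c_{\beta,T}\,m^{-2\beta} + \frac{2m}{N},
\qquad
f'(m) = -2\beta\,c_{\beta,T}\,m^{-2\beta-1} + \frac{2}{N} = 0
\;\Longleftrightarrow\;
m_{\mathrm{opt}} = \left(\beta\,c_{\beta,T}\,N\right)^{\frac{1}{2\beta+1}}.
```

Hence m_{\mathrm{opt}} \propto N^{1/(2\beta+1)}, and at this choice both terms of the bound are of order N^{-2\beta/(2\beta+1)}, the classical rate over a \beta-regularity class.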

Finally, consider \mathcal M_N \subset \{1,\dots,N\} and

  \widehat m = \arg\min_{m\in\mathcal M_N}\{\gamma_{m,N}(\widehat J_{m,N}) + \mathrm{pen}(m)\} \quad\text{with}\quad \mathrm{pen}(m) := \frac{c_{\mathrm{cal}}\,m}{N},

where c_{\mathrm{cal}} > 0 is a constant to calibrate in practice via, for instance, the slope heuristic. In the sequel, the S_m's fulfill the following assumption.

###### Assumption 3.3.

For every m,m' \in \mathcal M_N, if m \leqslant m', then S_m \subset S_{m'}.

The following theorem provides a risk bound on the adaptive estimator \widehat J_{\widehat m,N}.

###### Theorem 3.4.

Under Assumptions 2.1 and 3.3, there exists a deterministic constant c_{3.4} > 0, not depending on N, such that

  \mathbb E(\|\widehat J_{\widehat m,N}-J_0\|_{\langle M\rangle}^2) \leqslant c_{3.4}\left(\min_{m\in\mathcal M_N}\left\{\mathbb E(\|\widehat J_{m,N}-J_0\|_{\langle M\rangle}^2) + \mathrm{pen}(m)\right\} + \frac{1}{N}\right).

Moreover, under Assumption 2.2,

  \mathbb E(\|\widehat J_{\widehat m,N}-J_0\|_2^2) \leqslant c_{3.4}\|\mu^{-1}\|_{\infty,T}\left(\min_{m\in\mathcal M_N}\left\{\|p_{S_m(\mu)}^{\perp}(\mu^{1/2}J_0)-\mu^{1/2}J_0\|_2^2 + (2+c_{\mathrm{cal}})\frac{m}{N}\right\} + \frac{1}{N}\right).

As in the usual nonparametric regression framework, since \mathrm{pen}(m) is of the same order as the bound on the variance term of \widehat J_{m,N} for every m \in \mathcal M_N, Theorem 3.4 says that the risk of our adaptive estimator is controlled by the minimal risk of \widehat J_{m,N} over m \in \mathcal M_N, up to a multiplicative constant not depending on N.
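The penalised criterion is cheap to evaluate: at the minimiser, \gamma_{m,N}(\widehat J_{m,N}) = -\widehat\theta_{m,N}^*z_{m,N}. The sketch below is ours, under the same simplifying assumptions as before (\langle M\rangle_t = t, regular grid, trigonometric basis); `select_m` is an illustrative name.

```python
import numpy as np

# Sketch (ours) of the model selection m_hat = argmin {gamma_{m,N}(J_hat_{m,N}) + c_cal*m/N},
# assuming <M>_t = t. At the minimiser, gamma_{m,N}(J_hat) = -theta' z since theta = Psi^{-1} z.

def trig_basis(m, t, T=1.0):
    phi = [np.full_like(t, 1.0 / np.sqrt(T))]
    j = 1
    while len(phi) < m:
        phi.append(np.sqrt(2.0 / T) * np.cos(2.0 * np.pi * j * t / T))
        if len(phi) < m:
            phi.append(np.sqrt(2.0 / T) * np.sin(2.0 * np.pi * j * t / T))
        j += 1
    return np.array(phi)

def penalised_criterion(Z, t, m, c_cal):
    dt = np.diff(t)
    phi = trig_basis(m, t)
    Psi = (phi[:, :-1] * dt) @ phi[:, :-1].T
    z = phi[:, :-1] @ np.diff(Z, axis=1).mean(axis=0)
    theta = np.linalg.solve(Psi, z)
    return -theta @ z + c_cal * m / Z.shape[0]

def select_m(Z, t, grid, c_cal=2.0):
    """m_hat over the model collection `grid` (a subset of {1,...,N})."""
    return min(grid, key=lambda m: penalised_criterion(Z, t, m, c_cal))

# Toy data as before: dZ_t = t dt + dW_t.
rng = np.random.default_rng(1)
N, n = 200, 500
t = np.linspace(0.0, 1.0, n)
dW = rng.normal(0.0, np.sqrt(t[1]), size=(N, n - 1))
Z = 0.5 * t ** 2 + np.hstack([np.zeros((N, 1)), np.cumsum(dW, axis=1)])
m_hat = select_m(Z, t, grid=range(1, 11))
```

In practice c_cal would be calibrated, for instance by the slope heuristic mentioned above; c_cal = 2 here is an arbitrary illustrative value.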

## 4. Application to differential equations driven by the fractional Brownian motion

Throughout this section, V is twice continuously differentiable with bounded derivatives, \sigma is Hölder continuous of order high enough, and b_0 is continuous. Under these conditions on V, b_0 and \sigma, Equation (2) has a unique solution X whose paths are \alpha-Hölder continuous from [0,T] into \mathbb R for every \alpha \in (0,H) (see Kubilius et al. [17], Theorem 1.42). The maps V and \sigma are known, and our purpose is to provide a nonparametric estimator of b_0.

### 4.1. Auxiliary model

The model transformation used in the sequel was introduced in Kleptsyna and Le Breton [16] in the parametric estimation framework. Consider the function space

  \mathcal Q := \{Q : \text{the function } t\mapsto t^{1/2-H}Q(t) \text{ belongs to } L^1([0,T],dt)\},

let Q_0 be the map defined by

  Q_0(t) := \frac{b_0(t)}{\sigma(t)} ; \forall t\in[0,T],

and assume that Q_0 \in \mathcal Q. Consider also the Molchan martingale M defined by

  M_t := \int_0^t \ell(t,s)\,dB_s ; \forall t\in[0,T],

where

  \ell(t,s) := c_H s^{1/2-H}(t-s)^{1/2-H}\mathbf 1_{(0,t)}(s) ; \forall s,t\in[0,T]

with

  c_H = \left(\frac{\Gamma(3-2H)}{2H\Gamma(3/2-H)^3\Gamma(H+1/2)}\right)^{1/2},

and the process Z defined by

  Z_t := \int_0^t \ell(t,s)\,dY_s \quad\text{with}\quad Y_t := \int_0^t \frac{dX_s}{V(X_s)\sigma(s)}

for every t \in [0,T]. Then, Equation (2) leads to

  (6) Z_t = j(Q_0)(t) + M_t = \int_0^t J(Q_0)(s)\,d\langle M\rangle_s + M_t,

where

  j(Q)(t) := \int_0^t \ell(t,s)Q(s)\,ds \quad\text{and}\quad J(Q)(t) := (2-2H)^{-1}t^{2H-1}j(Q)'(t)

for every Q \in \mathcal Q and almost every t \in [0,T]. Note that the Molchan martingale fulfills Assumption 2.2 with \mu(t) = (2-2H)t^{1-2H} for every t \in (0,T].

### 4.2. An estimator of Q0

In Model (6), for any m \in \{1,\dots,N\}, the solution \widehat J_{m,N} to Problem (4) is a nonparametric estimator of J(Q_0). So, this subsection deals with an estimator of Q_0 solving the inverse problem

  (7) J(\widehat Q) = \widehat J_{m,N}.

Let us consider the function space

  \mathcal J := \{\iota : \text{the function } t\in[0,T]\mapsto \int_0^t s^{1-2H}\iota(s)\,ds \text{ belongs to } I_{0+}^{3/2-H}(L^1([0,T],dt))\},

where I_{0+}^{3/2-H} is the Riemann-Liouville left-sided fractional integral operator of order 3/2-H. The reader may refer to Samko et al. [25] on fractional calculus.

In order to provide an estimator of Q_0 with a closed-form expression, let us first establish the following technical proposition.

###### Proposition 4.1.

The map J is one-to-one from \mathcal Q into \mathcal J. Moreover, for every \iota \in \mathcal J and almost every t \in [0,T],

  J^{-1}(\iota)(t) = \overline c_H t^{H-1/2}\int_0^t (t-s)^{H-3/2}s^{1-2H}\iota(s)\,ds

with

  \overline c_H = \frac{2-2H}{c_H\Gamma(3/2-H)\Gamma(H-1/2)}.

By Proposition 4.1, if \widehat J_{m,N} \in \mathcal J, then

  \widehat Q_{m,N}(t) := \overline c_H t^{H-1/2}\int_0^t (t-s)^{H-3/2}s^{1-2H}\widehat J_{m,N}(s)\,ds ; t\in[0,T]

is the solution to Problem (7) in \mathcal Q. Note that even if the \varphi_j's don't belong to \mathcal J, since these functions are continuous from [0,T] into \mathbb R, \widehat Q_{m,N} remains well-defined, but is not necessarily a solution to Problem (7). A simple vector subspace of \mathcal J is provided at the end of this subsection.
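Numerically, the weakly singular integral defining \widehat Q_{m,N} is well handled by Gauss-Jacobi quadrature after the substitution s = tu. The sketch below is ours; it checks Proposition 4.1 on a constant function, using the fact that J maps Q \equiv 1 to the constant c_H B(3/2-H, 3/2-H), so the closed-form inverse applied to that constant must return 1.

```python
import numpy as np
from scipy.special import gamma, beta, roots_jacobi

# Numerical sketch (ours) of the inverse map of Proposition 4.1:
#   J^{-1}(iota)(t) = cbar_H * t^(H-1/2) * int_0^t (t-s)^(H-3/2) s^(1-2H) iota(s) ds.
# With s = t*u the integral becomes t^(1/2-H) * int_0^1 (1-u)^(H-3/2) u^(1-2H) iota(tu) du,
# a Gauss-Jacobi weighted integral (exponents H-3/2 and 1-2H are both > -1 for H in (1/2,1)).

H = 0.7
c_H = np.sqrt(gamma(3.0 - 2.0 * H) / (2.0 * H * gamma(1.5 - H) ** 3 * gamma(H + 0.5)))
cbar_H = (2.0 - 2.0 * H) / (c_H * gamma(1.5 - H) * gamma(H - 0.5))

def J_inverse(iota, t, n_quad=30):
    """Evaluate J^{-1}(iota) at time t > 0 by Gauss-Jacobi quadrature."""
    x, w = roots_jacobi(n_quad, H - 1.5, 1.0 - 2.0 * H)   # nodes/weights on (-1, 1)
    u = 0.5 * (x + 1.0)                                   # map to (0, 1)
    scale = 0.5 ** ((H - 1.5) + (1.0 - 2.0 * H) + 1.0)
    # the prefactor t^(H-1/2) cancels the t^(1/2-H) from the substitution
    return cbar_H * scale * np.sum(w * iota(t * u))

# Sanity check: J(1) = c_H * B(3/2 - H, 3/2 - H) (a constant), so J^{-1} of it is 1.
const = c_H * beta(1.5 - H, 1.5 - H)
vals = [J_inverse(lambda s: const * np.ones_like(s), t) for t in (0.2, 0.5, 1.0)]
```

In applications, `iota` would be the fitted \widehat J_{m,N}, evaluated at the quadrature nodes t*u for each t of interest.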

The following proposition provides risk bounds on \widehat Q_{m,N}, m \in \{1,\dots,N\}, and on the adaptive estimator \widehat Q_{\widehat m,N}.

###### Proposition 4.2.

If the \varphi_j's belong to \mathcal J, then there exists a deterministic constant c_{4.2,1} > 0, not depending on m and N, such that

  \mathbb E(\|\widehat Q_{m,N}-Q_0\|_2^2) \leqslant c_{4.2,1}\left(\min_{\iota\in S_m}\|\iota-J(Q_0)\|_{\langle M\rangle}^2 + \frac{m}{N}\right) ; \forall m\in\{1,\dots,N\}.

If in addition the S_m's fulfill Assumption 3.3, then there exists a deterministic constant c_{4.2,2} > 0, not depending on N, such that

  \mathbb E(\|\widehat Q_{\widehat m,N}-Q_0\|_2^2) \leqslant c_{4.2,2}\left(\min_{m\in\mathcal M_N}\left\{\min_{\iota\in S_m}\|\iota-J(Q_0)\|_{\langle M\rangle}^2 + \frac{m}{N}\right\} + \frac{1}{N}\right).

Proposition 4.2 says that the MISE of \widehat Q_{m,N}, m \in \{1,\dots,N\} (resp. \widehat Q_{\widehat m,N}) admits, up to a constant, the same bound as the MISE of \widehat J_{m,N} (resp. \widehat J_{\widehat m,N}).

Finally, the following proposition provides a simple vector subspace of \mathcal J.

###### Proposition 4.3.

The function space

  \widetilde{\mathcal J} := \{\iota\in C^1([0,T];\mathbb R) : \lim_{t\to 0^+}t^{-2H}\iota(t) \text{ and } \lim_{t\to 0^+}t^{1-2H}\iota'(t) \text{ exist and are finite}\}

is a subset of \mathcal J.

Consider \upsilon(t) := (2-2H)^{1/2}t^{H+1/2} and functions \psi_1,\dots,\psi_m such that (\psi_1,\dots,\psi_m) is an orthonormal family of L^2([0,T],dt). In particular, note that \psi_1,\dots,\psi_m are linearly independent. Moreover, assume that \overline\varphi_j = \upsilon\mu^{-1/2}\psi_j with \overline\varphi_j \in \widetilde{\mathcal J} for every j \in \{1,\dots,m\}, and S_m := \mathrm{span}\{\overline\varphi_1,\dots,\overline\varphi_m\}. The basis (\varphi_1,\dots,\varphi_m) of S_m, orthonormal in L^2([0,T], d\langle M\rangle_t), is obtained from (\overline\varphi_1,\dots,\overline\varphi_m) via the Gram-Schmidt process, and the \varphi_j's belong to \widetilde{\mathcal J}. For every \iota \in S_m, there exist \alpha_1(\iota),\dots,\alpha_m(\iota) \in \mathbb R such that

  \iota = \sum_{j=1}^m \alpha_j(\iota)\overline\varphi_j = \upsilon\mu^{-1/2}\overline\iota,

where

  \overline\iota := \sum_{j=1}^m \alpha_j(\iota)\psi_j \in \overline S_m \quad\text{with}\quad \overline S_m := \mathrm{span}\{\psi_1,\dots,\psi_m\}.

So, by assuming that \overline\mu J(Q_0) \in L^2([0,T],dt), the bound on the bias term of \widehat Q_{m,N} in the inequalities of Proposition 4.2 can be controlled in the following way:

  \min_{\iota\in S_m}\|\iota-J(Q_0)\|_{\langle M\rangle}^2 = \min_{\overline\iota\in\overline S_m}\|\upsilon(\overline\iota-\upsilon^{-1}\mu^{1/2}J(Q_0))\|_2^2 \leqslant T^{2H+1}\min_{\overline\iota\in\overline S_m}\|\overline\iota-\overline\mu J(Q_0)\|_2^2 = T^{2H+1}\|p_{\overline S_m}^{\perp}(\overline\mu J(Q_0))-\overline\mu J(Q_0)\|_2^2,

where, for every t \in (0,T],

  \overline\mu(t) := \upsilon(t)^{-1}\mu(t)^{1/2} = t^{-2H}.

If (\psi_1,\dots,\psi_m) is the m-dimensional trigonometric basis, and if there exists \beta \in \mathbb N^* such that \iota_0 := \overline\mu J(Q_0) belongs to W_2^{\beta}([0,T]), then

  \min_{\iota\in S_m}\|\iota-J(Q_0)\|_{\langle M\rangle}^2 \leqslant T^{2H+1}\|p_{\overline S_m}^{\perp}(\iota_0)-\iota_0\|_2^2 \leqslant c_{\beta,T}T^{2H+1}m^{-2\beta}.

### 4.3. Example: drift estimation in a non-autonomous fractional Black-Scholes model

Let us consider a financial market model in which the price process S of the risky asset is defined by

  S_t := S_0\exp\left(\int_0^t\left(\overline b_0(u) - \frac{\sigma^2}{2}u^{2H-1}\right)du + \sigma B_t\right) ; \forall t\in[0,T],

where \sigma > 0 and \overline b_0 is a continuous function from [0,T] into \mathbb R. This is a non-autonomous extension of the fractional Black-Scholes model defined in Hu et al. [15]. Thanks to the change of variable formula for Young's integral,

  S_t = S_0 + \int_0^t S_u\left(\left(\overline b_0(u) - \frac{\sigma^2}{2}u^{2H-1}\right)du + \sigma\,dB_u\right) ; \forall t\in[0,T].

Then, S is the solution to Equation (2) with x_0 = S_0, V = \mathrm{Id}_{\mathbb R}, \sigma(\cdot) \equiv \sigma and

  b_0(t) = \overline b_0(t) - \frac{\sigma^2}{2}t^{2H-1} ; \forall t\in[0,T].

Consider N independent copies S^1,\dots,S^N of S. For every i \in \{1,\dots,N\} and t \in [0,T], consider also

  Z_t^i := \frac{c_H}{\sigma}\int_0^t u^{1/2-H}(t-u)^{1/2-H}\frac{dS_u^i}{S_u^i}.

If the volatility constant \sigma is known, thanks to Subsection 4.2, a nonparametric estimator of \overline b_0 is given by

  \widehat{\overline b}_0(m,N;t) := \frac{\sigma^2}{2}t^{2H-1} + \sigma\widehat Q_{m,N}(t) ; t\in[0,T],

where

  \widehat Q_{m,N}(t) := \overline c_H t^{H-1/2}\int_0^t (t-s)^{H-3/2}s^{1-2H}\widehat J_{m,N}(s)\,ds ; t\in[0,T]

and

  \widehat J_{m,N} = \arg\min_{J\in S_m}\left\{\frac{1}{N}\sum_{i=1}^N\left(\int_0^T J(s)^2 s^{1-2H}\,ds - 2\int_0^T J(s)\,dZ_s^i\right)\right\}.

Since

  \|\widehat{\overline b}_0(m,N;\cdot) - \overline b_0\|_2^2 = \sigma^2\|\widehat Q_{m,N}-Q_0\|_2^2 \quad\text{with}\quad Q_0 = \frac{b_0}{\sigma},

if the \varphi_j's belong to \mathcal J, then Proposition 4.2 provides risk bounds on \widehat{\overline b}_0(m,N;\cdot), m \in \{1,\dots,N\}, and on the adaptive estimator \widehat{\overline b}_0(\widehat m,N;\cdot).

## 5. Numerical experiments

Some numerical experiments on our estimation method of J_0 in Equation (1) are presented in this section when M is the Molchan martingale:

  M_t = \int_0^t \ell(t,s)\,dB_s = (2-2H)^{1/2}\int_0^t s^{1/2-H}\,dW_s ; t\in[0,1],

with T = 1 and W the Brownian motion driving the Mandelbrot-Van Ness representation of the fractional Brownian motion B. The estimation method investigated on the theoretical side in Section 3 is implemented here for the three following examples of functions J_0:

  J_{0,1} : t\in[0,1]\mapsto 10t^2,\quad J_{0,2} : t\in(0,1]\mapsto 10(-\log(t))^{1/2} \quad\text{and}\quad J_{0,3} : t\in(0,1]\mapsto 20t^{-0.05}.

These functions belong to L^2([0,1], d\langle M\rangle_t) as required. Indeed, on the one hand, J_{0,1} is continuous on [0,1], and

  -\int_0^1 \log(t)\,d\langle M\rangle_t = -\int_0^1 \log(t)\,t^{1-2H}\,dt = \frac{1}{2-2H}\left(\lim_{\varepsilon\to 0^+}\log(\varepsilon)\varepsilon^{2-2H} + \int_0^1 t^{1-2H}\,dt\right) = \frac{1}{(2-2H)^2} < \infty.

On the other hand, for every \alpha \in (0,1) such that \alpha < 1-H,

  \int_0^1 t^{-2\alpha}\,d\langle M\rangle_t = \int_0^1 t^{1-2\alpha-2H}\,dt = \frac{1}{2(1-\alpha-H)}\left(1 - \lim_{\varepsilon\to 0^+}\varepsilon^{2(1-\alpha-H)}\right) = \frac{1}{2(1-\alpha-H)} < \infty.

Since J_{0,3}(t)^2 is proportional to t^{-2\alpha} for every t \in (0,1], with \alpha = 0.05, and since \alpha < 1-H in our numerical experiments, J_{0,3} is square-integrable with respect to d\langle M\rangle_t.
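Experiments of this kind can be reproduced with a short simulation: under the representation of M above, a path of Z solving (1) is obtained from Gaussian increments, and \widehat J_{m,N} is computed in the weighted basis \varphi_j = \psi_j/\sqrt{\mu} with \psi the trigonometric basis. The normalisation \langle M\rangle_t = t^{2-2H} (so \mu(t) = (2-2H)t^{1-2H}) and all discretisation choices below are our assumptions, and the sketch only treats J_{0,1}.

```python
import numpy as np

# Simulation sketch (ours) of Model (1) driven by the Molchan martingale, using
# M_t = (2-2H)^(1/2) int_0^t s^(1/2-H) dW_s, so that d<M>_t = (2-2H) t^(1-2H) dt
# (our assumed normalisation), and estimation of J_{0,1}(t) = 10 t^2.

rng = np.random.default_rng(2)
H, N, n, m = 0.7, 500, 1000, 7
t = np.linspace(0.0, 1.0, n + 1)[1:]          # regular grid avoiding t = 0
dt = 1.0 / n
mu = (2.0 - 2.0 * H) * t ** (1.0 - 2.0 * H)   # density of d<M>_t
J0 = 10.0 * t ** 2

# Trigonometric basis psi of L^2([0,1], dt), then phi_j = psi_j / sqrt(mu),
# orthonormal with respect to d<M>_t = mu dt.
psi = [np.ones_like(t)]
j = 1
while len(psi) < m:
    psi.append(np.sqrt(2.0) * np.cos(2.0 * np.pi * j * t))
    if len(psi) < m:
        psi.append(np.sqrt(2.0) * np.sin(2.0 * np.pi * j * t))
    j += 1
phi = np.array(psi) / np.sqrt(mu)

# N i.i.d. paths: dZ = J0 d<M> + dM with dM_t = (2-2H)^(1/2) t^(1/2-H) dW_t.
dW = rng.normal(0.0, np.sqrt(dt), size=(N, n))
dM = np.sqrt(2.0 - 2.0 * H) * t ** (0.5 - H) * dW
dZ = J0 * mu * dt + dM

# theta_hat = Psi_m^{-1} z_{m,N} (Psi_m is close to the identity here).
Psi = (phi * mu * dt) @ phi.T
z = phi @ dZ.mean(axis=0)
J_hat = np.linalg.solve(Psi, z) @ phi

mise = float(np.sum((J_hat - J0) ** 2 * mu * dt))   # squared <M>-norm error
norm2 = float(np.sum(J0 ** 2 * mu * dt))            # squared <M>-norm of J0
```

With N = 500 paths and m = 7, the relative error `mise / norm2` is small, which is what the risk bounds of Section 3 lead one to expect for a smooth J_0.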