1 Introduction
Parametric and nonparametric estimation in the context of random effects models has recently been investigated by many authors (e.g. [1, 4, 5, 6, 19, 20]). In these models, the noise is represented by a Brownian motion, characterized by the independence of its increments. Such a property is not valid for long-memory phenomena arising in a variety of scientific fields, including hydrology [17], biology [3], medicine [14], economics [9] and traffic networks [24]. As a result, self-similar processes have been used successfully to model data exhibiting long-range dependence. Among the simplest models that display long-range dependence is the fractional Brownian motion (fBm), introduced in the statistics community by Mandelbrot and Van Ness [16]. A normalized fBm with Hurst index $H \in (0,1)$ is a centered Gaussian process having the covariance
$$\mathbb{E}\bigl[B^H_t B^H_s\bigr] = \tfrac{1}{2}\bigl(t^{2H} + s^{2H} - |t-s|^{2H}\bigr).$$
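As a quick sanity check, this covariance can be evaluated numerically; a minimal sketch (the helper name `fbm_cov` is ours):

```python
import numpy as np

def fbm_cov(t, s, H):
    """Covariance of a normalized fBm: E[B_H(t) B_H(s)]."""
    return 0.5 * (t**(2*H) + s**(2*H) - abs(t - s)**(2*H))

H = 0.7
# The variance grows as t^{2H}:
assert np.isclose(fbm_cov(2.0, 2.0, H), 2.0**(2*H))
# H = 1/2 recovers standard Brownian motion: Cov(t, s) = min(t, s).
assert np.isclose(fbm_cov(3.0, 5.0, 0.5), 3.0)
```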
In modeling, the problems of statistical estimation of model parameters are of particular importance, so the growing number of papers devoted to statistical methods for equations with fractional noise is not surprising. We will cite only a few of them; further references can be found in [18, 22]. In [12] the authors proposed and studied maximum likelihood estimators for the fractional Ornstein-Uhlenbeck process. Related results were obtained by Prakasa [21], where a more general model was considered. In [10] the author proposed a least squares (LS) estimator for the fractional Ornstein-Uhlenbeck process and proved its asymptotic normality. Recently, similar results were obtained using the same LS approach in [25] for the fractional Vasicek model with long memory.
It is worth mentioning the papers [11, 23], which deal with the whole range of the Hurst parameter $H \in (0,1)$. Meanwhile, the other papers we have cited treat only the case $H > 1/2$ (which corresponds to long-range dependence); recall that in the case $H = 1/2$, we recover a classical diffusion extensively treated in the literature [15].
This paper deals with the statistical estimation of population parameters for fractional SDEs with random effects. To our knowledge, this problem has not yet been investigated. Precisely, we consider fractional diffusion processes of the form
(1) 
where is a random variable depending on a parameter to be estimated, and is a normalized fBm with Hurst parameter to be estimated. We study the additive linear case, , when . The estimators , and of , and , respectively, are constructed and their asymptotic behaviors are investigated. The model (1) is simple and yields explicit estimators; it also generalizes the model considered in [11], while the techniques used here to investigate the asymptotic properties are elementary (thanks to the incorporation of the random effects, we avoid Malliavin techniques); this gives us a first reason to choose it. The second reason is that (1) is widely applied in various fields; in fact, the Vasicek model is an example of type (1). The third reason is that the estimation of the population parameters requires few observations per subject, which matches many natural phenomena where repeated measurements are rarely available, if not impossible. Finally, let us note that nonparametric estimation for a similar model has recently been carried out by us [8].

The rest of the paper is organized as follows. In Section 2, we introduce the model and some preliminaries about the likelihood function. In Section 3 we derive the parameter estimators and establish consistency and asymptotic normality. Simulations are presented in Section 4, while Section 5 contains some concluding remarks and gives directions for further research.
Throughout the paper, the notations , and mean, respectively, simple convergence, almost sure convergence with respect to the probability measure , and convergence in distribution.

2 Model and Preliminary results
Before introducing our estimation techniques, we first state some basic facts about fractional Brownian motion and the likelihood function. Let be a stochastic basis satisfying the usual conditions. The natural filtration of a stochastic process is understood as the completion of the filtration generated by this process. Let , be independent normalized fractional Brownian motions (fBm) with a common Hurst parameter . Let be independent and identically distributed (i.i.d.) -valued random variables on the common probability space , independent of . Consider subjects with dynamics ruled by the following general linear stochastic differential equations:
(2)  
where and are supposed to be known in their own spaces. Let the random effects be measurable with common density , where is some dominating measure on and is an unknown parameter. Set , where is an open set in . Sufficient conditions for the existence and uniqueness of solutions to (2) can be found in [18, p. 197] and the references therein.
Let denote the space of real continuous functions defined on , endowed with the σ-field associated with the topology of uniform convergence on . We introduce the distribution on of the process . On , denotes the joint distribution of . Let be the marginal distribution of on . Since the subjects are independent (this is inherited from the independence of and ), the distribution of the whole sample on is defined by . Thus the likelihood can be defined as

where and , provided that for some fixed . It is well known that coincides with the distribution of the process defined by:
when , since in this case the process is Markovian (e.g. [7]); hence, the Girsanov formula can be applied to obtain the derivative . When , the non-Markovian character of the coupled process makes the construction of the likelihood very difficult. In our case, however, the process can be transformed into one for which the law of coincides with the distribution of a parametric fractional diffusion process .
3 Construction of estimators and their asymptotic properties
Consider the following process
(3)  
(4) 
Since and are independent, the process is Gaussian. Furthermore, for each , we have and . For each subject , we consider observations , where is a subdivision of . The density of given is expressed as
where and is the common covariance matrix of the subjects , . The log-likelihood of the whole sample is defined as
(5) 
For a specific distribution (say ), we can solve the integrals given in (5). Indeed,
(6)  
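Since the conditional density of each subject's observation vector is multivariate Gaussian, its log-density can be evaluated stably through a Cholesky factorization of the covariance matrix; a minimal sketch (the helper `gaussian_loglik` is our name, not the paper's notation):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def gaussian_loglik(x, mean, cov):
    """Log-density of N(mean, cov) at x, via Cholesky factorization."""
    k = len(x)
    c, low = cho_factor(cov)
    dev = x - mean
    maha = dev @ cho_solve((c, low), dev)        # (x-m)^T C^{-1} (x-m)
    logdet = 2.0 * np.sum(np.log(np.diag(c)))    # log det(C)
    return -0.5 * (k * np.log(2 * np.pi) + logdet + maha)
```

In the model above, `cov` would be the common covariance matrix of the observed increments, which depends on the Hurst parameter.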
3.1 Estimation of the Hurst parameter
Using the data induced by a single subject (without loss of generality, say with ), we may construct a class of estimators of the Hurst index . More precisely, for all and for any filter of order , that is,
(7) 
Let us consider the following quantities: , where , and , where is the usual gamma function. For the invertibility of the function , we refer to [2, p. 7].
Theorem 3.1.
The following statements hold true as the number of observations :
 (i)

 (ii)

, where
3.2 Estimation of the population parameter
Now, assume that is known. From the log-likelihood given by (5) and (6), we derive an estimator given by
(8) 
For the parameter it seems very difficult to derive an estimator. However, we can construct an alternative estimator and study its asymptotic behavior. Observing that
is a sample mean drawn from a sequence of i.i.d random variables, one might think that sample variance could also be used to estimate
. Unfortunately, simple computations show that such a sample variance is not consistent. Thus, as an alternative, we propose the following estimator for :
(9) 
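To see concretely why a naive sample variance over-shoots, consider (as a hedged illustration, not the paper's exact setting) the additive model dX_i(t) = φ_i dt + dB_i^H(t) with φ_i ~ N(μ, ω²) and X_i(0) = 0. The per-subject average A_i = X_i(T)/T then has variance ω² + T^{2H-2}, so the raw sample variance estimates the wrong quantity; removing the known fBm contribution restores consistency, in the spirit of (9). All names below are ours:

```python
import numpy as np

def estimate_mu_omega2(XT, T, H):
    """Moment estimators of (mu, omega^2) from terminal values X_i(T),
    assuming dX_i = phi_i dt + dB_i^H(t), phi_i ~ N(mu, omega^2), X_i(0)=0."""
    A = XT / T                            # A_i = phi_i + B_i^H(T) / T
    mu_hat = A.mean()
    # Var(A_i) = omega^2 + T^{2H-2}; the raw sample variance is biased
    # upward by the fBm term, so we subtract it.
    omega2_hat = A.var(ddof=1) - T**(2*H - 2)
    return mu_hat, omega2_hat

# Quick check with exact terminal values: B_i^H(T) ~ N(0, T^{2H}).
rng = np.random.default_rng(1)
N, T, H, mu, omega = 20000, 5.0, 0.7, 1.0, 0.5
XT = rng.normal(mu, omega, N) * T + rng.standard_normal(N) * T**H
mu_hat, omega2_hat = estimate_mu_omega2(XT, T, H)
```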
Theorem 3.2.
The estimator is unbiased and as .
Proof.
Set . Substituting by , we have , so . For the second statement, we consider the random variables defined by
(10) 
Clearly, the are i.i.d. random variables with ; then, by the strong law of large numbers, converges almost surely to as . Setting , we have
∎
Before establishing the bias of the estimator of , we first give the following result:
Lemma 3.3.
where are random variables given by (10).
Proof.
Substituting by and using the independence of and , we get

For the last equality we used the same techniques as in the proof of Theorem 3.2. For the second statement, using the random variables defined previously, we have
∎
Theorem 3.4.
The estimator is asymptotically unbiased and as .
Proof.
By virtue of Lemma 3.3, we get
Applying the strong law of large numbers and the continuous mapping theorem for almost sure convergence, we get
Similar computations lead to
where . In the last equality we used the fact that is a centered Gaussian random variable with variance . ∎
Remark
For the case of continuous observation with horizon , we propose the following estimator defined by
It is easy to see that as , and that is consistent when . The reason we choose this double asymptotic framework is that we proceed in two steps: in the first step we estimate the random effects as the horizon increases to ; then we use the empirical mean and variance to estimate , with the random effects replaced by their estimators.
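In the additive case (again a hedged illustration: dX_i(t) = φ_i dt + dB_i^H(t)), the two-step scheme is transparent:

```latex
\hat\varphi_i \;=\; \frac{X_i(T)-X_i(0)}{T} \;=\; \varphi_i + \frac{B_i^H(T)}{T},
\qquad \frac{B_i^H(T)}{T} \sim \mathcal N\!\left(0,\,T^{2H-2}\right)
\xrightarrow[T \to \infty]{} 0 \quad (H < 1),
```

```latex
\hat\mu \;=\; \frac{1}{N}\sum_{i=1}^{N}\hat\varphi_i,
\qquad \hat\omega^{\,2} \;=\; \frac{1}{N-1}\sum_{i=1}^{N}\bigl(\hat\varphi_i-\hat\mu\bigr)^{2},
```

so each random effect is recovered exactly in the limit T → ∞, and the empirical mean and variance of the estimated effects then converge as N → ∞.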
Theorem 3.5.
The estimators and are asymptotically normal, i.e.
(11) 
and
(12) 
Proof.
Since is the average of i.i.d. random variables with finite mean and finite variance, (11) follows immediately from the central limit theorem. In order to show (12), we consider the random variables , and set . We see that is a centered Gaussian process with variance , and . So, using the strong law of large numbers, we have as . Furthermore, the central limit theorem leads to as . Since , using Slutsky's theorem, the convergence in (12) is easily concluded. ∎

4 Simulations
We implement the two population parameter estimators for the model studied above in order to show their empirical behavior. We simulate the observed vectors using (4) for two numbers of subjects, and , with different numbers of observations per subject: , and . The fractional Brownian motions are simulated as in [13]. The experiment is as follows: we set equal to , and . For each case, replications involving samples are obtained by resampling the trajectories of . The averages of the estimators and their exact and empirical standard deviations are reported in Tables 1-3. The tables show that the parameter estimates are generally much closer to their true values as the number of subjects increases. Figures 1-3 display the histogram densities of the estimators, which reveal the convergence toward a limit distribution once is sufficiently large; this confirms what was established before. Looking at Table 1, we see that the estimates of are not very close to the exact values when there are very few observations () per subject and ; this has been observed whenever becomes larger than . In such situations, for real cases where the true value of is not available, it is better to choose as large as possible (), but this leads to a huge computational cost for large values of . To keep the balance between computational cost and goodness of fit, small values of and sufficiently large values of should be considered.
[Tables 1-3 here: true values of the parameters, averages of the estimators, and exact vs. empirical standard deviations.]