Problems of parametric inference when we observe over a long time interval a process of type
with unknown parameters or and with a given set of functions have been considered in a number of papers; alternatively, such models can be written as
with related functions . Also, the driving Brownian motion in Ornstein-Uhlenbeck-type equations has been replaced by certain Lévy processes or by fractional Brownian motion. Many papers focus on orthonormal sets of periodic functions with known periodicity. To determine estimators and limit laws for rescaled estimation errors in this case, periodicity allows one to exploit ergodicity or stationarity with respect to the time grid of multiples of the period. We mention Dehling, Franke and Kott , Franke and Kott  and Dehling, Franke and Woerner , where limit distributions for least squares estimators and maximum likelihood estimators are obtained. Rather than in asymptotic properties, Pchelintsev 
is interested in methods which allow one to reduce squared risk –i.e. risk defined with respect to one particular loss function– uniformly over determined subsets of the parameter space, at fixed and finite sample size. Asymptotic efficiency of estimators is the topic of Höpfner and Kutoyants , where sums as above are replaced by periodic functions of known periodicity whose shape depends on parameters . When the parametrization is smooth enough, local asymptotic normality in the sense of LeCam (see LeCam , Hajek , Davies , Pfanzagl , LeCam and Yang ; for a different notion of local neighbourhood see Ibragimov and Khasminskii  and Kutoyants ) allows one to identify a limit experiment with the following property: risk –asymptotically as the time of observation tends to , and with some uniformity over small neighbourhoods of the true parameter– is bounded below by a corresponding minimax risk in the limit experiment. This assertion holds with respect to a broad class of loss functions.
With a view to an estimation problem which arises in stochastic Hodgkin-Huxley models and which we explain below, the present paper deals with parameter estimation when one observes a process
with leading coefficient so that paths of almost surely tend to . Then good estimators for the parameters based on observation of up to time show the following behaviour: whereas estimation of parameters and works at the 'usual' rate , parameters with can be estimated at rate as . With rescaled time , we prove local asymptotic normality as in the sense of LeCam with local scale
and with limit information process
at every . As a consequence of local asymptotic normality, there is a local asymptotic minimax theorem (Ibragimov and Khasminskii , Davies , LeCam and Yang , Kutoyants , Höpfner ) which allows us to identify optimal limit distributions for rescaled estimation errors in the statistical model (1); the theorem also specifies a particular expansion of rescaled estimation errors (in terms of the central sequence in local experiments at ) which characterizes asymptotic efficiency. We can construct asymptotically efficient estimators for the model (1), and these estimators have a simple and explicit form.
We turn to an application of the results obtained for model (1). Consider the problem of parameter estimation in a stochastic Hodgkin-Huxley model for the spiking behaviour of a single neuron belonging to an active network
where input received by the neuron is modelled by the increments of the stochastic process
by taking into account 'noise' in the dendritic tree, where incoming excitatory or inhibitory spike trains emitted by a large number of other neurons in the network add up and decay. See Hodgkin and Huxley , Izhikevich , Ermentrout and Terman  and the literature quoted there for the role of this model in neuroscience. Stochastic Hodgkin-Huxley models have been considered in Höpfner, Löcherbach and Thieullen , ,  and Holbach . For suitable data sets, membrane potential data hint at the existence of a quadratic variation, which indicates the need for stochastic modelling.
In systems (2) or (4), the variable represents the membrane potential in the neuron; the variables , , are termed gating variables and represent –in the sense of averages over a large number of channels– opening and closing of ion channels of certain types. The membrane potential can be measured intracellularly in good time resolution whereas the gating variables in the Hodgkin-Huxley model are not accessible to direct measurement.
In a sense of equivalence of experiments as in Holbach , the stochastic Hodgkin-Huxley model (2)+(3) corresponds to a submodel of (1). This is of biological importance. Under the assumption that the stochastic model admits a fixed starting point which does not depend on , we can estimate the components and of the unknown parameter in equations (2)+(3) from the evolution of the membrane potential alone, and we have at our disposal simple and explicit estimators with the following two properties and .
With local parameter parametrizing shrinking neighbourhoods of , risks
converge as to
where is two-dimensional standard Brownian motion. Here is an arbitrary constant, and any loss function which is continuous, subconvex and bounded.
We can compare the sequence of estimators for in (5) to arbitrary estimator sequences which can be defined from observation of the membrane potential up to time , provided their rescaled estimation errors –using the same norming as in (5)– are tight. For all such estimator sequences,
is always greater than or equal to the limit in (6). This is the assertion of the local asymptotic minimax theorem. It ensures that, asymptotically as , it is impossible to outperform the simple and explicit estimator sequence which we have at hand.
The paper is organized as follows. Section 2 collects for later use convergence results for certain functionals of the Ornstein-Uhlenbeck process. Section 3 deals with local asymptotic normality (LAN) for the model (1): proposition 1 and theorem 1 in section 3.1 prove LAN, and the local asymptotic minimax theorem is corollary 1 in section 3.1; we introduce and investigate estimators for in sections 3.2 and 3.3; theorem 2 in section 3.4 states their asymptotic efficiency. The application to parameter estimation in the stochastic Hodgkin-Huxley model (2)+(3) based on observation of the membrane potential is the topic of the final section 4: see theorem 3 and corollary 2 there.
2 Functionals of the Ornstein-Uhlenbeck process
We state for later use properties of some functionals of the Ornstein-Uhlenbeck process
with fixed starting point . and are fixed, and is the invariant measure of the process in (7); is defined on some .
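For orientation, a generic instance of such a dynamic reads as follows; the mean-reversion rate $a>0$, the volatility $c>0$ and the starting point $y_0$ are placeholders and need not match the concrete coefficients of (7).

```latex
% Generic Ornstein-Uhlenbeck equation (illustrative coefficients):
\[
  dY_t = -a\,Y_t\,dt + c\,dW_t , \qquad Y_0 = y_0 ,
\]
% with unique invariant probability measure the centred Gaussian law
\[
  \mu = \mathcal{N}\!\left( 0 , \tfrac{c^2}{2a} \right) ,
\]
% so that time averages (1/t)\int_0^t f(Y_s)\,ds converge to \mu(f).
```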
Lemma 1: For defined by (7), for every and , we have almost sure convergence as
Proof: ( lemma 2.2,  lemma 2.5, compare to  thm. 1.6.4 p. 33)
1) We consider functions which satisfy . The case is the well-known ratio limit theorem for additive functionals of the ergodic diffusion (,  p. 214). Assuming that the assertion holds for , define . The Stieltjes product formula for semimartingales with paths of locally bounded variation yields
Under our assumption, both terms on the right hand side are of stochastic order : since converges to almost surely as , the second term on the right hand side behaves as as ; the first term on the right hand side behaves as . This proves the assertion for .
2) We consider functions such that . For arbitrarily large but fixed, step 1) applied to the functions and yields almost sure convergence
as . Since and , we have and as , and comparison of trajectories gives the result in this case.
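The type of convergence asserted in lemma 1 (time averages of functionals of the process converging to integrals against the invariant measure) can be illustrated numerically. The sketch below is not taken from the paper: it assumes the generic dynamic $dY_t=-a\,Y_t\,dt+c\,dW_t$ with $a=c=1$, whose invariant measure is $\mathcal N(0,\tfrac12)$, and checks that $\frac1t\int_0^t Y_s^2\,ds$ approaches $\tfrac12$.

```python
import numpy as np

# Illustrative sketch only: the coefficients a, c and the test function
# f(y) = y^2 are assumptions, not coefficients taken from the paper.
def ou_time_average(a=1.0, c=1.0, T=2000.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of dY = -a Y dt + c dW on [0, T];
    returns the time average (1/T) * integral_0^T Y_s^2 ds."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    y = 0.0
    acc = 0.0
    for dw in rng.normal(0.0, np.sqrt(dt), size=n):
        acc += y * y * dt          # accumulate f(Y_s) ds with f(y) = y^2
        y += -a * y * dt + c * dw  # Euler step of the OU dynamic
    return acc / T

# The invariant law is N(0, c^2/(2a)), so for a = c = 1 the time
# average should be close to 1/2 for large T.
```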
Lemma 2: For as above we have for every
Proof: This is integration by parts
and lemma 1 (with and ) applied to the right hand side.
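The integration-by-parts step can be illustrated by the following elementary identity for a continuous semimartingale $Y$ against a power weight $s^k$; the exponent $k$ is a placeholder, since the weights actually used in the lemma are not reproduced here.

```latex
% Integration by parts against a smooth (finite-variation) weight:
\[
  \int_0^t s^k \, dY_s \;=\; t^k\,Y_t \;-\; k \int_0^t s^{k-1}\,Y_s \, ds ,
\]
% which turns a stochastic integral with respect to Y into a Lebesgue
% integral of Y, to which the almost sure limits of lemma 1 apply.
```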
Lemma 3: For defined by (7), for every , we have convergence in law
as to the limit
where is standard Brownian motion.
Proof: Rearranging SDE (7) we write
and have for every
In case , the right hand side is , and the scaling property of Brownian motion combined with ergodicity of yields weak convergence as asserted. In case , lemma 2 transforms the first term on the right hand side of (9), and we have
The martingale convergence theorem (Jacod and Shiryaev , VIII.3.24) shows that
converges weakly in the Skorohod path space to a continuous limit martingale with angle bracket , i.e. to
Scaled in the same way, the first term on the right hand side of (10)
is negligible in comparison to (11), uniformly on compact -intervals, by ergodicity of .
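For the reader's convenience, the criterion invoked from Jacod and Shiryaev (VIII.3.24) can be paraphrased in a standard special case sufficient for the present setting, namely for continuous local martingales with deterministic limiting brackets.

```latex
% If M^n are continuous local martingales with M^n_0 = 0 and if, for
% every t, the angle brackets converge in probability,
%   \langle M^n \rangle_t \to \Gamma_t ,
% with \Gamma deterministic and continuous, then
\[
  M^n \;\longrightarrow\; M \quad \text{weakly in the Skorohod space,}
\]
% where M is the continuous centred Gaussian martingale with
% \langle M \rangle = \Gamma, i.e. M = ( B_{\Gamma_t} )_{t \ge 0}
% for a standard Brownian motion B.
```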
Lemma 4: a) For every we have an expansion
where almost surely. In case we have
b) For every , we have joint weak convergence as
with limit law
3 The statistical model of interest
Consider now a more general problem of parameter estimation from continuous-time observation of
where is a sufficiently smooth deterministic function which depends on some finite-dimensional parameter , and where the Ornstein-Uhlenbeck process , the unique strong solution to
depends on a parameter . The starting point is deterministic. Then solves the SDE
where depending on and is given by
For examples of parametric models of this type, see e.g. , , ,  and example 2.3 in . The constant in (15) is fixed and known: the quadratic variation of the semimartingale , which can be calculated from the trajectory observed in continuous time, cannot be considered a parameter.
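A minimal simulation sketch of an observation of type (14) follows. The trend $S(t,\vartheta)=\vartheta_0+\vartheta_1 t$ below is a hypothetical linear signal chosen only for illustration, and the noise coefficients are likewise assumed; the paper's actual class of functions $S$ is not reproduced here.

```python
import numpy as np

# Illustration only: S(t) = theta0 + theta1 * t is an assumed signal,
# and Y is a generic Ornstein-Uhlenbeck noise with assumed coefficients.
def simulate_observation(theta0=1.0, theta1=0.5, a=1.0, c=1.0,
                         T=10.0, dt=0.001, seed=0):
    """Simulate X_t = S(t) + Y_t with dY = -a Y dt + c dW, Y_0 = 0,
    by Euler-Maruyama. Returns the time grid and the observed path X."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, T + dt, dt)
    y = np.zeros_like(t)
    for i in range(1, len(t)):
        dw = rng.normal(0.0, np.sqrt(dt))
        y[i] = y[i - 1] - a * y[i - 1] * dt + c * dw
    x = theta0 + theta1 * t + y
    return t, x

# Sanity check: with c = 0 the noise vanishes and X coincides with S.
```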
We wish to estimate the unknown parameter based on time-continuous observation of in (14) over a long time interval, in the model
where trajectories of tend to almost surely as . Thus the parametrization is
and in SDE (16) which governs the observation , depending on has the form
Let denote the canonical path space for continuous processes; with the canonical process (i.e. for , ) and ,
For pairs in
, probability measures, are locally equivalent relative to , and we write
In the integrand,
so we exploit (14) to write for short
where under is the Ornstein-Uhlenbeck process (15), and where
Localization at will be as follows: with notation
in place of into (23); finally we rescale time. Define
where is some process of remainder terms, a martingale with respect to and
(again by (14), stands for under ), and the angle bracket of under .
Proposition 1: a) For fixed , components of converge -almost surely as to those of the deterministic process
For every , the matrix is invertible.
b) Let denote a two-dimensional standard Brownian motion with components and . In the cadlag path space (, chapters VI and VIII), martingales under converge weakly as to the limit martingale
Proof: The proof proceeds in several steps.
1) We specify the angle bracket process of under . Its state at time
is a symmetric matrix of size . Taking into account the norming factor in front of in (27) we consider throughout . The entries are given as follows. We have
for all . In the first line of we have
in the first and last positions, and in between for
For the last column of , the first entry has been given above, the last entry is
in between we have for
It remains to consider the three integrals which are not deterministic: here lemma 1 establishes almost sure convergence
as under . This proves almost sure convergence of the components of