Robust and Efficient Parameter Estimation for Discretely Observed Stochastic Processes
In various practical situations, we encounter data from stochastic processes that can be efficiently modelled by an appropriate parametric model for subsequent statistical analyses. Unfortunately, the most common estimation and inference methods, based on the maximum likelihood (ML) principle, are susceptible to minor deviations from the assumed model or to data contamination because of their well-known lack of robustness. Since alternative non-parametric procedures often incur a significant loss of efficiency, in this paper we develop a robust parameter estimation procedure for discretely observed data from a parametric stochastic process model, exploiting the attractive properties of the popular density power divergence measure within the framework of minimum distance inference. In particular, we define the minimum density power divergence estimators (MDPDEs) for independent-increment and Markov processes. We establish the asymptotic consistency and distributional results for the proposed MDPDEs in these dependent stochastic process set-ups and illustrate their benefits over the usual ML estimator for common examples such as the Poisson process, drifted Brownian motion and auto-regressive models.
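To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of a minimum density power divergence estimator for the Poisson process example: with equally spaced observation times, the increments are i.i.d. Poisson, and the MDPDE minimizes the standard density power divergence objective of Basu et al. with tuning parameter alpha. The choice alpha = 0.5, the truncation bound on the support, the contamination scheme and all variable names are illustrative assumptions.

```python
# Sketch of an MDPDE for a discretely observed homogeneous Poisson process:
# increments over equal time steps Delta are i.i.d. Poisson(lambda * Delta).
# alpha, the support truncation and the contamination level are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(0)

def mdpde_poisson(increments, delta, alpha=0.5, support_max=200):
    """Minimize the empirical density power divergence objective
    H_n(lam) = sum_x f_lam(x)^(1+alpha)
               - (1 + 1/alpha) * mean_i f_lam(X_i)^alpha
    over the rate lam; letting alpha -> 0 recovers the ML estimator."""
    xs = np.arange(support_max + 1)

    def objective(lam):
        pmf_support = poisson.pmf(xs, lam * delta)      # model pmf on (truncated) support
        pmf_data = poisson.pmf(increments, lam * delta)  # model pmf at observed increments
        return (np.sum(pmf_support ** (1.0 + alpha))
                - (1.0 + 1.0 / alpha) * np.mean(pmf_data ** alpha))

    res = minimize_scalar(objective, bounds=(1e-6, 50.0), method="bounded")
    return res.x

# Example: true rate 2.0, with 5% of the increments replaced by gross outliers.
delta = 1.0
clean = rng.poisson(2.0 * delta, size=475)
outliers = rng.poisson(20.0 * delta, size=25)
data = np.concatenate([clean, outliers])

print("ML estimate (sample mean / delta):", data.mean() / delta)
print("MDPDE with alpha = 0.5          :", mdpde_poisson(data, delta, alpha=0.5))
```

Under this contamination the ML estimate is pulled towards the outliers, whereas the density power term downweights observations with small model probability, so the MDPDE stays close to the true rate; the same construction extends to Markov models by replacing the marginal pmf with the conditional (transition) density.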