The B-Exponential Divergence and its Generalizations with Applications to Parametric Estimation
In this paper, a new family of minimum divergence estimators based on the Bregman divergence is proposed, where the defining convex function is exponential in nature. These estimators avoid the need for an intermediate kernel density estimate, and many of them also have strong robustness properties. It is further demonstrated that the proposed approach can be extended to construct a class of generalized estimating equations, where the pool of resultant estimators encompasses a large variety of minimum divergence estimators and ranges from highly robust to fully efficient depending on the choice of the tuning parameters. All of the resultant estimators are M-estimators, whose defining functions make explicit use of the form of the parametric model. The properties of these estimators are discussed in detail; the theoretical results are substantiated by simulations and real data examples. It is observed that in many cases, certain robust estimators from the above generalized class provide a better compromise between robustness and efficiency than the existing standards.
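For orientation, the sketch below records the standard Bregman divergence between densities that underlies this construction; the exponential generator shown is only an illustrative assumption consistent with the "exponential nature" mentioned above, not the paper's exact form, and the symbols B, alpha, g, and f_theta are introduced here for illustration.

% Standard Bregman divergence between a true density g and a model
% density f_theta, generated by a convex function B:
\[
  D_B(g, f_\theta)
    = \int \Big[ B\big(g(x)\big) - B\big(f_\theta(x)\big)
      - \big\{ g(x) - f_\theta(x) \big\}\, B'\big(f_\theta(x)\big) \Big]\, dx .
\]
% An exponential-type generator, e.g. B(y) = e^{\alpha y} with tuning
% parameter \alpha > 0, is one hypothetical choice of the kind alluded
% to in the abstract.  Since \int B(g(x))\, dx does not depend on
% \theta, minimizing D_B over \theta reduces to
\[
  \hat{\theta}
    = \arg\min_{\theta} \left\{
        \int \Big[ f_\theta(x)\, B'\big(f_\theta(x)\big)
          - B\big(f_\theta(x)\big) \Big]\, dx
        - \frac{1}{n} \sum_{i=1}^{n} B'\big(f_\theta(X_i)\big)
      \right\},
\]
% where the only term involving g has been replaced by an empirical
% mean over the sample X_1, ..., X_n.  This is why no intermediate
% kernel density estimate of g is required, and why the resulting
% estimator is an M-estimator whose defining function depends
% explicitly on the model density f_theta.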