Characterizing Logarithmic Bregman Functions

05/12/2021
by Souvik Ray et al.

Minimum divergence procedures based on the density power divergence and the logarithmic density power divergence have been extremely popular and successful in generating inference procedures which combine a high degree of model efficiency with strong outlier stability. In practical situations, such procedures are preferable to those which achieve robustness at a major cost in efficiency, or which are highly efficient but have poor robustness properties. The density power divergence (DPD) family of Basu et al. (1998) and the logarithmic density power divergence (LDPD) family of Jones et al. (2001) provide flexible classes of divergences in which the trade-off between efficiency and robustness is controlled by a single, real, non-negative tuning parameter. The usefulness of these two families in statistical inference makes it meaningful to search for other related families of divergences in the same spirit. The DPD family is a member of the class of Bregman divergences, and the LDPD family is obtained by log transformations of the different segments of the divergences within the DPD family. Both the DPD and LDPD families lead to the Kullback-Leibler divergence in the limiting case as the tuning parameter α → 0. In this paper we study this relation in detail, and demonstrate that such log transformations can be meaningful only in the context of the DPD (or the convex generating function of the DPD) within the general fold of Bregman divergences, which places a limit on the extent to which the search for useful divergences of this type can succeed.
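
For orientation, the following are the standard forms of these two divergence families as they are commonly written in the literature; the notation and scaling shown here are assumptions and may differ slightly from the conventions adopted in the paper. For a true density g, a model density f, and tuning parameter α > 0,

    DPD_α(g, f) = ∫ { f^{1+α}(x) − (1 + 1/α) g(x) f^α(x) + (1/α) g^{1+α}(x) } dx,

    LDPD_α(g, f) = (1/(1+α)) log ∫ f^{1+α}(x) dx − (1/α) log ∫ g(x) f^α(x) dx + (1/(α(1+α))) log ∫ g^{1+α}(x) dx.

The LDPD applies a logarithm to each of the three integral segments of the DPD (with adjusted coefficients), and, as the abstract notes, both families reduce to the Kullback-Leibler divergence ∫ g(x) log(g(x)/f(x)) dx as α → 0.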


Related research

Characterizing the Functional Density Power Divergence Class (05/13/2021)
The density power divergence (DPD) and related measures have produced ma...

Density Power Downweighting and Robust Inference: Some New Strategies (10/27/2019)
Preserving the robustness of the procedure has, at the present time, bec...

Robust Inference Using the Exponential-Polynomial Divergence (12/21/2020)
Density-based minimum divergence procedures represent popular techniques...

On Selection Criteria for the Tuning Parameter in Robust Divergence (06/22/2021)
While robust divergence such as density power divergence and γ-divergenc...

Two new families of bivariate APN functions (04/15/2022)
In this work, we present two new families of quadratic APN functions. Th...

Gambling and Rényi Divergence (01/18/2019)
For gambling on horses, a one-parameter family of utility functions is p...

The extended Bregman divergence and parametric estimation (01/22/2021)
Minimization of suitable statistical distances (between the data and mod...
