On f-divergences between Cauchy distributions
We prove that the f-divergences between univariate Cauchy distributions are always symmetric and can be expressed as strictly increasing functions of the chi-squared divergence. We report the corresponding functions for the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, the Taneja divergence, and the Jensen-Shannon divergence. We then show that this symmetry property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions.
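As an illustration of the claimed relationship, the sketch below numerically checks one instance of it: the closed-form chi-squared divergence between two univariate Cauchy distributions reported in the literature, and the Kullback-Leibler divergence expressed as the strictly increasing function KL = log(1 + chi2/2) of it, compared against a direct numerical integration. The closed-form expressions are assumptions taken from prior work on Cauchy divergences, not a substitute for the paper's derivations.

```python
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, loc, scale):
    """Density of the Cauchy distribution with location `loc` and scale `scale`."""
    return scale / (np.pi * ((x - loc) ** 2 + scale ** 2))

def chi2_cauchy(l1, s1, l2, s2):
    """Closed-form chi-squared divergence between two univariate Cauchy
    distributions (assumed expression from the literature). Note that it is
    symmetric in the two parameter pairs."""
    return ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2)

def kl_cauchy(l1, s1, l2, s2):
    """KL divergence as a strictly increasing function of chi-squared:
    KL = log(1 + chi2 / 2), hence also symmetric."""
    return np.log1p(chi2_cauchy(l1, s1, l2, s2) / 2.0)

def kl_numeric(l1, s1, l2, s2):
    """Direct numerical integration of p * log(p / q) over the real line."""
    def integrand(x):
        p = cauchy_pdf(x, l1, s1)
        q = cauchy_pdf(x, l2, s2)
        return p * np.log(p / q)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

if __name__ == "__main__":
    # Example parameters (arbitrary, for illustration only).
    l1, s1, l2, s2 = 0.0, 1.0, 1.0, 2.0
    print("closed form KL :", kl_cauchy(l1, s1, l2, s2))
    print("numerical KL   :", kl_numeric(l1, s1, l2, s2))
    # Symmetry check: swapping the two distributions leaves KL unchanged.
    print("swapped KL     :", kl_numeric(l2, s2, l1, s1))
```

The numerical integral agrees with the closed form, and the KL divergence computed in both argument orders coincides, which is exactly the symmetry phenomenon the abstract describes.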