A Note on the Kullback-Leibler Divergence for the von Mises-Fisher distribution

02/25/2015
by Tom Diethe, et al.

We present a derivation of the Kullback-Leibler (KL) divergence (also known as relative entropy) for the von Mises-Fisher (vMF) distribution in d dimensions.
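The KL divergence between two vMF distributions has a known closed form: KL(p‖q) = log C_d(κ₁) − log C_d(κ₂) + (κ₁ − κ₂ μ₂ᵀμ₁) A_d(κ₁), where C_d(κ) is the vMF normalizing constant and A_d(κ) = I_{d/2}(κ)/I_{d/2−1}(κ) is the mean resultant length. A minimal numerical sketch of that formula (the helper names here are illustrative, not from the paper):

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function I_v


def log_cd(d, kappa):
    """Log normalizing constant of the d-dimensional vMF distribution:
    C_d(kappa) = kappa^(d/2 - 1) / ((2*pi)^(d/2) * I_{d/2-1}(kappa)).
    Uses ive(v, k) = iv(v, k) * exp(-k) for numerical stability,
    so log iv(v, k) = log ive(v, k) + k."""
    nu = d / 2.0 - 1.0
    return (nu * np.log(kappa)
            - (d / 2.0) * np.log(2.0 * np.pi)
            - (np.log(ive(nu, kappa)) + kappa))


def kl_vmf(mu1, kappa1, mu2, kappa2):
    """KL divergence KL(vMF(mu1, kappa1) || vMF(mu2, kappa2)).

    Since E_p[x] = A_d(kappa1) * mu1, taking the expectation of
    log p(x) - log q(x) under p gives the closed form below."""
    d = len(mu1)
    nu = d / 2.0 - 1.0
    # Mean resultant length A_d(kappa1) = I_{d/2}(k1) / I_{d/2-1}(k1);
    # the exp(-k1) scaling factors of ive cancel in the ratio.
    a = ive(d / 2.0, kappa1) / ive(nu, kappa1)
    return (log_cd(d, kappa1) - log_cd(d, kappa2)
            + (kappa1 - kappa2 * np.dot(mu1, mu2)) * a)
```

For example, `kl_vmf(mu, k, mu, k)` is zero for identical parameters, and the divergence grows as the mean directions separate or the concentrations diverge.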


