Independent Gaussian Distributions Minimize the Kullback-Leibler (KL) Divergence from Independent Gaussian Distributions

11/04/2020, by Song Fang, et al.

This short note concerns a property of the Kullback-Leibler (KL) divergence: independent Gaussian distributions minimize the KL divergence from given independent Gaussian distributions. The primary purpose of this note is to serve as a reference for papers that need to invoke this property, in whole or in part.
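The property can be checked numerically using the closed-form KL divergence between multivariate Gaussians. The sketch below (an illustration, not code from the note; the reference distribution and the family of candidates are chosen for demonstration) fixes an independent, diagonal-covariance Gaussian p, then compares it against Gaussians q that share its marginals but introduce correlation. Consistent with the stated property, the KL divergence is smallest when q is itself independent (zero correlation).

```python
import numpy as np

def gaussian_kl(mu0, S0, mu1, S1):
    """Closed-form KL divergence D( N(mu0, S0) || N(mu1, S1) )."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Reference p: an independent (diagonal-covariance) Gaussian.
mu_p = np.zeros(2)
S_p = np.diag([1.0, 2.0])

def make_q(rho):
    """Candidate q with the same marginal variances as p but correlation rho."""
    cov = rho * np.sqrt(1.0 * 2.0)
    S = np.array([[1.0, cov],
                  [cov, 2.0]])
    return np.zeros(2), S

for rho in [0.0, 0.3, 0.6]:
    mu_q, S_q = make_q(rho)
    print(f"rho = {rho}:  D(q || p) = {gaussian_kl(mu_q, S_q, mu_p, S_p):.4f}")
```

Since p is a product distribution, D(q || p) decomposes into the multi-information of q plus the sum of marginal KL terms; the correlated candidates pay the extra multi-information cost, so the divergence increases monotonically in |rho| and vanishes at rho = 0 here, where q coincides with p.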


