
Independent Gaussian Distributions Minimize the Kullback-Leibler (KL) Divergence from Independent Gaussian Distributions

11/04/2020 ∙ by Song Fang, et al.

This short note concerns a property of the Kullback-Leibler (KL) divergence: independent Gaussian distributions minimize the KL divergence from given independent Gaussian distributions. The primary purpose of this note is to serve as a reference for papers that need to invoke this property, in whole or in part.
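The sketch below illustrates one concrete reading of this property, not the note's own proof: with a product (independent) Gaussian reference p, and candidate Gaussians q whose marginal means and variances are held fixed, the KL divergence D(q || p) is smallest when q's components are uncorrelated. It uses the standard closed-form KL divergence between multivariate Gaussians; all parameter values (means, variances, the correlation grid) are illustrative assumptions.

```python
# Minimal numerical sketch: for an independent Gaussian reference p,
# among Gaussians q with fixed marginals, D(q || p) is minimized when
# q's components are independent (correlation rho = 0).
# Parameter values below are illustrative assumptions.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence D(N(mu0, cov0) || N(mu1, cov1))."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Reference p: independent Gaussian (diagonal covariance).
mu_p = np.zeros(2)
cov_p = np.diag([1.0, 2.0])

# Candidate q: fixed marginal means and variances, with the
# cross-correlation rho swept over (-1, 1).
mu_q = np.array([0.5, -0.3])
s1, s2 = np.sqrt(1.5), np.sqrt(0.8)

rhos = np.linspace(-0.95, 0.95, 39)  # grid includes rho = 0
kls = []
for rho in rhos:
    cov_q = np.array([[s1 * s1, rho * s1 * s2],
                      [rho * s1 * s2, s2 * s2]])
    kls.append(gaussian_kl(mu_q, cov_q, mu_p, cov_p))

best = rhos[int(np.argmin(kls))]
print(f"KL is minimized at rho = {best:.2f}")  # expected: 0.00
```

The reason the minimum sits at zero correlation is the familiar decomposition for a product reference p: D(q || p) = D(q || Π_i q_i) + D(Π_i q_i || p), where Π_i q_i is the product of q's marginals. The second term depends only on the (fixed) marginals, and the first term is nonnegative and vanishes exactly when q is itself independent.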


Related research

∙ Empirical Evaluation of Biased Methods for Alpha Divergence Minimization (05/13/2021)
  In this paper we empirically evaluate biased methods for alpha-divergenc...

∙ On the Robustness to Misspecification of α-Posteriors and Their Variational Approximations (04/16/2021)
  α-posteriors and their variational approximations distort standard poste...

∙ KL Divergence Estimation with Multi-group Attribution (02/28/2022)
  Estimating the Kullback-Leibler (KL) divergence between two distribution...

∙ Learn to Talk via Proactive Knowledge Transfer (08/23/2020)
  Knowledge Transfer has been applied in solving a wide variety of problem...

∙ The Quality of the Covariance Selection Through Detection Problem and AUC Bounds (05/18/2016)
  We consider the problem of quantifying the quality of a model selection ...