The Representation Jensen-Shannon Divergence

05/25/2023
by Jhoan K. Hoyos-Osorio, et al.

Statistical divergences quantify the difference between probability distributions and find multiple uses in machine learning. However, a fundamental challenge is to estimate a divergence from empirical samples, since the underlying distributions of the data are usually unknown. In this work, we propose the representation Jensen-Shannon divergence, a novel divergence based on covariance operators in reproducing kernel Hilbert spaces (RKHS). Our approach embeds the data distributions in an RKHS and exploits the spectra of the covariance operators of the representations. We provide an estimator based on empirical covariance matrices obtained by explicitly mapping the data to an RKHS using Fourier features. This estimator is flexible, scalable, differentiable, and suitable for minibatch-based optimization problems. Additionally, we provide an estimator based on kernel matrices that does not require an explicit mapping to the RKHS. We show that this quantity is a lower bound on the Jensen-Shannon divergence and propose a variational approach to estimate it. We applied our divergence to two-sample testing, outperforming related state-of-the-art techniques on several datasets. Finally, we used the representation Jensen-Shannon divergence as a cost function to train generative adversarial networks, which intrinsically avoids mode collapse and encourages diversity.
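To make the Fourier-feature estimator concrete, here is a minimal sketch of one plausible reading of the abstract: embed both samples with shared random Fourier features (approximating a Gaussian kernel), form trace-normalized empirical covariance matrices so their eigenvalues behave like probability distributions, and take the Jensen-Shannon-style gap between von Neumann entropies. All function names, the kernel choice, and the default parameters are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def von_neumann_entropy(C):
    # Eigenvalues of a trace-normalized covariance matrix sum to one,
    # so they form a distribution; apply Shannon entropy to them.
    eigvals = np.linalg.eigvalsh(C)
    eigvals = eigvals[eigvals > 1e-12]  # drop numerical zeros
    return -np.sum(eigvals * np.log(eigvals))

def random_fourier_features(X, W, b):
    # Explicit (approximate) RKHS map for a Gaussian kernel.
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def representation_jsd(X, Y, D=100, sigma=1.0, seed=0):
    # Sketch of a covariance-based divergence estimator: both samples
    # must share the same random features so their covariance matrices
    # live in the same representation space.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    phi_x = random_fourier_features(X, W, b)
    phi_y = random_fourier_features(Y, W, b)
    Cx = phi_x.T @ phi_x
    Cx /= np.trace(Cx)  # trace-normalize so entropy is well defined
    Cy = phi_y.T @ phi_y
    Cy /= np.trace(Cy)
    # Jensen-Shannon-style gap: entropy of the mixture minus the
    # average entropy of the parts (nonnegative by concavity).
    return von_neumann_entropy(0.5 * (Cx + Cy)) - 0.5 * (
        von_neumann_entropy(Cx) + von_neumann_entropy(Cy)
    )
```

Because every step is a differentiable matrix operation, the same construction can be reimplemented in an autodiff framework and minimized over minibatches, which is what makes this style of estimator usable as a GAN training objective.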


