Spread Divergences

11/21/2018
by David Barber, et al.

For distributions p and q with different support, the divergence generally will not exist. We define a spread divergence on modified p and q and describe sufficient conditions for the existence of such a divergence. We give examples of using a spread divergence to train implicit generative models, including linear models (Principal Components Analysis and Independent Components Analysis) and non-linear models (Deep Generative Networks).
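
To make the idea concrete: the spread divergence is built, roughly, by "spreading" both p and q with the same noise distribution so that the spread versions share full support, after which a standard divergence (such as the KL divergence) between them is well defined; the paper's sufficient conditions ensure the noise does not collapse distinct distributions onto the same spread distribution. The sketch below is illustrative only and is not code from the paper; the function name, the choice of Gaussian noise, and the delta-distribution example are assumptions made for this example.

    # Illustrative sketch only (not code from the paper).
    # Two delta distributions located at a and b have disjoint support,
    # so KL(p || q) is undefined. "Spreading" each with the same Gaussian
    # noise kernel p(y|x) = N(y; x, sigma^2) turns them into N(a, sigma^2)
    # and N(b, sigma^2), whose KL divergence is finite with a closed form.

    def spread_kl_of_deltas(a, b, sigma):
        """KL( N(a, sigma^2) || N(b, sigma^2) ) for equal variances."""
        return (a - b) ** 2 / (2.0 * sigma ** 2)

    # Example: deltas at 0 and 1, spread with sigma = 0.5 -> KL = 2.0
    print(spread_kl_of_deltas(0.0, 1.0, sigma=0.5))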
