On the Rényi Cross-Entropy

06/28/2022
by Ferenc Cole Thierrin, et al.

The Rényi cross-entropy measure between two distributions, a generalization of the Shannon cross-entropy, was recently used as a loss function for the improved design of deep learning generative adversarial networks. In this work, we examine the properties of this measure and derive closed-form expressions for it when one of the distributions is fixed and when both distributions belong to the exponential family. We also analytically determine a formula for the cross-entropy rate for stationary Gaussian processes and for finite-alphabet Markov sources.
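For context, one commonly used definition of the Rényi cross-entropy of order α between densities p and q is sketched below; this is a standard convention that reduces to the Shannon cross-entropy, though the exact convention and notation adopted in the paper may differ.

\[
H_\alpha(p \,\|\, q) \;=\; \frac{1}{1-\alpha} \,\log \int p(x)\, q(x)^{\alpha-1}\, dx, \qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Shannon cross-entropy \( H(p \,\|\, q) = -\int p(x) \log q(x)\, dx \) in the limit \( \alpha \to 1 \).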


