Set Cross Entropy: Likelihood-based Permutation Invariant Loss Function for Probability Distributions

12/04/2018
by Masataro Asai, et al.

We propose a permutation-invariant loss function designed for neural networks that reconstruct a set of elements without regard to the order in which they appear in the vector representation. Unlike popular approaches to encoding and decoding a set, our method relies neither on a carefully engineered network topology nor on an additional sequential algorithm. The proposed loss, Set Cross Entropy, has a natural information-theoretic interpretation and is related to metrics defined on sets. We evaluate the approach on two object reconstruction tasks and a rule learning task.
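The abstract does not spell out the formula, but one way to realize such a permutation-invariant, likelihood-based loss is to score every target element against every reconstructed element with an ordinary cross entropy and then aggregate the pairwise scores with a smooth, order-insensitive reduction such as logsumexp. The NumPy sketch below follows that assumption; the function name set_cross_entropy and the Bernoulli (binary-feature) modeling choice are illustrative and should not be read as the paper's exact definition.

```python
# Illustrative sketch of a permutation-invariant, likelihood-based set
# reconstruction loss. Assumption (not taken verbatim from the paper):
# each set element is a D-dimensional vector of Bernoulli probabilities,
# and pairwise binary cross entropies are aggregated with logsumexp.
import numpy as np
from scipy.special import logsumexp

def set_cross_entropy(y_pred, x_true, eps=1e-7):
    """y_pred, x_true: arrays of shape (N, D) -- N set elements, D features.

    Returns a scalar loss that is unchanged when the rows (elements)
    of y_pred are permuted.
    """
    y = np.clip(y_pred, eps, 1.0 - eps)
    # Log-likelihood of every true element x_i under every predicted
    # element y_j, giving an (N, N) matrix of pairwise scores.
    ll = x_true @ np.log(y).T + (1.0 - x_true) @ np.log(1.0 - y).T
    # Soft matching: each true element is explained by whichever
    # prediction fits it best, via logsumexp (a smooth, differentiable
    # stand-in for max that keeps the loss order-insensitive).
    return -np.mean(logsumexp(ll, axis=1))
```

Because permuting the rows of y_pred only permutes the columns of the pairwise log-likelihood matrix, and logsumexp is symmetric in its arguments, the returned value is invariant under any reordering of the reconstructed elements, with no matching algorithm or special network topology required.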


Related research

04/27/2018 · Negative Log Likelihood Ratio Loss for Deep Neural Network Classification
In deep neural network, the cross-entropy loss function is commonly used...

06/28/2022 · On the Rényi Cross-Entropy
The Rényi cross-entropy measure between two distributions, a generalizat...

11/22/2019 · An Alternative Cross Entropy Loss for Learning-to-Rank
Listwise learning-to-rank methods form a powerful class of ranking algor...

03/22/2022 · A Quantitative Comparison between Shannon and Tsallis Havrda Charvat Entropies Applied to Cancer Outcome Prediction
In this paper, we propose to quantitatively compare loss functions based...

05/19/2018 · Optimizing the F-measure for Threshold-free Salient Object Detection
Current CNN-based solutions to salient object detection (SOD) mainly rel...

02/06/2013 · Probability Update: Conditioning vs. Cross-Entropy
Conditioning is the generally agreed-upon method for updating probabilit...

06/03/2021 · The Earth Mover's Pinball Loss: Quantiles for Histogram-Valued Regression
Although ubiquitous in the sciences, histogram data have not received mu...
