Computationally Efficient Wasserstein Loss for Structured Labels

03/01/2021
by   Ayato Toyokuni, et al.

The problem of estimating the probability distribution of labels has been widely studied as the label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein distance regularized LDL algorithm, focusing on hierarchical text classification tasks. Specifically, we predict the entire label hierarchy with a neural network and measure the similarity between predicted and true labels using the tree-Wasserstein distance. Through experiments on synthetic and real-world datasets, we demonstrate that the proposed method successfully accounts for the structure of labels during training, and that it compares favorably with the Sinkhorn algorithm in terms of computation time and memory usage.
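The efficiency claim rests on the closed form of the tree-Wasserstein distance: it equals a weighted sum, over the edges of the tree, of the absolute difference in probability mass contained in the subtree below each edge, computable in a single linear-time pass. The sketch below illustrates this closed form; the tree encoding (`parent`/`weight` arrays) and the function name are illustrative assumptions, not the paper's implementation.

```python
def tree_wasserstein(parent, weight, mu, nu):
    """Tree-Wasserstein distance between two distributions on tree nodes.

    parent[i] : parent index of node i (-1 for the root)
    weight[i] : length of the edge from node i to its parent
    mu, nu    : probability mass assigned to each node
    Nodes are assumed topologically ordered (parent index < child index).
    """
    n = len(parent)
    diff = [mu[i] - nu[i] for i in range(n)]
    total = 0.0
    # Visit children before parents (reverse topological order):
    # each edge contributes weight * |net mass in the subtree below it|.
    for i in range(n - 1, -1, -1):
        if parent[i] >= 0:
            total += weight[i] * abs(diff[i])
            diff[parent[i]] += diff[i]  # push subtree mass up to the parent
    return total

# Toy label hierarchy: root 0 with children 1, 2; leaves 3, 4 under 1
# and 5, 6 under 2; all edge weights 1.
parent = [-1, 0, 0, 1, 1, 2, 2]
weight = [0, 1, 1, 1, 1, 1, 1]
mu = [0, 0, 0, 1, 0, 0, 0]  # all mass on leaf 3
nu = [0, 0, 0, 0, 1, 0, 0]  # all mass on sibling leaf 4
print(tree_wasserstein(parent, weight, mu, nu))  # → 2.0 (path length 3→1→4)
```

Note that moving mass to a sibling leaf (distance 2) is cheaper than moving it across the root to leaf 5 (distance 4), which is exactly how the loss encodes the label hierarchy. The single O(n) pass, with no iterative matrix scaling, is what makes this loss cheaper than Sinkhorn-based optimal transport in both time and memory.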


Related research

- Fixed Support Tree-Sliced Wasserstein Barycenter (09/08/2021)
- On Scalable Variant of Wasserstein Barycenter (10/10/2019)
- Approximating 1-Wasserstein Distance with Trees (06/24/2022)
- Supervised Tree-Wasserstein Distance (01/27/2021)
- Learning with a Wasserstein Loss (06/17/2015)
- Conditional Probability Tree Estimation Analysis and Algorithms (08/09/2014)
- Hinge-Wasserstein: Mitigating Overconfidence in Regression by Classification (06/01/2023)
